When banks introduced ATMs in the 1960s, they thought of them as an efficiency measure, i.e. a way to cut costs. The math was supposed to work like this: if people withdraw money from the counter X times a day, each withdrawal consuming M minutes of teller time at a cost of C per minute, then the bank can save up to X * (M*C - TC), where TC is the unit cost of a transaction executed at the ATM. Pretty straightforward (if we disregard the complexities of fixed-cost allocation in this case).
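That cost model can be sketched in a few lines of code. The numbers below are purely illustrative assumptions, not figures from any actual bank:

```python
def projected_savings(X, M, C, TC):
    """Projected daily savings if X counter withdrawals move to ATMs.

    X  -- withdrawals per day (the flawed assumption: that this stays constant)
    M  -- teller minutes consumed per counter withdrawal
    C  -- cost per teller minute
    TC -- unit cost per ATM transaction
    """
    return X * (M * C - TC)

# Hypothetical example: 200 withdrawals/day, 3 teller minutes each
# at $0.50/minute, vs. $0.25 per ATM transaction.
savings = projected_savings(X=200, M=3, C=0.50, TC=0.25)
print(savings)  # 250.0
```

The model only holds if X is the same on both sides of the equation, which is exactly the assumption that failed.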
They were wrong.
The problem was X: the number of times a day people withdraw money from the bank. The model assumed X would be the same at the counter and at the ATM. Instead, the introduction of ATMs pushed that number several orders of magnitude beyond the original assumption. Suddenly, people were withdrawing several times a day, many times a week, just because it had become so convenient. This is the manifestation of a natural law that is i) very simple, ii) extremely powerful and iii) often overlooked:
Reduce friction -> Increase frequency
If you want something to happen more often, then reduce the friction associated with it, in any of its many forms: physical effort, economic cost, psychological burden...
Some companies get this right: Amazon's 1-Click Purchase, for instance. It doesn't get much simpler than that, right? Can you guess how much revenue is associated with the frictionless (and therefore more frequent) process of purchasing at Amazon?
Some companies get this wrong: when Microsoft introduced Windows Vista, it offered up to six versions of it: Home Basic, Home Premium, Business, Small Business Edition, Enterprise, Ultimate. Friction came as a result of having to understand the differences between them in order to make a decision. Do you think this offering was targeted at a mass consumer market that would jump into an impulse purchase?
(Interestingly enough, a company that has mastered this art, Apple, has screwed it up big time in its latest big launch: Apple Music. If you try this service, note the convoluted and confusing onboarding sequence you have to follow in order to sign up.)
This concept may be simple and powerful, but it's far from easy to apply, because it's actually counterintuitive. For some perspective: IT developments over the last few decades gave us so many possibilities that we naturally fell under the following paradigm:
Value is associated with functionality.
A product that offers us the possibility to do many things is powerful, and therefore valuable. The race for more features started, and it felt like natural competition: it is not easy to cram more and more stuff into something and do it right. We ended up with Excel.
Then a combination of two factors changed the paradigm. First, the entry barriers built against competition (the complexity and depth of the product) were also creating a barrier for new customers to jump in (friction), thereby leaving an ever-growing green field for others to capture: the casual users. Second, the mass adoption of another computing alternative, mobile, was forcing developers to reduce the scope of features and focus on usability. There was a new breed of solutions out there: simple, useful and inviting, and we were adopting them by the millions.
So now the brains of many smart guys are working under a new paradigm:
Value is associated with convenience. (Because less friction means more use)
And we've learned that convenience is hard ground to gain, because it only comes as a result of reduction, of simplification. And simplifying means making hard decisions: which feature will you cut to gain 80% convenience at a cost of 20% functionality? There is never a right answer, because it depends heavily on how each of us uses a product. What is right for me may not be right for you. Segmentation, therefore, rules the world now. Take the App Store and check some of the most interesting playgrounds of this era: to-do, calendar, or note-taking apps. You will find thousands of options and, more interestingly, no clear winner. That's the result of those driving forces: convenience and segmentation.
Interesting as this is, perhaps more enticing is to wonder whether we have exhausted this paradigm and a new one will emerge, full of opportunities for those who spot it first. Once we've learned that reducing friction increases frequency, we probably have to focus on which problems need to be solved as a result of frequency being increased, and perhaps a new model will emerge from that.
Food for thought for those who venture to make things and bet on them. Cheers and see you around next week!