NOTE: this is Part 1 of a multi-part series.
Today’s internet is one of extremes. Shock drives attention, attention drives eyeballs, and eyeballs sell ads. We all know this, but the most shocking thing I read this week was not the typical “10 crazy things you didn’t know about Game of Thrones” listicle. It was about the very mundane topic of mortgages. That’s right, mortgages.
Through this article on “How Stuff Works,” I learned, with great surprise, that there was once a time (barely a century ago) when houses were purchased whole, with cash. In retrospect this makes a lot of sense: housing wasn’t always as expensive as San Francisco in 2019, and most financial instruments post-date the concept of property ownership. But as a millennial who grew up during the 2008 financial crisis, with parents who held multiple mortgages and friends who are about to take on their own, the existence of amortization for high fixed-cost goods is something I’ve always taken for granted.
Classic millennial, falsely believing that something that exists today is something that has existed always.
Sarcasm aside, this realization has led me to mull over the other things that I (and we collectively) take for granted: specifically, things that serve a similar function to mortgages. So, what is that function?
At its core, a mortgage allows people to spread a one-time fixed cost into multiple payments of smaller amounts. Functionally, mortgages are amortization vehicles that have lowered the minimum requirements for owning property.
amortization (/ˌamôrˌtīˈzāSHən/)
the action or process of reducing or paying off a debt with regular payments.
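The idea above can be made concrete with the standard annuity formula for a fixed monthly payment. The figures below ($300,000, 4.5%, 30 years) are hypothetical, chosen only to illustrate how amortization turns one large payment into many small ones:

```python
def amortized_payment(principal: float, annual_rate: float, years: int) -> float:
    """Fixed monthly payment that fully pays off `principal` over the term.

    Standard annuity formula: M = P * r * (1+r)^n / ((1+r)^n - 1),
    where r is the monthly rate and n is the number of monthly payments.
    """
    r = annual_rate / 12          # monthly interest rate
    n = years * 12                # total number of payments
    growth = (1 + r) ** n
    return principal * r * growth / (growth - 1)

# Hypothetical example: a $300,000 mortgage at 4.5% over 30 years
payment = amortized_payment(300_000, 0.045, 30)
print(round(payment, 2))  # roughly $1,520 a month instead of $300,000 up front
```

The borrower pays more in total (interest is the price of spreading the cost), but the minimum bar to “own” drops from the full price to one monthly payment.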
Can this concept—amortization—be applied more broadly and beyond property ownership? Can amortization redefine what it means to “own something” altogether? I believe it can. And going one step further, I would argue that the most important companies in Silicon Valley today are simply providing amortized services that aim to replace ownership.
To understand how, let’s take a walk down memory lane.
AWS
Before cloud hosting (AWS, Digital Ocean, Azure, etc.), internet companies needed to invest in purchasing and maintaining their own servers. Some did so via shared data centers; others put physical hardware in their garages (the earliest version of “on-premise”). Stories of Mark Zuckerberg driving to data centers in the middle of the night to reboot downed Facebook servers have become startup folklore.
In other words, every aspiring internet company needed to (a) have sufficient funds to buy servers, (b) have a place to host them, and (c) have the hardware expertise to maintain them. Lack any one of the three and don’t bother. Can you imagine that today? If there were such a thing as a startup “Constitution,” affordable cloud-based servers that abstract hardware away from software businesses would be the First Amendment, akin to “freedom of speech.” Amazon Web Services was started to solve this problem in 2002, less than 20 years ago.
AWS has converted what was previously an inflexible fixed cost into a low variable cost that scales proportionally with usage. Startups no longer need to spend tens of thousands of dollars on servers (a non-differentiating infrastructure investment) that may never be fully used.
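The fixed-versus-variable trade-off can be sketched with back-of-the-envelope arithmetic. The numbers here are entirely hypothetical (a $10,000 owned server versus a cloud instance billed at $0.10 per hour), chosen only to show how pay-per-use shifts the cost curve:

```python
# Hypothetical figures for illustration only.
UPFRONT_SERVER_COST = 10_000.00   # one-time fixed cost to own a server
CLOUD_RATE_PER_HOUR = 0.10        # variable cost: pay only for hours used

def cumulative_cloud_cost(hours: float) -> float:
    """Cloud spend scales linearly with actual usage."""
    return CLOUD_RATE_PER_HOUR * hours

# Hours of usage at which renting has cost as much as buying outright.
break_even_hours = UPFRONT_SERVER_COST / CLOUD_RATE_PER_HOUR
print(break_even_hours)  # 100000.0 hours, i.e. over a decade of 24/7 use
```

Under these made-up rates, a startup that folds before heavy usage has spent almost nothing, whereas the owned server costs $10,000 whether it is used or not. That asymmetry is the amortization at work.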
Today, AWS generates nearly $30B in annual revenue, and its economic impact on GDP is likely many times greater: server costs typically make up less than 10% of a business’s operating expenses, yet those businesses could not run without them. As one of the few companies in history with a reasonable claim to being essential to the economy of its era (Facebook, I would argue, is non-essential), AWS is in a class of its own. And what is AWS?
It is simply an amortized service that replaces the need to own and maintain servers.