Spending More while Paying Less

If there truly are dramatic economies of scale via the cloud that in turn drive the lowest possible unit costs, then the cloud will no doubt save money, as measured in the cost per unit of computation or storage, after accounting for search costs, switching costs, worker retraining, and the like.

Moreover, in Chapter 11, we argued that you could save money even while paying more for a unit of computing delivered via cloud, if there were sufficient resource demand variability. After all, the total value proposition of the cloud is not just that you pay a given amount for a particular set of resources or services but that you don’t pay anything when you aren’t using them.
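To make the variable-demand argument concrete, here is a minimal sketch with assumed numbers (the demand profile and the 2x cloud premium are illustrative, not figures from the book): owned capacity must be sized and paid for at peak, while pay-per-use is billed only for hours actually consumed.

```python
# Illustrative sketch, assumed numbers: a spiky 24-hour demand profile.
hours = 24
demand = [10] * 8 + [100] * 4 + [10] * 12   # units needed in each hour

owned_unit_price = 1.0   # per capacity-unit-hour, paid around the clock
cloud_unit_price = 2.0   # a 2x unit-price premium, paid only when used

peak = max(demand)                             # owned capacity must cover peak
owned_total = peak * hours * owned_unit_price  # pay for peak capacity 24 hours
cloud_total = sum(demand) * cloud_unit_price   # pay only for units consumed

print(owned_total)  # 2400.0
print(cloud_total)  # 1200.0
```

Even at double the unit price, pay-per-use costs half as much here, because the peak-to-average demand ratio (100/25 = 4) exceeds the unit-price premium (2). When demand is flat, the comparison flips.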

By definition, then, the cloud—whether used singly or in a hybrid—can’t help but reduce the average cost of a unit of IT. But if the cost comes down, doesn’t that mean that IT departments will spend less? This question is best answered by analogy.

The reduction in cost from the original cell phones to today’s smartphones, by a factor of 100 or so—more if normalized for increased functionality and performance—was accompanied by an increase in mobile services from an original market size of zero to today’s $1 trillion-plus market. The same has happened with energy: As its cost has dropped and ubiquity grown, its usage has increased. This effect is called Jevons’ Paradox, but it doesn’t seem to be much of a paradox.37 Many goods exhibit price elasticity of demand: Demand increases as price decreases.
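The elasticity argument can be sketched numerically. In this illustrative example (the quantities and prices are assumed, not drawn from the text), demand is elastic—the percentage rise in quantity outstrips the percentage fall in price—so total spending grows even as the unit price drops, which is the Jevons-style outcome described above.

```python
# Illustrative sketch, assumed numbers: arc (midpoint) price elasticity
# of demand, epsilon = (% change in quantity) / (% change in price).
def price_elasticity(q0, q1, p0, p1):
    """Arc elasticity using midpoint percentage changes."""
    pct_dq = (q1 - q0) / ((q0 + q1) / 2)
    pct_dp = (p1 - p0) / ((p0 + p1) / 2)
    return pct_dq / pct_dp

# Price halves while demand triples: |epsilon| > 1, i.e. elastic demand.
eps = price_elasticity(q0=100, q1=300, p0=10, p1=5)
print(round(eps, 2))        # -1.5

# Total spend (price * quantity) rises despite the lower unit price.
spend_before = 10 * 100     # 1000
spend_after = 5 * 300       # 1500
```

With |epsilon| greater than 1, revenue moves opposite to price: cutting the unit cost of computing grows total IT spend, just as cheaper handsets grew the mobile market.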
