The Shift to a Cloud Cost Model

As I noted at the start of this chapter, you pay for resources in the cloud as you use them. For Amazon, that model is by the CPU-hour. For other clouds, such as GoGrid, it’s by the RAM-hour. Let’s look at how you can anticipate costs using the example resource demands described earlier (two application servers from midnight until 9 a.m., eight from 9 a.m. until 5 p.m., and four from 5 p.m. until midnight).

Suppose your core infrastructure is:

$0.10/CPU-hour: one load balancer
$0.40/CPU-hour: two application servers
$0.80/CPU-hour: two database servers

Each day you would pay $2.40 for the load balancer (24 CPU-hours at $0.10), $44.00 for the application servers (18 + 64 + 28 = 110 CPU-hours at $0.40, following the demand curve above), and $38.40 for the database servers (48 CPU-hours at $0.80):

$2.40 + $44.00 + $38.40 = $84.80

Your annual hosting costs would come to $30,952.00 (365 days at $84.80 per day), not including software licensing fees, cloud infrastructure management tools, or labor.
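
To make the arithmetic concrete, here is a minimal sketch in Python, assuming the rates and the application-server schedule given above; the variable names and the script itself are illustrative, not part of any provider’s tooling.

    # Illustrative tally of the daily and annual costs described above.
    # The rates and the application-server schedule come straight from the text.

    CPU_HOUR_RATES = {
        "load_balancer": 0.10,
        "app_server": 0.40,
        "db_server": 0.80,
    }

    # (number of application servers, hours at that level) per day
    APP_SERVER_SCHEDULE = [
        (2, 9),   # midnight to 9 a.m.
        (8, 8),   # 9 a.m. to 5 p.m.
        (4, 7),   # 5 p.m. to midnight
    ]

    def daily_cost():
        lb = 1 * 24 * CPU_HOUR_RATES["load_balancer"]   # $2.40
        app_hours = sum(count * hours for count, hours in APP_SERVER_SCHEDULE)  # 110 CPU-hours
        app = app_hours * CPU_HOUR_RATES["app_server"]  # $44.00
        db = 2 * 24 * CPU_HOUR_RATES["db_server"]       # $38.40
        return lb + app + db                            # $84.80

    if __name__ == "__main__":
        per_day = daily_cost()
        print(f"Daily:  ${per_day:,.2f}")        # $84.80
        print(f"Annual: ${per_day * 365:,.2f}")  # $30,952.00

Running it reproduces the $84.80 daily and $30,952.00 annual figures above.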

How to Approach Cost Comparisons

The best way to compare cloud costs with those of other models is to determine the total cost of ownership over your hardware depreciation period, which is generally two or three years, depending on the organization. To get an accurate picture of your total cost of ownership for a cloud environment, you must consider the following cost elements (a simple tally appears after the list):

  • Estimated costs of virtual server usage over three years.

  • Estimated licensing fees to support virtual server usage over three years.

  • Estimated costs for cloud infrastructure management tools, if any, over three years.

  • Estimated labor costs for creating machine images, managing the infrastructure, and responding ...
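
As a rough illustration of how those elements combine, the sketch below tallies a three-year figure. Only the virtual server line reuses the annual cost computed above; the licensing, tooling, and labor figures are placeholders to be replaced with your own estimates.

    # Back-of-the-envelope three-year TCO for the cloud environment.
    # Only the virtual server figure is derived from the example above;
    # the other lines are placeholders for your own estimates.

    YEARS = 3

    virtual_servers  = 30_952.00 * YEARS  # annual hosting cost from the example
    licensing        = 0.0                # placeholder: licensing fees for the virtual servers
    management_tools = 0.0                # placeholder: cloud infrastructure management tools
    labor            = 0.0                # placeholder: machine images, ongoing management, incident response

    cloud_tco = virtual_servers + licensing + management_tools + labor
    print(f"Three-year cloud TCO: ${cloud_tco:,.2f}")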
