Consider the following story:
This is a disturbing, and all too common, story, and it reflects many companies’ mindsets about cloud migration. If you take an application and move it as-is to the cloud (“lift-and-shift,” in cloud terminology), then, naturally, it will take as many server resources to run it in the cloud as it takes to run it in an on-premises data center. By that analysis, the CEO was right … at least from a high-level perspective.
The Real Cloud Finance Story
But this story does not consider the chief advantage of the cloud from a financial perspective: the cloud enables dynamic resource allocation.
When you use the cloud, the real benefit is that you can dynamically adjust the number of resources you use, scaling down during periods of low demand. You don’t need to have unused resources simply lying around. Additionally, because you can increase resources as easily as you can decrease them, you don’t need to keep excess resources on hand to handle unexpected or planned usage spikes. You simply add the required resources when you need them. This dynamic resource variability means an application, even one not optimized for the cloud, typically requires fewer resources when running in the cloud than it requires on-premises. This isn’t always the case, but the more you optimize the application for the cloud (such as moving to a cloud-native, dynamic, microservices-based architecture), the greater the resource savings and the lower the cost.
So, it is true that, at the highest level, a cloud-based server typically costs more than an equivalent on-premises server. But once you consider the fully loaded costs and recognize that you are allocating cloud resources by the minute, not leasing them by the month or year, the savings can be dramatic.
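To make the arithmetic concrete, here is a small sketch in Python. Every number in it is a hypothetical assumption (the demand curve, the $150 sticker price, the $400 fully loaded on-premises cost per server, and the $0.25-per-hour cloud rate are illustrative, not benchmarks); the point is only to show how per-minute billing against actual demand compares with paying for peak capacity around the clock.

# Illustrative cost comparison: peak-provisioned on-premises fleet vs. a
# cloud fleet that follows demand and is billed only for the minutes it runs.
# Every number below is a made-up assumption for the sake of the example.

# Hypothetical daily demand curve: servers needed during each hour of the day.
servers_needed_per_hour = (
    [4] * 7 +      # 00:00-07:00  overnight lull
    [12] * 5 +     # 07:00-12:00  morning ramp
    [20] * 6 +     # 12:00-18:00  midday plateau
    [30] * 3 +     # 18:00-21:00  evening peak
    [8] * 3        # 21:00-24:00  wind-down
)

peak_servers = max(servers_needed_per_hour)    # on-premises must be built for the peak

# Assumed prices (hypothetical):
on_prem_sticker_per_server_month = 150.0       # raw lease price of one on-prem server
on_prem_fully_loaded_per_server_month = 400.0  # plus power, cooling, space, staff, ...
cloud_rate_per_server_minute = 0.25 / 60       # $0.25 per hour, billed by the minute

# Naive, "highest level" comparison: one always-on cloud server vs. the sticker price.
cloud_always_on_month = cloud_rate_per_server_minute * 60 * 24 * 30
print(f"One server, always on: cloud ${cloud_always_on_month:.0f}/month "
      f"vs. on-prem sticker ${on_prem_sticker_per_server_month:.0f}/month")

# Realistic comparison: fully loaded on-prem peak capacity vs. per-minute cloud usage.
on_prem_month = peak_servers * on_prem_fully_loaded_per_server_month
cloud_minutes_per_day = sum(hourly * 60 for hourly in servers_needed_per_hour)
cloud_month = cloud_minutes_per_day * cloud_rate_per_server_minute * 30

print(f"Whole fleet, per month: on-prem ${on_prem_month:,.0f} vs. cloud ${cloud_month:,.0f}")

With these made-up numbers, a single always-on cloud server looks more expensive than the on-premises sticker price, yet the demand-following cloud fleet costs a fraction of the fully loaded, peak-provisioned on-premises fleet.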
This isn’t a new phenomenon. It’s one of the core advantages that have driven the success of the cloud.
Yet, there is a second, largely forgotten, financial advantage.
The Color of Money
When you build out an on-premises data center, you typically purchase (or lease) the equipment and real estate necessary to construct it. It’s a large, mostly upfront expenditure, and you build the data center to meet not your current demands but your expected future demands. In short, you must invest capital in building your infrastructure. A capital expenditure is one of the most expensive uses of money a company can make, because it is an investment in a prediction of future needs rather than in an immediate need.
But when you put your application in the cloud, you rent only the resources you need at that moment, not the resources you expect to need in the future. In theory, your application needs resources that are roughly in proportion to the number and size of your active customer base, which is (at least loosely) tied to your revenue. This means you rarely need capital expenditure to fund your application infrastructure; instead, the spending becomes part of COGS, the cost of goods sold. For most companies, it’s substantially easier to raise money to cover COGS than it is to raise money to cover a capital expense, because COGS is typically tied directly to revenue, while a capital expense is more of a speculative investment.
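To illustrate the difference in how the money is spent, here is a small Python sketch with entirely hypothetical figures (the customer growth curve, the per-customer cloud cost, and the per-customer revenue are assumptions for illustration only). It contrasts a single upfront capital expenditure sized to a long-range forecast with cloud spend that shows up month by month as COGS, roughly in proportion to revenue.

# Illustrative sketch of the capital-expense vs. COGS difference.
# All figures are hypothetical assumptions, not real benchmarks.

projected_customers_year3 = 100_000     # the forecast the data center must be built for
actual_customers_by_month = [5_000 * (m + 1) for m in range(12)]  # assumed first-year growth

capex_per_projected_customer = 3.0      # assumed upfront data center cost per projected customer
cloud_cost_per_active_customer = 0.25   # assumed monthly cloud spend per active customer
revenue_per_active_customer = 10.0      # assumed monthly revenue per active customer

# On-premises: a single upfront capital expenditure, sized to the prediction.
upfront_capex = projected_customers_year3 * capex_per_projected_customer
print(f"On-prem: ${upfront_capex:,.0f} of capital spent before most customers exist")

# Cloud: spending shows up month by month as COGS, roughly proportional to revenue.
for month, customers in enumerate(actual_customers_by_month, start=1):
    cogs = customers * cloud_cost_per_active_customer
    revenue = customers * revenue_per_active_customer
    print(f"Month {month:2d}: revenue ${revenue:10,.0f}  cloud COGS ${cogs:9,.0f} "
          f"({cogs / revenue:.1%} of revenue)")

Under these assumptions, the capital is spent before most of the customers (and revenue) exist, while the cloud spend stays a small, steady fraction of each month’s revenue.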
So, not only do you need less money for fewer resources, but the money you spend is also easier and less expensive to acquire. That is the true cost savings of the cloud.
Of course, whether this is an advantage for you and your company depends on your company’s financial situation. You should discuss this with your CFO for details relevant to your company.
Yes, the Cloud Is Less Expensive
In my years of experience, I’ve found it rare for the cloud to be more expensive than an on-premises solution. When a real-world comparison is made and the benefits are properly calculated, using a public cloud for your application infrastructure most often delivers real savings.