When it comes to cloud strategy, companies rank “cutting costs” as their top priority for 2019, according to a recent Datamation survey. That’s not to say that they plan to cut back on cloud spending in general; in fact, those budgets are very much expected to grow. Rather, companies are looking for ways to reduce unnecessary costs and optimize cloud spend.
The major challenge with accomplishing this goal? Most organizations can’t predict or control their cloud costs with any sort of accuracy. DevOps teams, who are building the applications and infrastructure that ultimately drive cloud costs, have almost no visibility into how their day-to-day decisions affect the bottom line.
Today, we want to take a look at how organizations can empower DevOps to make better cloud cost decisions, and thus achieve the goal of optimizing cloud spend.
Cloud Cost Optimization: What it Means
First, let’s define the term “cloud cost optimization.”
When it comes to the cloud, the old adage, “You have to spend money to make money,” certainly applies. But it’s also true that a lot of wasted spend happens in the cloud. Much of that waste occurs because there is little to no visibility into how day-to-day infrastructure- and code-level changes impact cloud costs. For example, a developer might unthinkingly set up a system to record backup images on a nightly basis. No big deal, that is, until you realize how much data that actually entails. And you probably won’t find out what it costs in dollars and cents until it’s far too late.
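To make the backup example concrete, here is a minimal back-of-the-envelope sketch. The image size, retention window, and per-GB price are all hypothetical placeholder numbers, not figures from any real bill:

```python
# Hypothetical numbers: a 200 GB image captured nightly, retained for 30
# days, stored at an assumed $0.023 per GB-month (a typical object-storage
# list price, used here only for illustration).
image_gb = 200
nights_retained = 30
price_per_gb_month = 0.023

stored_gb = image_gb * nights_retained        # 6,000 GB sitting in storage
monthly_cost = stored_gb * price_per_gb_month

print(f"{stored_gb} GB retained -> ~${monthly_cost:.2f}/month")
# -> 6000 GB retained -> ~$138.00/month
```

An innocuous-looking cron job quietly becomes a three-figure monthly line item, and nobody sees it until the bill arrives.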
Optimizing cloud costs means having visibility into how specific decisions will impact the bottom line in real time. That way, DevOps teams can weigh the pros and cons ahead of time and make the best cost-tuning decision for the health of the business as a whole. Additionally, the optimization process works best when decision-making is automated as much as possible, since these are highly complex and interrelated environments.
It’s important to note that these decisions don’t have to slow the business down or decrease the pace of innovation, especially when you bring automation into the equation. If you are able to optimize your cloud costs, you can spend the money saved on other things, like increasing engineering headcount or building a new strategic application.
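What might an automated cost-tuning decision look like in practice? The sketch below is a toy rightsizing rule, assuming you have per-instance utilization and price data available; the field names, threshold, and numbers are all made up for illustration:

```python
# Toy dataset: average CPU and hourly price per instance (hypothetical).
instances = [
    {"id": "i-01", "avg_cpu_pct": 4.0, "hourly_cost": 0.40},
    {"id": "i-02", "avg_cpu_pct": 72.0, "hourly_cost": 0.40},
]

def flag_underused(instances, cpu_threshold=10.0):
    """Return instances whose average CPU suggests they are oversized."""
    return [i for i in instances if i["avg_cpu_pct"] < cpu_threshold]

for inst in flag_underused(instances):
    # A real pipeline would open a ticket or trigger a resize, not print.
    monthly = inst["hourly_cost"] * 24 * 30
    print(f"{inst['id']} averages {inst['avg_cpu_pct']}% CPU; "
          f"downsizing could cut ~${monthly:.0f}/month")
```

Even a rule this simple turns a reactive bill review into a proactive, repeatable check that runs without slowing anyone down.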
Who Should Be in Charge of Cloud Cost Optimization?
We recently wrote about why cloud cost optimization shouldn’t be finance’s responsibility. Too frequently, finance will discover a cost spike on the organization’s cloud bill and ask DevOps leadership what happened.
DevOps leaders often don’t have direct answers to finance’s questions, and this can lead to friction and frustration. DevOps may ask the engineering product owner what’s happening within their application, but that person also often doesn’t understand what app-level choices have to do with financial outcomes. Cloud cost anomalies can result from any number of engineering decisions or innocent mistakes. The process of connecting DevOps decisions to cloud cost outcomes is painful for everyone when it’s done in a reactive manner.
Ultimately, finance shouldn’t be in charge of cloud cost optimization, because they aren’t the ones making cloud decisions. Instead, DevOps needs to take the reins here.
Understand DevOps’ Point of View
Now, it’s important to understand how DevOps teams think about cloud costs.
The operations side of the house, such as the cloud architect, is often responsible for cloud cost at some level. Yet many have little confidence in their current cloud cost management solutions (if they have any at all). They want, of course, to make operational decisions that drive better business outcomes, but they usually lack the data to do so. Or they’re overwhelmed by the sheer volume of information in their monthly cloud bill and can’t separate the relevant signals from the noise.
Developers, generally speaking, don’t want to focus on cost. Given the choice, they will always pick the fastest, highest-performance options. They fear that cost obsession will slow them down and prevent them from doing what they care about: building high-quality software. That said, developers love tools that help them do their jobs better. If they had access to easy-to-understand feedback on how certain choices would impact cloud costs, developers would happily make code-level decisions with the bottom line in mind.
In other words, DevOps is more than happy to take cost into account if they have access to the right data at the right time. Cloud cost can become a metric to help drive engineering decisions at your organization if you give DevOps the insight they require.
Putting the Right Data in the Right Hands at the Right Time
One more time for the cheap seats: DevOps teams need to have the right data at the right time if you want them to take on cloud cost optimization. The best way to accomplish this goal is to feed cloud cost data directly into tools DevOps teams already use, like Slack and OpsGenie. This way, developers can monitor their applications and fix potentially costly choices or mistakes in real time.
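As a sketch of what piping cost data into an existing tool can look like, here is a minimal Slack alert. Slack’s incoming-webhook API really does accept a JSON body with a "text" field, but the webhook URL, service name, threshold, and dollar amounts below are all placeholders:

```python
import json
import urllib.request

# Placeholder URL; replace with a real Slack incoming-webhook URL.
SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/T000/B000/XXXX"

def build_cost_alert(service: str, daily_cost: float, threshold: float) -> dict:
    """Format a Slack message for a service whose daily spend crossed its budget."""
    return {
        "text": (
            f":warning: *{service}* spent ${daily_cost:,.2f} today, "
            f"over its ${threshold:,.2f} budget."
        )
    }

def post_to_slack(payload: dict) -> None:
    """POST the message to the webhook (fires the alert into the channel)."""
    req = urllib.request.Request(
        SLACK_WEBHOOK_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)

# Example: a hypothetical backup service blew past its daily budget.
alert = build_cost_alert("backup-service", daily_cost=142.50, threshold=100.0)
# post_to_slack(alert)  # uncomment with a real webhook URL
```

The point isn’t the specific tool: it’s that the cost signal lands in the channel developers already watch, while the expensive change is still fresh in their minds.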
Rather than building in the cloud without concern for cost, DevOps teams who are empowered with cloud cost data can choose exactly the capacity they need and avoid waste. Over time, teams will build more efficient systems, and this will save money that can be used to drive innovation by investing in more headcount or new technology, like serverless. Ultimately, cloud cost optimization isn’t just good for the bottom line. It helps DevOps teams achieve their goal of building high-quality infrastructure and applications to propel the whole business forward.
To learn more about CloudZero’s cloud cost optimization capabilities, get started here.