Welcome to December’s edition of CloudZero’s Cloud Economics Pulse — your monthly read on how cloud spend is shifting across providers, services, and AI workloads.
No surprises here — November continued the quiet reshaping trend we’ve seen all year.
Compute softened, data layers grew, and AI/ML hit its highest share yet. AWS extended its lead, Azure and GCP nudged upward, and the emerging “AI layer” of providers continued to take shape.
The real story, though, is the compounding effect. AI is no longer a side project. It’s now becoming part of everyday cloud operations, and the cost footprint shows it. For teams feeling the pressure of rising AI spend (and you’re not alone, believe me), this month’s Pulse includes practical steps to get ahead of it, from pipeline hygiene to smarter dataset management.
Let’s dig into what changed this month and what it means for your FinOps strategy heading into 2026.
Related read: Your Cloud Economics Pulse For November 2025
How We’re Looking At Data (And Why It Matters)
For the Cloud Economics Pulse, we track monthly cloud spend trends using anonymized, aggregated data from CloudZero’s network.
- Cost by Provider and Cost by Service Category are shown as stacked charts, each illustrating how providers and service types contribute to total cloud spend over time. These are presented as percentages totaling 100% for each month.
- Cost of AI/ML measures spending on AI and machine learning technologies as a percentage of all cloud spend, shown as a line chart to highlight trend acceleration. We plot both the average and the median % of total spend (a quick sketch of how these are calculated follows the list).
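A quick illustration of how the average and median can diverge: the sketch below uses a hypothetical pandas DataFrame with made-up org-level numbers (not our actual schema or data), but it mirrors the share-of-spend math behind the AI/ML metric.

```python
import pandas as pd

# Hypothetical monthly snapshot: one row per organization, with total cloud
# spend and the portion attributable to AI/ML services (illustrative values).
df = pd.DataFrame({
    "org_id":      ["a", "b", "c", "d", "e"],
    "total_spend": [120_000, 45_000, 900_000, 30_000, 250_000],
    "ai_ml_spend": [1_500,      200,  55_000,     90,   4_000],
})

# Each organization's AI/ML share of its own cloud bill.
df["ai_share_pct"] = df["ai_ml_spend"] / df["total_spend"] * 100

# The average is pulled upward by GPU-heavy outliers; the median reflects
# the "typical" organization. We report both for that reason.
print(f"Average AI/ML share: {df['ai_share_pct'].mean():.2f}%")
print(f"Median AI/ML share:  {df['ai_share_pct'].median():.2f}%")
```

In this toy example, one GPU-heavy environment lifts the average well above the median, which is exactly why the two lines tell different stories later in this report.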
Together, these views show not just where cloud dollars go, but how spending patterns shift as new technologies — especially AI — reshape the cost landscape.
Main Highlights For November 2025
- AI/ML climbs again — now at its highest point yet. AI’s share of total cloud spend rose from 1.83% to 2.44%, with the median also climbing. The long-arc view shows clear, steady acceleration as AI shifts from experimentation to a standard part of production.
- Compute softens as data layers expand. Compute dropped below 50% of total spend, while Databases and Storage gained ground — reflecting a shift toward data infrastructure as AI workloads mature.
- AWS extends its lead while the ecosystem consolidates. AWS climbed to 69.8% of all provider spend, Azure and GCP nudged upward, and smaller categories normalized. AI- and data-centric vendors gained modest share, reinforcing the “Big Three + AI layer” pattern.
1. Cost By Provider
Here, we’re looking at how overall cloud spend is distributed across providers:
Cloud provider spend shifted again in November, but the more interesting story emerges when we look at the month-over-month changes alongside the broader year-long trend.
AWS climbed from 68.8% in October to 69.8% in November, continuing its steady upward trajectory. AWS started the year at 67.2% and has gained share in most months since, despite a brief dip in October. November’s increase looks like a return to the year’s underlying pattern: AWS steadily absorbing more compute, data, and AI-oriented workloads.
Azure also rose, from 10.7% to 11.4%. That’s a meaningful bump considering that Azure’s share has generally drifted downward or sideways throughout 2025. November doesn’t necessarily reverse the overall trend (Azure was 12.6% in January), but it does suggest a mild stabilization after months of incremental softening.
GCP increased slightly, nudging from 6.5% to 6.6%, consistent with its remarkably stable year. GCP has spent most of 2025 between 6.5% and 7.1%, indicating steady demand for data and analytics workloads rather than major shifts in share.
Hyperscalers aside, the “Other” category draws attention — not because it dropped in November (from 4.1% to 2.5%) but because the October spike now looks like an anomaly. For most of 2025, “Other” has held between 2.0% and 3.6%, so the return to 2.5% in November likely resets the category to its typical range rather than signaling a meaningful market contraction.
The Big Three’s marketplace trends tell another story:
- AWS Marketplace increased again (2.4% → 2.7%), continuing a slow but steady rise that has been visible all year.
- Azure Marketplace collapsed back downward (0.7% → 0.1%), reverting to its long-standing pattern after an unusual October jump.
AI- and data-focused vendors remain small but continue inching upward. Snowflake, Databricks, and OpenAI all saw modest MoM gains, consistent with a year-long pattern of large enterprises allocating small slices of spend to specialized AI pipelines and data engineering platforms. Anthropic shows traction as well — still a small share today, but now consistently measurable since entering our dataset earlier in the fall.
Overall, November looks less like a dramatic shift and more like a return to the underlying 2025 equilibrium: AWS rising, Azure drifting downward but stabilizing, GCP steady, and the small-provider ecosystem slowly coalescing into a recognizable AI-and-data layer.
Key Takeaways
- AWS continues its upward glide: Up to 69.8%, reinforcing a year-long trend of steady share gains.
- Azure rebounds but not fully recovered: Now 11.4%, up from October but still well below January’s 12.6%.
- GCP holds its lane: Slight increase to 6.6%, consistent with its stable 2025 range.
- “Other” normalizes: November’s drop is simply a return to the category’s usual 2–3% band after a one-month spike.
- AI/data layer creeping upward: Snowflake, Databricks, OpenAI, and Anthropic see small but persistent share gains, consistent with slow maturation of AI spend distribution.
2. Cost By Service Category
Here, we’re looking at how overall spend is distributed across cloud services:
This month’s service mix reflects a few clear shifts. You may notice slight differences in distribution compared with last month’s report, and that’s because we refined several category mappings behind the scenes to better align with how customers actually structure their workloads.
The trends themselves, however, remain consistent.
Compute dipped again, moving from 50.5% in October to 49.3% in November. Throughout 2025, Compute has hovered near the 49–51% range, peaking midyear and softening again in the fall. This decline doesn’t reflect reduced compute usage but, rather, the growing weight of adjacent layers like Storage, Databases, and AI/ML.
Databases continued their steady pattern: after declining from 12.9% in March to roughly 11% by midsummer, the category flattened out and has essentially stabilized at that level. November's rise to 11.5% is a modest but clear move upward, suggesting a gentle reacceleration as data-heavy pipelines and retrieval layers expand to support AI adoption.
Storage remained elevated, rising from 10.3% to 10.8% in November. The only true breakout jump occurred from August to September, and the category has held close to that higher baseline ever since. October looks more like a temporary dip than a new pattern, and November’s number puts Storage firmly back into its late-year range.
The “Other” category, which includes container orchestration, management overlays, and platform services, tells one of the clearer stories of the year. It hit 18.9% in April, then steadily declined into September. October’s bump to 15.5% now appears to be a brief interruption rather than a reversal. November’s 14.9% puts “Other” back in line with its late-summer normalization.
Networking, Analytics, and Management & Governance all tracked within their usual bands, without major shifts in November.
Finally, AI & Machine Learning saw one of the most notable movements of the month, jumping from 1.83% in October to 2.44% in November. It remains a small percentage of total spend in the service mix, but the acceleration is clear — and we’ll unpack the deeper dynamics behind AI cost growth in the next section.
Key Takeaways
- Compute remains dominant but softening: Now 49.3%, reflecting the growing pull of data and AI infrastructure.
- Databases stabilize after midyear decline: Up to 11.5%, marking a continued recovery from midsummer lows.
- Storage holds elevated: Near 10.8%, staying high after the late-summer jump.
- “Other” normalizes: Down to 14.9%, returning to the steady pattern established after April’s peak.
- AI/ML accelerates: Up sharply to 2.44%, with a deeper look coming in the next section.
3. Cost Of AI/ML
Here, we’re looking at how AI and machine learning costs are growing as a share of total cloud spend — shown as both average and median percentages to capture the full distribution of adoption across organizations:
AI/ML continues to climb steadily as a share of cloud spend, and the longer-term view from early 2024 makes the trend unmistakable. The average AI/ML share rose from 1.42% in January 2025 to 2.44% in November, while the median climbed from 0.21% to 0.56% during the same time period.
Both metrics are rising — but they tell different stories. The average shows ecosystem growth; the median reveals broader adoption.
Quick aside: You’ll notice the month-to-month curve looks smoother than in last month’s report. That’s because we excluded a small number of unusually large environments that were distorting the view of overall AI spend. With those outliers removed, the trend now reflects what’s actually happening for the majority of organizations: a steady, compounding rise rather than abrupt shifts driven by a few extreme cases.
Back to the main plot — the average view captures total ecosystem growth. Larger enterprises with GPU-heavy workloads naturally pull the line upward; the rise from 1.21% in December 2024 to 2.44% in November 2025 shows how quickly AI workloads are becoming significant components of cloud budgets.
The median view tells the more democratized story. When looking at the median line, AI was just 0.13% of total cloud spend for the typical company in early 2024. By November 2025, it reached 0.56%.
That’s more than a fourfold increase in under two years of data, with most of the growth coming in 2025. The increase reflects a broadening shift: AI is no longer dominated by frontier-scale training efforts. Many organizations are now running inference workloads, fine-tuning smaller models, or integrating AI APIs into their products.
Together, these two lines reveal a layered narrative. The top of the market is expanding AI spend aggressively, but the middle is catching up fast. If 2024 was the year organizations experimented with AI in isolated workloads, 2025 is the year AI became a normalized — and increasingly non-trivial — line item in cloud spend.
Key Takeaways
- Average AI/ML share climbed from 1.42% in January to 2.44% in November, reflecting consistently accelerating investment among AI-heavy environments throughout 2025.
- Median AI/ML share rose from 0.21% to 0.56%, signaling widespread adoption across companies of all sizes.
- AI spend is compounding: both curves show steady, month-over-month growth — pointing to AI becoming a durable category of cloud cost.
Deep Dive — Why AI Costs Are Continuing to Rise (and What’s Changed This Month)
AI costs are still climbing, but the drivers look different from earlier in the year. November’s data shows a shift from experimentation toward productionization, and that brings a new cost profile. Inference is outpacing training, and costs now live in everything around the model, not just GPU time.
What’s driving the increase?
1. Pipeline sprawl
AI features now depend on multilayer pipelines — retrieval, embedding generation, vector search, monitoring — each adding steady, persistent cost.
2. Memory-heavy architectures
Larger context windows and multi-step agents are pushing teams toward bigger, more expensive instances, even when GPU compute isn’t the bottleneck.
3. Dataset growth outpacing cleanup
Training and inference datasets are accumulating faster than they’re retired, turning storage into a permanent AI cost center.
4. Inference traffic scaling with users
As AI features reach production, inference now scales with customer engagement — meaning workloads run constantly, not intermittently.
5. Growing efficiency gap across organizations
Some teams are optimizing (quantization, smaller architectures, caching). Others are simply scaling what they built — and paying more for it every month.
In other words:
It’s no longer right to say AI is getting expensive because it’s new. It’s getting expensive because it’s becoming normal.
Teams that master inference efficiency, pipeline governance, and dataset lifecycle management will pull ahead in 2026. Those that don’t will feel the compounding effect by spring.
Actionable Guidance
If your AI costs are rising faster than expected, focus on the parts of the pipeline that quietly compound over time. Start with visibility (you can’t improve what you can’t see), then tune the systems around the model, not just the model itself.
- Tame pipeline sprawl. Map every step in your retrieval or inference workflow and remove redundant calls, indexing jobs, or monitoring layers that add cost without adding value.
- Size for memory, not just compute. Many inference workloads are memory-bound. Experiment with smaller context windows, optimized batching, or instance families that match memory needs more closely.
- Enforce aging policies for training and inference datasets. Archive, tier, or delete data that’s no longer in active use; dataset growth is a silent driver of AI cost (see the lifecycle sketch after this list).
- Tune inference early. Techniques like quantization, distillation, and response caching can cut inference cost dramatically once a model goes to production (a caching sketch also follows this list).
- Allocate costs by product or feature. Make it clear which teams or workloads are driving AI spend. Accountability drives better design decisions.
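To make the dataset-aging point concrete, here’s a minimal sketch of one way to enforce it: an S3 lifecycle rule that tiers cold training artifacts to cheaper storage and eventually expires them. The bucket name, prefix, and time windows are placeholders; tune them to how long your team actually reuses data.

```python
import boto3

s3 = boto3.client("s3")

# Hypothetical bucket and prefix holding accumulated training/inference artifacts.
s3.put_bucket_lifecycle_configuration(
    Bucket="example-ml-datasets",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "age-out-stale-training-data",
                "Filter": {"Prefix": "training-runs/"},
                "Status": "Enabled",
                # Tier to cheaper storage as data goes cold...
                "Transitions": [
                    {"Days": 30, "StorageClass": "STANDARD_IA"},
                    {"Days": 90, "StorageClass": "GLACIER"},
                ],
                # ...and delete whatever nobody has touched in a year.
                "Expiration": {"Days": 365},
            }
        ]
    },
)
```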
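And here’s the spirit of the response-caching suggestion, as a small sketch: memoize repeated prompts in front of whatever inference call you make. The call_model function is a stand-in for your actual model endpoint, and a production setup would more likely use a shared cache such as Redis with a TTL rather than an in-process dict.

```python
import hashlib
import json

# In-process cache keyed on a hash of the prompt and generation parameters.
# A real deployment would likely use a shared store (e.g., Redis) with a TTL.
_cache: dict[str, str] = {}

def call_model(prompt: str, **params) -> str:
    """Stand-in for your actual inference call (hosted API or self-hosted model)."""
    raise NotImplementedError("wire this to your model endpoint")

def cached_completion(prompt: str, **params) -> str:
    key = hashlib.sha256(
        json.dumps({"prompt": prompt, **params}, sort_keys=True).encode()
    ).hexdigest()
    if key not in _cache:        # cache miss: pay for inference once
        _cache[key] = call_model(prompt, **params)
    return _cache[key]           # cache hit: zero marginal inference cost
```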
Bottom line: the goal isn’t just to control AI costs — it’s to align them with the value each model delivers. Pipeline hygiene, dataset discipline, and targeted optimization will separate teams who manage AI as a strategic investment from those who absorb it as overhead.
Your Takeaway For This Month
AI is now reshaping the cloud bill in ways that are structural, not situational. Compute is no longer the sole driver of spend, data layers are becoming long-term cost anchors, and AI workloads are steadily claiming more space in the budget. The organizations that stay ahead will be the ones that treat this shift as the new normal. They’re adapting quickly, tightening pipelines, and making high-velocity decisions as AI becomes part of everyday operations.
Bottom line: cloud economics just crossed a new baseline — and AI is now baked into it. The companies that win in 2026 will pair speed with discipline — building fast, but with visibility and guardrails baked in from the start.
Thoughts, comments, disagreements? Reply to this Pulse or email [email protected] with “CEP” in the subject line. We’ll feature the best feedback in an upcoming issue. Watch for our next Cloud Economics Pulse on January 13, 2026; we publish on the second Tuesday of every month.


