
The cost of AI is decreasing
While creating The Ramp AI Index, we discovered something big: AI is getting a lot cheaper.
AI models process text in fundamental units called tokens, and that processing happens in large, energy-intensive data centers. AI companies typically offer “pay-per-token” enterprise pricing, billing customers based on how many tokens they consume. Ramp data indicates that the cost of these tokens is falling rapidly.
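To make the billing model concrete, here is a minimal sketch of pay-per-token pricing. The function name, token count, and rate are illustrative assumptions, not any vendor's actual API or price list.

```python
# Minimal sketch of "pay-per-token" billing (illustrative numbers only).
def usage_cost(tokens_used: int, price_per_million_tokens: float) -> float:
    """Return the dollar cost of a workload billed at a flat per-million-token rate."""
    return tokens_used / 1_000_000 * price_per_million_tokens

# Example: a workload that consumes 40 million tokens at $2.50 per million tokens
print(usage_cost(40_000_000, price_per_million_tokens=2.50))  # 100.0 (dollars)
```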
A year ago, businesses were paying $10 per million tokens. As of March 2025, it’s $2.50, a 75% decrease. Even when OpenAI launched its most expensive model to date, o1-preview, in late 2024, the average cost rose only slightly before dropping again.
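The 75% figure follows directly from those two price points; a quick check:

```python
# Recomputing the decline from the two price points cited above.
old_price = 10.00  # dollars per million tokens, roughly a year earlier
new_price = 2.50   # dollars per million tokens, March 2025
decrease = (old_price - new_price) / old_price
print(f"{decrease:.0%}")  # 75%
```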
What changed?
We attribute this sharp drop in cost to rapid improvements in AI technology resulting from a highly competitive market.
The AI space is crowded, so companies like OpenAI and Anthropic must improve their models quickly to grow their market share. Faster, more efficient models launch every few weeks. The electricity needed to power AI data centers is a major expense, and more efficient models use less compute per token, so each token becomes cheaper to process.
Energy and the future cost of AI
AI models use far more electricity than traditional computing, so the industry’s rapid growth has sparked concerns over energy and water consumption. Municipalities and tech companies are investing heavily in expanding power grids, tapping into multiple energy sources, and optimizing energy usage to meet these demands.
There’s no question that AI will continue requiring massive amounts of electricity. But if energy infrastructure improves while AI companies continue prioritizing efficiency, we can expect the cost of tokens to drop further. We could even be looking at a future where AI is as cheap as a Google search.