The race to power artificial intelligence is creating a new problem—and potentially a new solution. As tech giants build massive data centers to handle AI workloads, these facilities are generating enormous amounts of waste heat. Now, companies and cities are exploring ways to capture that heat and put it to use, turning what was once a disposal problem into a potential energy resource.
The challenge is real and urgent. A single large AI data center can consume as much electricity as a mid-sized city. Nearly 100 gigawatts of new data center capacity is projected to be added globally by 2030, and electricity demand from these facilities is expected to nearly double to about 945 to 980 terawatt-hours per year. For context, U.S. data centers alone are projected to account for about 8% of the country's electricity consumption by 2030.
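As a rough sanity check, those annual figures can be converted into an average continuous power draw. The sketch below uses only the projections cited above; note that nameplate capacity additions and average draw are not directly comparable, since facilities do not run at full load around the clock.

```python
# Convert projected annual electricity demand into an average continuous
# power draw. The TWh figures are the projections cited above; the rest is
# unit conversion, offered as an illustrative back-of-envelope check.

HOURS_PER_YEAR = 8_760

for twh_per_year in (945, 980):
    avg_gw = twh_per_year * 1_000 / HOURS_PER_YEAR  # 1 TWh/yr = 1,000 GWh/yr
    print(f"{twh_per_year} TWh/yr is roughly {avg_gw:.0f} GW of round-the-clock demand")

# Output: roughly 108-112 GW of continuous draw, the same order of magnitude
# as the ~100 GW of projected capacity additions (which, as nameplate
# figures, are not utilized 100% of the time).
```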
But the heat these centers produce is not just a waste product—it is an asset waiting to be captured.
Background
Data centers have always generated heat. Servers running AI models generate enough heat to push chip temperatures above 200 degrees Fahrenheit if left unchecked, yet the hardware needs to stay below about 150 degrees to work properly. Cooling these facilities has traditionally required massive amounts of water and electricity: an average mid-sized data center consumes more than 35,000 gallons of water per day, and cooling systems account for up to 40% of a data center's total power consumption.
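For a sense of scale, here is a small illustrative calculation built on the figures above; the 100-megawatt facility size is an assumed example, not a number from this article.

```python
# Illustrative only: what the cited water and cooling figures imply.
# The 35,000 gal/day and 40% share are cited above; the 100 MW facility
# size is a hypothetical example for scale.

GALLONS_PER_DAY = 35_000
COOLING_SHARE = 0.40          # upper-bound share of total facility power

annual_water_gal = GALLONS_PER_DAY * 365
print(f"Water: ~{annual_water_gal / 1e6:.1f} million gallons per year")   # ~12.8

total_facility_mw = 100       # assumed facility draw
cooling_mw = total_facility_mw * COOLING_SHARE
print(f"Cooling: up to ~{cooling_mw:.0f} MW of a {total_facility_mw} MW facility")
```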
As AI workloads have grown, the cooling problem has become more acute. Companies are packing servers more densely into facilities, pushing power density from 162 kilowatts per square foot today to a projected 176 by 2027. This intensification means more heat to remove and more resources required to do it.
The traditional approach of using cooling yards with sprawling networks of pipes and fans is becoming unsustainable. Communities across the country have begun protesting data center construction, worried about the strain on local power grids and water supplies. Some regions are facing genuine capacity constraints that could limit where new data centers can be built.
Key Details
New Cooling Technologies
Innovation is happening on multiple fronts. A Los Angeles startup called Karman Industries has developed a cooling system using technology derived from SpaceX rocket engines. The system uses liquid carbon dioxide as a refrigerant, circulated by pumps that spin at 30,000 revolutions per minute—nearly 10 times faster than traditional compressors. The company says its system can reduce the space required for cooling equipment by 80% and use less than half the energy of conventional systems while requiring no water at all.
"Our high-level thesis is we could build the best compressor out there using the latest and greatest technology. We want to reduce that electrical consumption of cooling so that you have the most efficient way to cool these chips." – David Tearse, CEO of Karman Industries
Karman raised $20 million in recent funding and expects to begin customer deliveries in summer 2026 from its Los Angeles manufacturing facility. The company plans to make 100 units per year initially and aims to quadruple that capacity.
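Taken together with the earlier figure that cooling can account for up to 40% of facility power, the vendor's "less than half the energy" claim would imply a meaningful facility-level saving. The sketch below is arithmetic on the cited claims, not a measurement.

```python
# Rough upper-bound illustration: if cooling is up to 40% of facility power
# (cited above) and a new system cuts cooling energy by at least half (the
# vendor's claim), how much could total facility consumption fall?

cooling_share = 0.40      # cooling's share of total facility power (upper bound)
cooling_reduction = 0.50  # "less than half the energy" -> at least 50% savings

facility_savings = cooling_share * cooling_reduction
print(f"Total facility energy could drop by up to ~{facility_savings:.0%}")
# -> up to ~20%, before accounting for any water savings
```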
Microsoft has taken a different approach, announcing a new data center design that uses zero water for cooling. The company has also committed to ensuring its data centers do not increase electricity costs or deny water to nearby communities.
The Heat Recovery Opportunity
While cooling technology improves, the real opportunity lies in capturing the heat that data centers produce and redirecting it for other uses. Cities and companies are beginning to explore district heating systems that could use data center waste heat to warm buildings and neighborhoods. The approach is already in use in Nordic countries such as Sweden and Finland and is now gaining attention in the United States.
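How much heating could a single facility actually supply? The sketch below is purely illustrative; the facility size, recoverable-heat fraction, and per-home demand are assumptions, not figures from this article.

```python
# A rough sense of scale for district heating, under stated assumptions.
# None of these inputs come from the article: the 100 MW IT load, the 60%
# recoverable-heat fraction, and the 5 kW average per-home heat demand are
# illustrative placeholders.

it_load_mw = 100            # hypothetical data center IT load
recoverable_fraction = 0.6  # assumed share of waste heat usable at district-heat temperatures
avg_home_demand_kw = 5      # assumed average heat demand per home in a cold climate

recoverable_heat_kw = it_load_mw * 1_000 * recoverable_fraction
homes_heated = recoverable_heat_kw / avg_home_demand_kw
print(f"~{homes_heated:,.0f} homes could be warmed by recovered heat")
# -> on the order of 10,000 homes for one large facility
```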
The economic incentive is clear. The data center cooling market alone is projected to grow from $11 billion in 2025 to nearly $25 billion by 2032. Heat recovery systems could add another revenue stream and reduce the environmental impact of data center operations.
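For reference, that forecast implies a compound annual growth rate of roughly 12% per year (treating "nearly $25 billion" as $25 billion for the calculation).

```python
# Implied compound annual growth rate (CAGR) for the cited forecast:
# $11 billion in 2025 growing to about $25 billion by 2032 (seven years).

start_value, end_value = 11e9, 25e9
years = 2032 - 2025

cagr = (end_value / start_value) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")   # roughly 12-13% per year
```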
Orbital Solutions
Some tech companies are exploring even more ambitious approaches. Companies including NVIDIA and Google are investigating orbital data centers: facilities placed in space, where solar panels can collect up to 8 times more energy than on Earth because sunlight is continuous and not filtered by the atmosphere. Waste heat can be radiated directly into space, which has a background temperature of about 4 Kelvin (minus 269 degrees Celsius), eliminating the need for water-based cooling entirely. The idea has moved from theory to early testing in late 2025 and 2026.
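Rejecting heat in orbit is governed by the Stefan-Boltzmann law, which also shows why the radiators themselves must be large. In the sketch below, the 1-megawatt heat load, radiator temperature, and emissivity are illustrative assumptions, not figures from this article.

```python
# Radiative heat rejection in orbit: P = emissivity * sigma * A * (T_rad^4 - T_space^4).
# Solving for the radiator area A needed to shed a given heat load.

SIGMA = 5.670e-8        # Stefan-Boltzmann constant, W / (m^2 K^4)
T_SPACE = 4.0           # background temperature of deep space, K (as cited above)
T_RADIATOR = 300.0      # assumed radiator surface temperature, K (~27 C)
EMISSIVITY = 0.9        # assumed emissivity of the radiator coating

heat_load_w = 1e6       # hypothetical 1 MW of waste heat to reject

flux_w_per_m2 = EMISSIVITY * SIGMA * (T_RADIATOR**4 - T_SPACE**4)
area_m2 = heat_load_w / flux_w_per_m2
print(f"Radiator area needed: ~{area_m2:,.0f} m^2")   # roughly 2,400 m^2
```

Under these assumptions, a single megawatt of waste heat needs a radiator on the order of a couple of thousand square meters, which is why orbital designs pair the idea with very large deployable panels.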
What This Means
The shift toward heat capture and recovery represents a fundamental change in how data centers are viewed. Rather than being purely cost centers, they are becoming potential energy assets for communities. In regions facing water scarcity, like Texas and Arizona, the ability to cool data centers without water could unlock new locations for facility expansion.
For cities and utilities, the opportunity is to work with data center operators on infrastructure that benefits both parties. Several cities are already bracing for the power demands: San Jose and its utility PG&E, for example, are preparing massive grid upgrades, even as AI demand threatens to nearly triple the city's energy use.
The industry is also rethinking its priorities. Power efficiency is increasingly discussed in terms of a new metric, "tokens per watt per dollar": how much computing output you get for each unit of energy and each dollar spent. This shift means companies are focused not just on using less energy, but on using energy as effectively as possible.
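The article gives no standard formula for the metric, but one plausible reading is simply tokens divided by the product of average power draw and spend. A minimal sketch under that assumption, with illustrative inputs:

```python
# One plausible reading of "tokens per watt per dollar": output normalized
# by both power draw and cost. The function name and example inputs are
# illustrative assumptions, not a formula from the article.

def tokens_per_watt_per_dollar(tokens_generated: float,
                               avg_power_watts: float,
                               total_cost_dollars: float) -> float:
    """Computing output normalized by both power draw and spend."""
    return tokens_generated / (avg_power_watts * total_cost_dollars)

# Example: a deployment that generates 1 billion tokens while drawing an
# average of 50 kW and costing $10,000 over the same period.
score = tokens_per_watt_per_dollar(1e9, 50_000, 10_000)
print(f"{score:.3f} tokens per watt-dollar")   # 2.000
```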
For now, the data center industry is in a race against time. Power constraints are the biggest barrier to expansion. Companies that can solve the heat and cooling challenge will have a competitive advantage. Those that can turn waste heat into an asset will have an even bigger one.
