In some cases, they are considering nuclear energy, or even bringing back oil, gas and coal power plants that had been slated for closure. Despite their resource demands, data centers still account for only a small share of total U.S. water use. But while some data centers sit in regions where water is abundant and can be drawn without competing with other users, others are being built in drought-prone areas with degrading infrastructure. More than 160 new AI data centers have sprung up across the U.S. in the past three years in places with scarce water resources.
- From 2024 to 2030, data centre electricity consumption grows by around 15% per year, more than four times faster than the growth of total electricity consumption from all other sectors.
- In the U.S., data centers have an indirect water footprint of about 800 billion liters.
- Looking ahead, CBRE anticipates continued greenfield development across emerging U.S. markets, particularly along the Interstate 20 corridor and in deregulated electricity markets.
- Edwards has nearly 20 years of experience scaling technology companies, and his insights into the needs for data centers could help Bloom continue to capitalize on the growing demand for power from the AI industry.
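The 15%-per-year growth figure in the first bullet compounds quickly over the 2024-2030 window. A quick sanity check of that compounding (the rate and years come from the bullet above; everything else is just arithmetic):

```python
# Compound growth of data centre electricity consumption, 2024-2030,
# at the ~15% per year rate cited from the IEA projection above.
rate = 0.15
years = 2030 - 2024  # six years of growth
multiplier = (1 + rate) ** years
print(f"Consumption multiplier over {years} years: {multiplier:.2f}x")
```

In other words, at that rate data centre electricity consumption roughly 2.3x's by 2030, which is why it so far outpaces the growth of all other sectors combined.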
Similar to trends seen in the early 2010s, the improvements in efficiency are expected to offset most of the impact of increased IT stock utilisation. This leads to a plateau in energy demand at around 700 TWh, limiting the growth of the data centre share of global electricity demand to less than 2% in 2035. Although data centre electricity consumption globally has grown only slightly, some smaller countries with expanding data centre markets are seeing rapid growth.
Small and Medium Data Centers
Utilities often must make expensive upgrades to power grids to handle the increased energy demand from new data centers. Unless ratepayer protections are put in place, smaller businesses and U.S. households often shoulder these costs. In Wisconsin, large data centers will now have to pay the full cost of the increased energy demand they create, rather than passing it on to regular utility customers such as households and other businesses. Increasingly, hyperscale data centers use closed-loop cooling systems, in which water circulates through pipes and absorbs heat from servers; the warmed water is then carried away to be cooled, often using heat exchangers, chillers or cooling towers. According to a recent report from the International Energy Agency, data centers accounted for 1.5% of the world’s electricity consumption, with the U.S. making up the largest share.
Pledges from hyperscalers
The National Science Foundation (NSF), the Department of Energy (DOE) and the Defense Advanced Research Projects Agency (DARPA), all of which have placed importance on energy-efficient computing, will be essential in advancing sustainable AI research. Several strategies can reduce AI’s environmental footprint while maintaining technological advancements. One approach is to optimize AI models to use fewer resources without significantly compromising performance, making AI more energy efficient. Instead of training large general-purpose models from scratch, researchers can develop domain-specific AI models that are customized for particular fields, such as computational chemistry or healthcare, reducing the computational overhead. Training an AI model involves adjusting billions of parameters through repeated computations that require immense processing power. Each training session can take weeks or months, consuming massive amounts of electricity.
Enact policies to encourage energy efficiency, demand response and clean energy procurement
A number of factors contribute to high utility prices, including the cost of upgrading and managing outdated grid infrastructure, expenditures that were rising long before the AI boom kicked off. But data centers’ ravenous energy needs have nonetheless received the brunt of the blame, with polling suggesting most households connect data center expansion with rising electricity costs. Lawmakers have acted accordingly, with bipartisan calls to monitor data center construction often packaged around affordability concerns. Northern Virginia, the world’s largest data center market, has approximately 4,000 MW of capacity.
Big data centers are power-hungry, but increasingly efficient
Medium-sized facilities may consume 5-20 MW, serving regional needs or specialized applications. These centers often achieve better efficiency than smaller facilities due to economies of scale in cooling and power distribution systems. Nationally, electricity rates have already risen for consumers in recent years, in part because utility companies have been replacing aging equipment to safeguard against extreme weather events and cyberattacks.
But further efforts are needed to maximise additionality and emission reductions of renewable energy purchases
A single large hyperscale data center can continuously draw power equivalent to that of 15,000-75,000 homes. The largest facilities consume over 650 MW—enough electricity for nearly 500,000 homes. Data centers are the invisible backbone of our digital world, powering everything from social media feeds to critical business applications. In 2023, U.S. data centers collectively consumed 176 terawatt-hours (TWh) of electricity—equivalent to powering 16 million homes for an entire year.
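The home-equivalence figures above can be cross-checked against each other. Dividing 176 TWh by 16 million homes implies roughly 11,000 kWh per household per year (close to the U.S. residential average), and a 650 MW facility running around the clock comes out near the 500,000-home figure. A quick back-of-envelope check using only the numbers cited above:

```python
# Back-of-envelope check on the article's home-equivalence figures.
us_dc_twh = 176            # U.S. data center consumption in 2023 (TWh)
homes = 16_000_000         # homes cited as equivalent

kwh_per_home = us_dc_twh * 1e9 / homes   # 1 TWh = 1e9 kWh
print(f"Implied household use: {kwh_per_home:,.0f} kWh/year")

# Largest facilities: 650 MW drawn continuously for a year
facility_kwh = 650_000 * 8760            # 650 MW = 650,000 kW; 8,760 h/year
print(f"650 MW facility: {facility_kwh / 1e9:.2f} TWh/year, "
      f"~{facility_kwh / kwh_per_home:,.0f} homes")
```

The two equivalences agree to within a few percent, which suggests they were derived from the same per-household assumption.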
Energy Technology Perspectives 2026
Workloads are increasingly segmented based on their individual requirements, from processing speed to privacy laws – and it is all in incredibly high demand. “The widespread adoption and scaling of AI projects are rapidly becoming a significant driver of networking constraints for enterprises. While organizations are already encountering these limitations in 2025, the trend is projected to intensify in 2026 as companies successfully validate and deploy AI solutions on a larger scale.”
For smaller data centers, the split tends toward 50% IT equipment and 50% infrastructure. Larger, more energy-efficient data centers allocate a higher proportion of energy to computing and data storage. This distribution means that as AI adoption expands, the cumulative energy impact of millions of daily queries will increasingly dominate AI’s total energy use in data centers worldwide. Energy typically accounts for 15-25% of total data center operating expenses, with maintenance at approximately 40% and other costs making up the remainder.
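The IT-versus-infrastructure split above maps directly onto Power Usage Effectiveness (PUE), the industry's standard efficiency metric: total facility energy divided by energy delivered to IT equipment. A 50/50 split implies a PUE of 2.0, while an efficient hyperscale facility sits much closer to 1.0. A minimal sketch (the 83% IT share for a hyperscale facility is an assumed figure for illustration, not from the article):

```python
def pue(it_share: float) -> float:
    """PUE = total facility energy / IT equipment energy,
    where it_share is the fraction of total energy going to IT."""
    return 1.0 / it_share

# Smaller data center: ~50% of energy reaches IT equipment (from the text)
print(f"50% IT share -> PUE {pue(0.50):.2f}")
# Efficient hyperscale facility: ~83% IT share (assumed for illustration)
print(f"83% IT share -> PUE {pue(0.83):.2f}")
```

This is why the article notes that larger facilities "allocate a higher proportion of energy to computing": every point of IT share gained lowers PUE, i.e. less of each kilowatt-hour is lost to cooling and power conversion overhead.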
Electric and gas utilities requested more than $30 billion in rate increases last year, according to a January analysis by PowerLines, a consultancy, affecting 81 million Americans. Despite growing electricity consumption, the data center industry has made significant efficiency improvements and sustainability commitments. This shift is driving data centers to upgrade power infrastructure, cooling systems, and electrical distribution to handle much higher power densities. Power distribution systems, including Uninterruptible Power Supplies (UPS) and backup generators, consume 10-15% of total electricity. These systems ensure continuous operation during power outages but introduce efficiency losses through power conversion and battery charging. Advanced energy storage systems are increasingly being integrated to improve reliability and efficiency.