While artificial intelligence continues to transform our world, its growing energy appetite is creating a concerning paradox. AI data centers are expected to increase their power demand more than thirtyfold by 2035, reaching 123 gigawatts in the US alone. Data centers already accounted for about 4% of total U.S. electricity use in 2024, and projections show this figure could more than double by 2030.
AI’s rapid growth demands enormous amounts of energy, and by 2035 it could claim a substantial share of the nation’s electrical capacity.
The scale is staggering. A typical AI-focused data center consumes as much electricity as 100,000 households annually, with the largest facilities using up to 20 times more. Most of that electricity powers servers built around advanced AI chips, which draw two to four times more energy than conventional ones. This surge is placing significant pressure on power grids, especially in regions with high concentrations of these facilities, such as Virginia and North Dakota.
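A rough back-of-envelope check puts the household comparison in perspective. The sketch below assumes an average U.S. household uses roughly 10,500 kWh of electricity per year (an assumed EIA-style figure, not given in the article) and converts the comparison into a facility's average power draw.

```python
# Sanity check of the "100,000 households" comparison.
# Assumption (not from the article): an average U.S. household uses
# roughly 10,500 kWh of electricity per year.
HOUSEHOLD_KWH_PER_YEAR = 10_500
HOURS_PER_YEAR = 8_760

def facility_mw_for_households(households: int) -> float:
    """Average power draw (MW) equivalent to the annual use of `households` homes."""
    annual_kwh = households * HOUSEHOLD_KWH_PER_YEAR
    return annual_kwh / HOURS_PER_YEAR / 1_000  # kW -> MW

if __name__ == "__main__":
    print(f"{facility_mw_for_households(100_000):.0f} MW")    # ~120 MW: a typical AI facility
    print(f"{facility_mw_for_households(2_000_000):.0f} MW")  # ~2,400 MW: the "20 times more" case
```

The 20x case works out to roughly 2.4 gigawatts, which lines up with the multi-gigawatt facilities discussed below.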
Global data center electricity demand is expected to double by 2030, surpassing Japan’s total electricity consumption today. By that time, AI could account for 35-50% of data center power use, up from today’s 5-15%. The largest facilities may soon require up to 2 gigawatts each, with larger campuses potentially consuming 5 gigawatts. Deloitte’s energy research team, led by Kate Hardin, has examined how utilities and operators can meet these unprecedented infrastructure challenges.
Despite these challenges, the industry is targeting 60% clean power by 2035, up from about 40% today. However, regulatory processes are delaying the integration of new renewable capacity and slowing this shift, so fossil fuels will likely remain necessary for data centers in the near term. These facilities also require extensive water for cooling, with major tech companies reporting significant year-over-year increases in water consumption.
Interestingly, the same technology driving this energy demand might help solve the problems it creates. AI can improve forecasting and scheduling of renewable generation, easing its integration into the grid. Data centers can also participate in demand response programs, adjusting their power use during peak times to help stabilize the grid during emergencies.
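To make the demand-response idea concrete, here is a minimal sketch of how an operator might decide how much load to shed when the grid is stressed. The workload names, the normalized grid-stress signal, and the threshold are illustrative assumptions, not details from the cited reports.

```python
from dataclasses import dataclass

# Hypothetical demand-response sketch: names, thresholds, and the grid-stress
# signal are illustrative assumptions, not from the cited sources.

@dataclass
class Workload:
    name: str
    power_mw: float
    deferrable: bool  # e.g., batch training jobs vs. latency-sensitive inference

def curtailable_load(workloads: list[Workload], grid_stress: float,
                     threshold: float = 0.8) -> float:
    """Return the MW of load the facility could shed if grid stress exceeds the threshold.

    `grid_stress` is a normalized 0-1 signal (price spike, emergency alert, etc.).
    """
    if grid_stress < threshold:
        return 0.0
    return sum(w.power_mw for w in workloads if w.deferrable)

if __name__ == "__main__":
    fleet = [
        Workload("training-batch", power_mw=60.0, deferrable=True),
        Workload("inference-api", power_mw=25.0, deferrable=False),
        Workload("checkpoint-archival", power_mw=5.0, deferrable=True),
    ]
    print(curtailable_load(fleet, grid_stress=0.92))  # 65.0 MW available to shed
```

The design choice here is simply to separate flexible work (training, archival) from latency-sensitive work (inference), since only the former can be paused or shifted when the grid needs relief.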
Other solutions include rethinking data center design for better efficiency, implementing dynamic systems that adapt to changing conditions, and fostering collaboration among engineers from different disciplines.
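One common way to quantify design efficiency, although the article does not name it, is power usage effectiveness (PUE): total facility energy divided by the energy delivered to IT equipment. A minimal sketch with assumed, illustrative numbers:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: total facility energy / IT equipment energy.

    A value of 1.0 would mean every kilowatt-hour goes to computing;
    modern facilities typically fall somewhere between roughly 1.1 and 1.6.
    """
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# Illustrative figures (assumptions, not reported data): a facility drawing
# 1,200 MWh in a month while its servers consume 1,000 MWh has a PUE of 1.2.
print(pue(1_200_000, 1_000_000))  # 1.2
```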
The growing power needs of AI present both a challenge and an opportunity. While these facilities strain our electrical infrastructure, they might also drive innovations that ultimately strengthen the grid through improved efficiency, strategic demand management, and accelerated renewable energy adoption.
References
- https://www.deloitte.com/us/en/insights/industry/power-and-utilities/data-center-infrastructure-artificial-intelligence.html
- https://www.pewresearch.org/short-reads/2025/10/24/what-we-know-about-energy-use-at-us-data-centers-amid-the-ai-boom/
- https://news.mit.edu/2025/responding-to-generative-ai-climate-impact-0930
- https://www.devsustainability.com/p/data-center-energy-and-ai-in-2025
- https://www.carbonbrief.org/ai-five-charts-that-put-data-centre-energy-use-and-emissions-into-context/
- https://www.gatech.edu/news/2025/09/03/ais-ballooning-energy-consumption-puts-spotlight-data-center-efficiency
- https://www.congress.gov/crs-product/R48646