Data Center Boom Drives Up Electricity Rates in Key U.S. States
Electricity costs are surging in regions with high concentrations of artificial intelligence data centers, straining consumers and sparking debate over the financial burden on local communities. Rates in Virginia, Illinois, and Ohio have risen sharply over the past year, fueled by the immense power demands of AI infrastructure.
Virginia, home to 666 data centers (the most in the U.S.), experienced a 13% increase in electricity rates as of August compared with the same period last year, according to CNBC reporting on November 16th. Illinois, with 244 data centers, saw a 15.8% jump, while Ohio, hosting 193 data centers, recorded a 12% increase. These increases are roughly two to three times the national average annual electricity rate increase of 5.1%.
The escalating costs are directly linked to the energy-intensive nature of training and running AI models. Large-scale data centers, now frequently announced at the gigawatt (GW) scale (roughly the output of a nuclear power plant, and enough to power about one million homes), are being built by major tech companies such as Amazon, Google, Microsoft, and Meta. Meta is planning a 1 GW ‘Prometheus’ data center in Ohio, and OpenAI, together with Oracle and SoftBank, is developing a data center complex in the same state under the ‘Stargate’ plan. Google, Microsoft, and Anthropic are also expanding their presence in these regions.
The rising rates have triggered a political response. Virginia Governor-elect Abigail Spanberger, elected November 4th, has pledged to ensure “large technology companies pay their share” of the increased costs. Johns Hopkins University researcher Abraham Silverman notes growing local resistance to further data center development, characterizing the backlash as a “techlash” and stating, “there is a response…that they no longer want data centers.”