Artificial intelligence (AI) is a technology that will continue to reshape the world for the foreseeable future. But before we see AI’s big effects on civilization, it will first have to remodel the IT industry.
And for the past couple of years this process has been accelerating. The first part of it is the reshaping of data centers, which are already undergoing a complete transformation in usage, capacity, construction, cooling and more. AI workloads carry a hefty price tag: someone has to pay for the substantially higher consumption of resources that were already scarce and at a premium. As a result, costs are going up, and naturally they are passed on to end users via more expensive services.
This trend will continue. Moody’s, for example, predicts that data center energy consumption will rise by an average of 43% per year through the end of 2028, with the majority of the increase driven by AI usage. AI will also drive up the average temperature inside data centers, which means more cooling, i.e. even more resources. Not to mention that the hardware itself will require more energy to crunch AI workloads, which are expected to become a lot more complex. And while developers are working on optimizing AI models to make them easier on the hardware, the sheer amount of data and computation needed means an inevitable rise in workloads.
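To put Moody’s figure in perspective, 43% annual growth compounds quickly. A minimal sketch (the 2024 baseline of 100 units is a hypothetical round number; only the growth rate comes from the projection):

```python
# Compound growth at 43% per year, per Moody's projection.
# The baseline value is hypothetical; only the 43% rate is from the article.
GROWTH_RATE = 0.43

def project_consumption(baseline: float, years: int) -> float:
    """Return consumption after `years` of 43% annual growth."""
    return baseline * (1 + GROWTH_RATE) ** years

baseline_2024 = 100.0  # hypothetical units (e.g. TWh)
for year in range(2025, 2029):
    print(year, round(project_consumption(baseline_2024, year - 2024), 1))
```

At that pace, consumption roughly quadruples (about 4.2×) between 2024 and the end of 2028, which is why the pressure on capacity and cooling is so acute.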
All of this means data centers are heating up in every possible way: there’s a lot of heat on operators to build and provide enough capacity to cover demand. The heat is rising in the competition between providers to attract clients, and between clients to secure the best locations and services. There’s heat inside the data centers, too. A lot of heat everywhere. And where there’s too much heat, there’s a risk of overheating and damage. Could this happen to the data center industry? Yes, if things aren’t handled the right way.
Data centers are becoming hotter
Currently there’s no specific data on how much AI workloads have contributed to data center rooms running at higher temperatures, DataCenterKnowledge reports. The general consensus is that it’s happening, but the extent isn’t clear. At a baseline, AI-related hardware tends to run at higher temperatures, and its different design makes it a bit more difficult to cool.
This isn’t a new discovery. Researchers from several European universities published a study in 2020 finding that NPUs (Neural Processing Units, one of the cornerstones of AI hardware) are much more difficult to cool, and thus these units “impose serious thermal bottlenecks to on-chip systems due to their excessive power densities”. This is because NPUs are generally smaller, yet they generate a lot of heat concentrated in specific areas. Increasing total cooling might therefore be overkill and lead to higher costs, while applying smaller, localized cooling makes the whole system more complex and, again, more expensive.
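The power-density problem the researchers describe can be illustrated with back-of-the-envelope numbers (all figures below are hypothetical, chosen only to show the effect, not measurements from the study):

```python
# Power density (W/mm^2) determines how concentrated the heat is.
# A small NPU die can dissipate less total power than a large GPU die
# yet still run hotter per unit area, which is what defeats
# uniform, facility-wide cooling. All numbers are illustrative.

def power_density(power_w: float, area_mm2: float) -> float:
    return power_w / area_mm2

npu = power_density(power_w=50.0, area_mm2=60.0)    # hypothetical small accelerator die
gpu = power_density(power_w=300.0, area_mm2=600.0)  # hypothetical large GPU die

print(f"NPU: {npu:.2f} W/mm^2, GPU: {gpu:.2f} W/mm^2")
```

The smaller die ends up with the denser hotspot, so adding more room-level cooling capacity does little for it; the heat has to be removed right at the chip.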
Thus AI data centers require a more flexible and innovative approach to cooling. One of the emerging solutions is liquid cooling, along with its “brother”, immersion cooling. Both tend to be more expensive to set up, but are proving quite effective for AI-related hardware. So it’s a worthwhile investment if a data center is going to focus on AI workloads.
This is another key change AI is bringing to data centers: differentiation. We used to think of data centers as largely interchangeable. Now AI is forcing different approaches, so facilities can focus on specific types of services, whether only for AI, mixed, or only for traditional workloads like data storage, cloud, hosting and colocation. This choice will decide what changes and upgrades each specific data center makes and will shape its future client base. A data center optimized for AI hardware might struggle to offer colocation space, and vice versa.
AI is forcing data centers to think
So, heat is building up, and it’s one of the main drivers of rising costs. But it can also be an opportunity. For example, data centers close to residential areas or business parks could look into strategies for reusing their waste heat, whether to warm spaces where people and goods need comfortable temperatures during colder months or even to feed specific manufacturing processes.
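A rough sense of the scale of that opportunity, assuming that nearly all electrical power drawn by IT equipment ultimately leaves the facility as heat (every number below is a hypothetical assumption for illustration):

```python
# Back-of-the-envelope heat-reuse estimate.
# All figures here are hypothetical assumptions, not from the article.

it_load_mw = 10.0       # assumed facility IT load
capture_fraction = 0.5  # assumed share of heat practically recoverable
home_demand_kw = 10.0   # assumed average heating demand per home

recoverable_kw = it_load_mw * 1000 * capture_fraction
homes_heated = recoverable_kw / home_demand_kw
print(f"~{homes_heated:.0f} homes could be heated from recovered heat")
```

Even with conservative capture rates, a mid-sized facility rejects enough heat to serve a meaningful district-heating load, which is why proximity to residential areas matters.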
Another option is to offer services that are more attractive to businesses than colocation, for example GPU-as-a-Service. This way existing GPUs in data centers can be utilized to their maximum potential within the same footprint, instead of having clients colocate their own hardware or rent entire servers that then sit underused much of the time, something quite “wrong” at a time when every MW of data center capacity is in demand.
Now is the time for data center operators to carefully examine all the ways they can maximize their resources, optimize costs and increase revenues, even beyond the obvious one: providing data center services.
AI is forcing regulators to think about data centers
Regulators aren’t oblivious to all of the aforementioned processes. In the US in particular the data center craze is causing concern in two very specific areas – the energy grid and water supply.
For example, regulators in the US are concerned that tech companies are consuming a lot more energy but aren’t paying enough, leaving homeowners and small businesses to pick up the increased costs, The Washington Post reports. How is that possible? Easy. In multiple states the energy grid is strained by rising data center energy consumption. This is driving up rates for everyone, not just tech companies. “A lot of governors and local political leaders who wanted economic growth and vitality from these data centers are now realizing it can come at a cost of increased consumer bills,” Neil Chatterjee, former chair of the Federal Energy Regulatory Commission, told the Washington Post.
Naturally, tech companies say they pay the same rates, although some do get preferential ones through long-term PPAs (power purchase agreements). Some utilities argue the increased prices benefit everyone, as the revenue funds much-needed improvements to a power grid that in many regions is in dire need of upgrades.
Despite that, customers are unhappy with their current bills, and those bills will continue to rise. The expected increase for 2025 is up to 20% for customers in Maryland, Ohio, Pennsylvania, New Jersey and West Virginia, and in 2026 prices might rise even more. Virginia’s biggest utility, Dominion Energy, projects that between 2024 and 2035, residential electricity prices will grow at three times the annual rate of the previous 16 years.
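Tripling an annual growth rate compounds into a much larger cumulative gap over Dominion’s 2024–2035 window. A sketch (the 2.5% historical rate is a hypothetical assumption; only the “three times” multiplier and the date range come from the projection):

```python
# Cumulative effect of prices growing at triple the historical annual
# rate over 2024-2035. The 2.5% historical rate is a hypothetical
# assumption; the 3x multiplier is from Dominion's projection.

historical_rate = 0.025           # assumed historical annual growth
projected_rate = historical_rate * 3
years = 2035 - 2024               # 11 years

historical_multiplier = (1 + historical_rate) ** years
projected_multiplier = (1 + projected_rate) ** years

print(f"at {historical_rate:.1%}/yr: {historical_multiplier:.2f}x")
print(f"at {projected_rate:.1%}/yr: {projected_multiplier:.2f}x")
```

Under these assumed rates, prices would roughly double rather than grow by about a third, which is why the projection alarms residential customers.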
Despite additional factors like inflation, raw materials prices, green energy prices, etc., many still point the finger at tech companies as the main cause of the price spike. “The power drain of companies like Google is enormous,” said Buddy Delaney, whose family has been manufacturing and selling custom mattresses in the Columbia, South Carolina, area for 96 years. “We don’t think small businesses like ours should be subsidizing special electricity rates for these companies that have billions of dollars in revenue.”
Delaney says this because Google has a deal with Dominion Energy to pay 6 cents per kWh for power, less than half of what residential customers pay and less than the rate for SMBs. Delaney calculates that his company pays $1,000 more per month than Google would pay for the same usage.
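The arithmetic connecting a per-kWh gap to a monthly bill difference is straightforward. In this sketch, the 6-cent figure and the $1,000/month difference come from the article; the SMB rate (and therefore the implied usage) is a hypothetical assumption:

```python
# How a per-kWh rate gap translates into a monthly bill difference.
# google_rate is from the reported Dominion deal; smb_rate is a
# hypothetical assumption for illustration.

google_rate = 0.06  # $/kWh, per the reported deal
smb_rate = 0.10     # $/kWh, hypothetical small-business rate

def monthly_difference(kwh: float) -> float:
    """Extra dollars per month an SMB pays vs. the negotiated rate."""
    return kwh * (smb_rate - google_rate)

# Usage at which the gap reaches Delaney's cited $1,000/month:
implied_kwh = 1000 / (smb_rate - google_rate)
print(f"{implied_kwh:.0f} kWh/month")
```

With a 4-cent gap, a business only needs to draw about 25,000 kWh a month for the difference to reach $1,000, well within reach of a small manufacturer.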
Google replies that every such deal happens after a review by regulators, and that the negotiated price fully covers the utility’s cost to serve the company. Google’s head of data center energy, Amanda Peterson Corio, said in a statement that the company is “working closely with our utility partners in all the communities where we operate to ensure our growth does not impact existing ratepayers.”
Dominion also says the deal with Google “covers the investments required to serve the project, including transmission lines and other facilities,” and it includes “terms to ensure other customers, such as residential and small businesses, do not unfairly incur additional costs.”
To prevent such a development, Ohio wants to create a rule requiring tech companies to pay for their energy upfront. Local utility AEP Ohio wants regulators to force tech companies to pay upfront for up to 85% of their projected energy use over a decade. This “insulates our other customers — including residents, small businesses, manufacturers and other industries — from the impact of the necessary infrastructure improvements”, says the provider. Naturally, tech giants oppose the idea.
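The scale of such an upfront commitment is easy to underestimate. A sketch with hypothetical facility numbers (only the 85% share and the ten-year horizon come from the AEP Ohio proposal):

```python
# Rough size of an 85%-of-a-decade upfront energy commitment.
# Facility capacity, utilization, and price are hypothetical
# assumptions; the 85% and 10-year figures are from the proposal.

capacity_mw = 100.0    # assumed facility capacity
utilization = 0.8      # assumed average utilization
price_per_mwh = 60.0   # assumed $/MWh
hours_per_year = 8760

annual_mwh = capacity_mw * hours_per_year * utilization
decade_mwh = annual_mwh * 10
upfront_mwh = decade_mwh * 0.85
upfront_cost = upfront_mwh * price_per_mwh

print(f"upfront commitment: ${upfront_cost:,.0f}")
```

Under these assumptions a single 100 MW facility would owe on the order of $350 million before drawing its first sustained load, which explains both why utilities like the idea and why tech giants resist it.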
There are similar disputes in other states as well. It’s clear that a more centralized solution is needed. Many end users and SMBs feel they are paying for grid improvements that are serving mainly or only tech giants. IT providers are saying they are covering their costs and aren’t a burden on others’ rates.
It’s the same story for water supply. Climate change and water management issues already create challenges for the water supply, but “AI and data centers are increasing the scale of the challenge,” said Rama Variankaval, global head of corporate advisory at JPMorgan and a lead contributor to the firm’s report on the future of water resilience in the US, in a statement shared with Quartz. Mishandling the impact on water stress “could cause real disruption to global supply chains,” the report said.
On the other hand, Willie Phillips, chairman of the Federal Energy Regulatory Commission (FERC), said artificial intelligence and related technologies hold “generational significance” and belong in the US. “We have that opportunity today with regard to data centers, and we should not surrender it,” Phillips said at a highly anticipated FERC technical conference about building data centers next to power plants, quoted by DataCenterKnowledge.
FERC Commissioner Mark Christie notes this has to be done the right way: “We want to serve data centers, absolutely, but we want to be fair to all the other consumers”. How that would happen is still up in the air. And that’s the main problem. Regulators seem to know there’s a problem, but don’t yet know how to fix it. And it’s an issue that has to be tackled fast, before it becomes even more serious.