How will AI improve data centers?

20.09.2023

Artificial Intelligence (AI) has been the biggest topic in the IT industry in 2023, and according to many reports, it is also the driving force behind data center demand. We’ve explored how data centers are changing to accommodate AI, but let’s now also look at how AI will actually change data centers.

Change must happen. According to a new report by JLL, AI workloads and the cloud will drive “explosive demand” for data centers in the near future. In fact, the demand will be so big that it will lead to a shortage of colocation space and rising prices. JLL says the current supply of data center capacity is not enough to meet demand, as several key industries are starting to compete with each other for it, specifically finance, healthcare, and cloud service providers, among others.

“We’ve never been in a situation with such a supply and demand imbalance. A year-and-a-half ago, we had a healthy vacancy rate in most markets around the U.S. That meant we could pretty much find a home for all the requirements that are out there, and that has completely changed,” Curt Holcomb, Executive Vice President and co-lead of JLL’s data center markets practice, told DataCenterKnowledge.

When the “AI craze” hit, a wave of AI companies started competing for data center resources alongside traditional enterprise customers, says Holcomb. This quickly led to a capacity shortage and rising rental prices. And the fact that a lot of data centers are being built right now won’t really solve this. Why? Because most of the new supply that will be delivered in 2023, 2024, and possibly most of 2025 is already pre-leased and under exclusivity agreements. As a result, only a small amount of the newly built capacity will actually reach the market.

“Our advice to our clients is that you need to think about your requirements and think about timing. If you think you’re going to need capacity at the end of 2024, you better be in the market looking today because you’re competing with all the other users that need space as well. So, you need to get in line. You need to figure out what provider you like, and then you need to cut your deals before they even break ground on their buildings,” Holcomb says.

AI to the rescue

So, the world needs more data centers. That means many more buildings, more resources, more greenhouse gas emissions, and other potential challenges. Data centers and cloud computing operations already consume a lot of energy and generate a growing amount of emissions. Data center operators are constantly searching for ways to reduce their energy consumption and to become carbon neutral or even carbon negative.

This requires effort not only on the power supply and cooling fronts; it must also cover automation, maintenance, security, and basically every other aspect of the data center. Some of the solutions are already in use, while others are being developed or are still at the concept stage. One thing is for sure: AI is already part of many data centers and will continue to increase its influence on them.

Gartner predicts that by 2025, AI will be operating in half of all data centers in the world. The main goal of AI in the data center will be the obvious one: to find and correct inefficiencies and to optimize resource usage as much as possible. With data center workloads expected to grow by 20% every year for the next few years (a rate that compounds to roughly 2.5 times today’s load within five years, since 1.2^5 ≈ 2.49), honing data center operations is now an urgent task. However, it won’t be as easy as taking an algorithm and tasking it with improving the parameters it has been given, InformationWeek notes.

It all begins with the data

AI can be very useful and can deliver remarkable insights and ideas, but only if it has proper data to rely on. Data is both AI’s greatest strength and its greatest weakness. Data quality has a massive impact on how an algorithm will perform and what ideas and solutions it will produce.

Naturally, the same holds for data center operations, too. If we want AI to make data centers even better, we need to provide the best data we can. Unsurprisingly, it turns out that’s not so easy. Certain types of data haven’t been useful up until now and simply haven’t been collected, says InformationWeek. Other data hasn’t been collected properly or stored long enough.

As a result, most data center operators are better off starting data collection from scratch. That means they will need time to gather samples that are both large enough and cover an extended period, so that the AI can learn as much as possible.

While it’s easy to simply say “collect any and all data,” that approach carries a risk of “polluting” the data sample with redundant information, aka “noise.” So, how do you pick the right data? Sadly, there’s no single right answer. It can take some trial and error to pinpoint the right data items for a given data center’s configuration, setup, and goals.
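To make that concrete, here is a minimal sketch of one way to trim such noise from collected telemetry before training: dropping near-constant columns and one of every near-duplicate pair. It assumes the readings sit in a pandas DataFrame; the file name and thresholds are hypothetical illustrations, not tuned recommendations.

```python
# Minimal sketch: prune redundant "noise" from telemetry before training.
# Assumes readings are in a pandas DataFrame; all names and thresholds
# below are illustrative, not recommendations.
import pandas as pd

def prune_noisy_features(df: pd.DataFrame,
                         min_variance: float = 1e-6,
                         max_correlation: float = 0.98) -> pd.DataFrame:
    """Drop near-constant columns and one of every near-duplicate pair."""
    # Near-constant sensors carry almost no signal for the model.
    variances = df.var(numeric_only=True)
    df = df.drop(columns=variances[variances < min_variance].index)

    # Highly correlated sensor pairs are mostly redundant.
    corr = df.corr(numeric_only=True).abs()
    to_drop: set[str] = set()
    cols = list(corr.columns)
    for i, a in enumerate(cols):
        if a in to_drop:
            continue
        for b in cols[i + 1:]:
            if b not in to_drop and corr.loc[a, b] > max_correlation:
                to_drop.add(b)  # keep the first of each redundant pair
    return df.drop(columns=list(to_drop))

readings = pd.read_csv("telemetry.csv")  # hypothetical export
trimmed = prune_noisy_features(readings)
print(f"kept {trimmed.shape[1]} of {readings.shape[1]} columns")
```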

The absolutely necessary data covers the obvious items: available storage, number of servers, server configurations, CPUs (Central Processing Units), the number of machines running at a given time, traffic volumes, etc. All data related to power consumption and cooling is also an absolute must. For that, you may also want to include data about inside and outside conditions: temperatures, weather, and so on. All of this influences data center operation on a continuous basis, so the AI will have to know about it.
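As a rough illustration, a per-interval snapshot covering those points might look like the sketch below. Every field name is invented for illustration rather than taken from any standard schema, and the PUE estimate is a crude approximation that treats everything except cooling as IT load.

```python
# One possible shape for a per-interval telemetry snapshot; all field
# names are illustrative, not a standard schema.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class DataCenterSnapshot:
    timestamp: datetime
    servers_online: int          # machines running at this moment
    cpu_utilization_pct: float   # fleet-wide average CPU load
    storage_free_tb: float       # remaining usable storage
    traffic_gbps: float          # aggregate network throughput
    power_draw_kw: float         # total facility power consumption
    cooling_load_kw: float       # power spent on cooling alone
    inlet_temp_c: float          # average server inlet temperature
    outside_temp_c: float        # outside weather conditions

    @property
    def pue_estimate(self) -> float:
        """Crude PUE: total power over IT power (total minus cooling)."""
        it_power = self.power_draw_kw - self.cooling_load_kw
        return self.power_draw_kw / it_power if it_power > 0 else float("inf")
```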

“In order to be able to build a proper machine learning AI system, you would need all of that to really dial in the efficiencies. All of that matters. Every one of those data points can skew the other,” says Eric Swartz, VP of engineering for DataBank. This is why it’s important to pinpoint the exact data you need. You may even use AI to help you hone what data you do and don’t need to collect. But as a rule: the more, the better.

Simulations and digital twins

The main goal for AI is to improve data center operations, right? But you don’t have to test its ideas and solutions by trial and error under real-world conditions. Creating a digital twin of an asset is an increasingly popular approach for many goals, including improving and optimizing data centers.

The digital twin should be a virtual representation of the asset, in this case the data center, that is as close to reality as possible. It is a great way to help the AI gather data and learn, but also to test its ideas and optimizations safely, without risking disruption to actual operations. The digital twin also allows you to test different conditions: increased workloads, weather changes, and so on. And it can help you pinpoint what other data you need to collect from the real data center to further improve the AI’s work.
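To illustrate the idea, here is a toy stand-in for such a twin: a crude hourly heat-balance model of a single hall, used to trial different cooling budgets against a hot day without touching the production floor. The model and every constant in it are invented for illustration; a real twin would be vastly more detailed and calibrated against the actual facility.

```python
# Toy stand-in for a digital twin: a crude thermal model of one hall.
# All coefficients and loads below are invented for illustration.
def simulate_hall_temp(it_load_kw: float, cooling_kw: float,
                       outside_temp_c: float, hours: int = 24) -> list[float]:
    """Step a naive hourly heat-balance model forward in time."""
    temp_c = 24.0  # assumed starting room temperature
    history = []
    for _ in range(hours):
        heating = 0.02 * it_load_kw                 # IT load warms the hall
        cooling = 0.04 * cooling_kw                 # CRAC units remove heat
        leakage = 0.5 * (outside_temp_c - temp_c)   # envelope gains/losses
        temp_c += heating - cooling + leakage
        history.append(round(temp_c, 2))
    return history

# Trial different cooling budgets against a 35 °C day, entirely offline.
for cooling_kw in (150, 200, 250):
    temps = simulate_hall_temp(it_load_kw=500, cooling_kw=cooling_kw,
                               outside_temp_c=35.0)
    print(f"{cooling_kw} kW cooling -> peak {max(temps)} °C")
```

The point is not the physics, which is deliberately naive here, but the workflow: candidate settings are evaluated against the model first, and only the winners ever touch the real data center.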

Creating a digital twin of a data center requires consistent, continuous effort from your team. The twin has to be kept up to date with the real data center, and its parameters have to be refined to stay realistic. One of the major challenges with digital twins is that an unrefined model can start offering solutions that aren’t possible in the real world.

But if you manage to keep the digital twin a true twin and keep its parameters realistic, you gain a big asset. The twin will allow you to test various approaches for your data center and improve every aspect of it, including the most important ones: cooling, power consumption, security, and maintenance.

The hidden challenges

Of course, implementing AI to oversee and optimize your data center, and then turning its ideas into reality, may open up a whole new world of challenges. It is not just about data collection and analysis.

It is also important to keep the lines of communication open with tenants, clients, and even other data centers. Data centers serve a substantial number of clients from various industries, and those clients need to be informed beforehand about the AI and its operations. Privacy agreements with them will also have to be honored by the AI, so gathering certain client data may be off limits for it to analyze and optimize. This can create gaps in the model that have to be accounted for by other means.

Of course, what AI models are capable of now pales in comparison to what they will be able to do in five or more years’ time. At least that’s the hope. Right now, algorithms can help with some of the basic tasks and forecasts, but they won’t be of much use for unforeseen anomalies and other sudden issues or emergencies.
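As an example of the kind of “basic task” today’s algorithms do handle well, here is a naive moving-average forecast with a simple deviation alarm on power readings. The window size, tolerance, and sample values are all arbitrary illustrations; note that a slow drift or a genuinely novel failure pattern would slip right past such a check, which is exactly the limitation described above.

```python
# Naive moving-average forecast with a deviation alarm; window size and
# tolerance are arbitrary illustrations, not tuned values.
from collections import deque

def watch_power_draw(readings_kw, window: int = 12, tolerance: float = 0.15):
    """Yield (reading, expected, is_anomaly) once the window is full."""
    recent = deque(maxlen=window)
    for reading in readings_kw:
        if len(recent) == window:
            expected = sum(recent) / window
            deviation = abs(reading - expected) / expected
            yield reading, round(expected, 1), deviation > tolerance
        recent.append(reading)

# A sudden spike stands out against the recent average.
samples = [500, 505, 498, 502, 510, 495, 503, 499, 507, 501, 504, 500, 720]
for value, expected, alarm in watch_power_draw(samples):
    print(f"{value} kW (expected ~{expected}) {'ALARM' if alarm else 'ok'}")
```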

Despite that, AI already has a major role in the present and future of data centers. As such, data center operators must start adapting to it. They have to view AI as a tool and figure out how to use it to the best advantage for their needs. This will shift responsibilities for both the AI and the human teams, and these shifts won’t be one-and-done; they will have to happen continuously as AI and data centers evolve.

And let’s be honest, it will take time before AI earns humans’ full trust, especially in a business environment. Misconfigurations, bad or “polluted” data, wrong assumptions, and biases can lead AI down some very inefficient roads. As such, human oversight will be needed for a long time to come before we can safely allow AI to take control of anything, including data centers. But it can, and should, be a great assistant.
