The data center is changing to welcome artificial intelligence

31.05.2023

The era of artificial intelligence (AI) is upon us, according to Bill Gates. A few weeks ago, he famously proclaimed that the age of AI has begun, and by the looks of it he’s right. In fact, it may have started a while ago without anyone fully realizing it.

To the average person, AI is either something exotic or, at the other extreme, something regarded as simple. That’s because, for the most part, the only interaction most people have with AI is a service like ChatGPT. For the companies that run these services, however, AI is far more than that. It’s already a very complex undertaking, it’s only going to get more challenging, and it will change absolutely everything in IT.

This includes data centers, too. In fact, many of them are already changing, and companies are working hard on designing entirely new data center architectures specifically for artificial intelligence. Some of these designs are ready and have been shown off recently, with big names like Meta and others taking different approaches to their projects. Suddenly, data centers and hardware for AI have become a hot topic and a dynamic niche in the world of IT. Let’s explore some of the latest developments and see where the data center is heading.

The data center of the next generation

AI comes with a slew of specific requirements. The gist of them is simple though: “A lot of everything.” AI needs much more storage, a lot of computing power, and optimization of every detail with the same goal – maximum speed and efficiency. Sounds quite simple, right? Simply reimagine the entire data center architecture.

Meta is doing exactly that. Over the past few months, the IT giant has reportedly shifted its main focus quietly away from the metaverse and towards AI. Recently, the company even previewed its work so far on optimizing data center hardware and architecture specifically for AI.

This work includes shifting to liquid cooling for a “significant percentage” of its AI hardware, as well as entirely new ASIC chips designed to handle heavy AI workloads. The company showed off its work during the AI Infra @ Scale conference and seemed pretty proud of its accomplishments.

“We’ve been building advanced infrastructure for AI for years now, and this work reflects long-term efforts that will enable even more advances and better use of this technology across everything we do,” said Meta CEO Mark Zuckerberg, quoted by DataCenterFrontier.

“We’re reimagining everything we do about IT infrastructure for AI. We’re creating data centers that are specific for AI. We’re creating new hardware, including our own silicon. Thousands of engineers are innovating on this large-scale infrastructure that’s built specifically for AI,” said Aparna Ramani, VP of Engineering, Infrastructure at Meta.

Meta says that thanks to this new work, it will be able to build data centers faster and cheaper, saving up to 31% compared to the current design. The new data centers will also use “greener” materials, such as concrete with a lower carbon footprint. The long-term goal is to be water positive and net zero on emissions.

AI requires far more resources

Achieving the sustainability goals is going to be a challenge, considering AI will require a lot more resources. “Meta’s AI compute needs will grow dramatically over the next decade as we break new ground in AI research, ship more cutting-edge AI applications and experiences for our family of apps, and build our long-term vision of the metaverse,” writes Santosh Janardhan, Head of Global Infrastructure at Meta.

So, the company is already planning for “roughly 4X scale.” Meta currently has 21 data center campuses around the world, costing more than $16 billion. These campuses often feature multiple data center buildings – 80 in total – and Meta plans to double that number to 160 by 2028. Of course, each new data center will be more powerful, and the designs will continue to evolve.

One of the new creations is the MTIA chip (Meta Training and Inference Accelerator). It’s an ASIC chip customized for AI. Meta says MTIA will in fact be twice as efficient as the GPUs used in most AI infrastructures. But there’s another challenge: it will generate more heat, so it will also require “new approaches” to the overall data center design, which are also in the works. The goal is to start implementing them by 2025.

Meta also expects that AI chips will consume more than five times the power of its current typical CPU servers. Of course, the heat output will increase as well. So, to achieve both peak performance and maximum energy efficiency, Meta is planning to go for liquid cooling, rolled out in two phases. The first phase will use air-assisted liquid cooling with cold plates. This approach is meant for the existing data center halls, as it doesn’t require installing additional piping.

The second phase is tied to the new data center design coming in 2025. It will use a custom liquid cooling solution that combines a cold plate with liquid running through it with immersion of both the plate and the processor in additional coolant. Meta plans to gradually expand this solution and scale it “as we need” in order to lower costs and accelerate deployments.

AI needs power, too

Obviously, all these changes to the architecture will also require more power. Meta has reworked its new design to make it simpler to deliver power to the server rack in a more efficient way. The aim is to eliminate as much equipment as possible along the power distribution chain.

One of the solutions is reducing the switchgear that creates bottlenecks. As a result, server rack density can be increased with minor modifications, says Meta. The new design also factors in that liquid cooling will require more power, as “we can’t just open our windows and rely on free air cooling anymore.”

The IT giant is also fully aware that there will be more challenges and changes. “We’re in the middle of a pivot to the next age of information. AI workloads are growing at a pace of 1,000x every two years,” says Alexis Bjorlin, VP of Engineering Infrastructure at Meta. She adds that generative AI workloads are going to require a far bigger scale. “Whereas traditional AI workloads may be run on tens or hundreds of GPUs at a time, the generative AI workloads are being run on thousands, if not more,” she says.

AI helps change the data center

So far it may seem that AI only demands changes from data centers without giving anything back. That’s not the case, notes DataCenterKnowledge. In fact, artificial intelligence already plays, and will continue to play, an important role in the next generation of data centers.

For example, it will help improve the physical security of the facility. AI can analyze data and video feeds to detect physical intrusions, malfunctions or other issues at the earliest stages of an incident. Better yet, it could also identify potential risks and alert on-site personnel so they can be ready. AI won’t replace on-site employees, but it will greatly help them.

AI will help with incident response, too. A lot of incidents can happen at a data center: power failures, fires, overheating, cooling failures, various cyberattacks, physical attacks or simply human error. AI can constantly monitor various data sources and assess situations in real time, helping operators pinpoint possible risks and offering solutions when they are needed. It could even automate some of the responses and act much faster.
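To make that idea concrete, here is a minimal, purely illustrative sketch in Python of how a monitoring loop might flag a sudden temperature jump using a rolling z-score. The sensor readings, window size and threshold are invented for the example; this is not Meta’s or any vendor’s actual system.

```python
from collections import deque
from statistics import mean, stdev

# Illustrative only: a rolling z-score detector for rack-inlet temperature.
# The sensor feed, window size and threshold are assumptions, not a real product.
class TemperatureAnomalyDetector:
    def __init__(self, window_size: int = 60, z_threshold: float = 3.0):
        self.readings = deque(maxlen=window_size)  # last N temperature samples
        self.z_threshold = z_threshold

    def observe(self, celsius: float) -> bool:
        """Record a new reading and return True if it looks anomalous."""
        anomalous = False
        if len(self.readings) >= 10:  # wait for some history before judging
            mu, sigma = mean(self.readings), stdev(self.readings)
            if sigma > 0 and abs(celsius - mu) / sigma > self.z_threshold:
                anomalous = True
        self.readings.append(celsius)
        return anomalous

# Usage: feed the detector a stream of readings from a hypothetical sensor.
detector = TemperatureAnomalyDetector()
for reading in [22.1, 22.3, 22.0, 22.2, 22.1, 22.4, 22.2, 22.3, 22.1, 22.2, 31.7]:
    if detector.observe(reading):
        print(f"Possible cooling incident: rack inlet at {reading} °C")
```

The same pattern applies to any telemetry stream: keep a short window of recent values, compare each new reading against it, and escalate to a human operator when the deviation is large enough.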

Artificial intelligence will also help with energy management. It can analyze energy consumption, identify areas that can and should be optimized, and even suggest solutions. It can also help predict usage spikes, manage energy sources and improve overall consumption.
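As a simplified sketch of what such energy analytics could look like (again purely illustrative, with invented numbers and a hypothetical capacity budget), the example below averages historical load per hour of the day and flags the hours likely to exceed that budget. A production system would rely on far richer models and real telemetry.

```python
from collections import defaultdict
from statistics import mean

# Illustrative only: forecast hourly power draw from historical averages and
# flag hours likely to exceed a hypothetical capacity budget (in kW).
def forecast_spikes(history, capacity_kw):
    """history is a list of (hour_of_day, power_kw) samples from past days."""
    by_hour = defaultdict(list)
    for hour, kw in history:
        by_hour[hour].append(kw)
    forecast = {hour: mean(samples) for hour, samples in by_hour.items()}
    return {hour: kw for hour, kw in forecast.items() if kw > capacity_kw}

# Usage with made-up numbers: a few hourly readings from two past days.
history = [
    (9, 410.0), (9, 420.0),    # morning ramp-up
    (14, 515.0), (14, 530.0),  # afternoon peak
    (23, 300.0), (23, 310.0),  # overnight low
]
print(forecast_spikes(history, capacity_kw=500.0))  # -> {14: 522.5}
```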

On the same note, AI can and will help data center operators manage and optimize their capacity, both physically and within the servers. The algorithms will be faster and will present operators with more options and probabilities, so they can make their decisions on time.

Finally, a lot has been said about AI soon being able to develop itself: not only write code but self-improve and evolve. The same will happen with the hardware AI needs. Artificial intelligence will help develop the hardware it needs to run even better. The demand for AI-specific hardware and data centers is only starting to grow, and it’s a natural step that AI itself will help meet that demand and even exceed expectations. It will improve everything, including the AI services themselves. And data centers will continue to be the foundation for all of it.

This will be beneficial to the cloud in general, as well. Pretty much all of the big cloud platforms already offer some sort of AI services for their clients. Their portfolios will only continue to get bigger and bigger. The competition is going to be fierce, and it will drive even more investments into data center architecture and optimizations to tailor the facilities for AI.
