Red Hat refocuses on AI, the cloud, and open standards  

11.06.2025

Red Hat Enterprise Linux has been among the most popular Linux distributions for quite some time now, and for good reason. It has become a go-to choice for many companies, and Red Hat's open-source offerings are key to their work processes.

Now the world is entering the era of AI, which means big changes for everything and everyone, including the entire IT industry. So Red Hat, the company behind the software, has to adapt and prepare for the rise of artificial intelligence. That was especially evident at Red Hat Summit 2025, held in the second half of May 2025.

Its key message this year was AI, cloud, and open source. Along with everything related – IT infrastructure, AI servers, inference, and more – Red Hat wants to show that the promise of AI can be defined by open source, all while keeping Linux at the center of the company's work and maintaining its position that the open hybrid cloud is the gold standard for enterprises.

Meet the Red Hat AI Inference Server 

One of the key announcements during the Summit was the new Red Hat AI Inference Server, which lets companies run generative AI applications faster and more efficiently. The server is software built on the open-source vLLM project and uses technology from Neural Magic, Red Hat's latest acquisition. Its main goal is to compress trained AI models so they run more efficiently. Another benefit is more efficient use of processor memory, which should enable faster inference across hybrid cloud environments.

“AI puts a lot of stress on computing systems, and with the advent of AI agents, it will put a lot more stress in the future. Red Hat is saying they want to help you optimize your investments. As you go from model building to embedding it to your business processes or customer experiences, they will do everything they can at the software level to make sure you get maximum performance,” Rick Villars, IDC’s group vice president of worldwide research, told Data Center Knowledge. 

The Red Hat AI Inference Server will make generative AI models respond faster and can handle more simultaneous users without the need for additional hardware, says the company. It works on AMD and Nvidia GPUs as well as Intel Gaudi AI accelerators and Google TPUs, ensuring broad hardware compatibility across multiple configurations. Naturally, the server also works with a slew of AI models – DeepSeek, Google Gemma, Meta's Llama, Mistral, Microsoft Phi and plenty of other LLMs.

“Pre-optimized models running on vLLM often deliver two to four times more token production – so a much higher level of efficiency,” said Brian Stevens, Red Hat’s senior vice president and AI chief technology officer, during a media briefing. 

The AI Inference Server is the company's implementation of vLLM and can be deployed as a standalone containerized offering or as an integrated component of Red Hat's AI software portfolio, alongside Red Hat Enterprise Linux AI, OpenShift AI and others.
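As a rough sketch of what the standalone containerized deployment might look like – note that the registry, image name, and model identifier below are illustrative assumptions, not details confirmed by Red Hat:

```shell
# Hypothetical deployment sketch: pull and run a containerized vLLM-based
# inference server with podman. Image and model names are placeholders.
podman pull registry.example.com/rhaiis/inference-server:latest

# Serve a model on port 8000, passing GPUs through to the container
# (podman's CDI device syntax for Nvidia GPUs).
podman run --rm -p 8000:8000 --device nvidia.com/gpu=all \
  registry.example.com/rhaiis/inference-server:latest \
  --model example-org/example-llm \
  --port 8000
```

In practice the container would expose an inference API endpoint that applications call over HTTP; consult Red Hat's documentation for the actual image references and launch flags.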

“Openness leads to flexibility, and flexibility leads to choice. And that’s what enterprises want: choice. … While it feels like everything around you is changing, it’s always good to have something stable to stand on. Red Hat AI is the stable foundation for your enterprise AI,” Chris Wright, Red Hat CTO and senior vice president of global engineering, said during his speech. 

He says a lot of models are gaining popularity, which, of course, is a good thing. Enterprises want the freedom to select model sizes and data, and to choose where to run their workloads and on which hardware. So the AI Inference Server arrives at a key moment, supporting the company's next big targets – virtualization and open source.

AI and virtualization are the next big moments for open source 

“Open source at its core removed barriers. It just unlocked human potential worldwide. … AI can unlock human and business potential the same way open source did. Because, at its core, AI removes barriers. While we might be in this moment of uncertainty between worlds, it’s up to us with that same bold spirit and principles that realized open source to start building these bridges,” said Matt Hicks, CEO of Red Hat during his keynote at the Summit.  

Virtualization will play a big role in achieving this goal. Phil Guido, executive vice president and chief commercial officer for AMD, says that moving from legacy VMware to a modern AMD-powered Red Hat OpenShift setup can bring operational expenditure savings of up to 77% and reduce energy consumption by more than 71%, CRN notes.

According to Red Hat's senior vice president and chief product officer, Ashesh Badani, the business world is fully aware of this, driving an "overwhelming demand" for the company's virtualization and hybrid cloud wares. For example, OpenShift Virtualization has nearly tripled its customer count.

“Last year, whether you liked it or not, most of you were given a virtualization price increase that you didn’t ask for. You faced a virtualization future that became very uncertain. We told you that Red Hat wanted to be your virtualization and hybrid cloud provider for the future. … We’re reaching every corner of the world … The rate and pace and change of technology is absolutely amazing and unrelenting. But one characteristic that’s remained consistent is that the open solution has ultimately prevailed. With operating systems in Linux, open won. With containers, Kubernetes and hybrid cloud, open won. And with virtualization and AI, we’re on track to have open win. When open wins, we all win. The open foundation we’ve laid for the hybrid cloud will be the underpinning of openness for AI,” Badani said. 

AI and open source can benefit from each other. AI gains the additional freedom and flexibility to adapt without the fear of vendor lock-in, and open source can ride the AI wave to showcase its capabilities and benefits to the enterprise world.

During the Summit, Red Hat also introduced llm-d, an open-source project to enable inference at scale using Kubernetes orchestration and AI-aware network routing techniques. CoreWeave, Google, IBM Research, and Nvidia will contribute code to the project, with AMD, Cisco, Intel, Lambda, Meta, Mistral AI and others also joining. "The state-of-the-art is actually moving to distributed inference. What that means is your first tokens of your inference may be handled by specialized nodes and your second and ongoing tokens could be handled by different nodes optimized for serving those outputs," says Stevens.

Virtualization growth 

All key players in the industry are supporting virtualization, and this is one of the main reasons for its growth in demand and usage. Google Cloud, Microsoft Azure and Oracle Cloud Infrastructure are all making Red Hat OpenShift Virtualization available. The software is now reaching general availability on Amazon Web Services and IBM Cloud, as well, the company noted.  

“Customers, when they are choosing their next-generation virtualization platform, want to go wherever their infrastructure choice leads them, and we had to really hone out and build out those relationships with our cloud providers,” said Mike Barrett, vice president and general manager of Red Hat’s Hybrid Cloud Platforms, in a media briefing. 

“A lot of customers who have Red Hat OpenShift also have VMware vSphere, so Red Hat already has a foothold. Red Hat is trying to take advantage of the fact that, ‘You know us as a good partner. We’re going to help you with the migration, and we’re going to make the migration as easy as possible for you,’” Jim Mercer, IDC’s program vice president of software development, DevOps and DevSecOps, said, quoted by Data Center Knowledge.

It’s time for Red Hat Enterprise Linux 10 with Lightspeed

Another key announcement during Red Hat Summit 2025 was Red Hat Enterprise Linux 10. Naturally, it focuses on AI, but it also builds in enhanced security and simplicity. The new version aims to meet the demands of the hybrid cloud and the transformative power of AI, and Red Hat is adamant that the platform is more than just an iteration.

It’s “engineered to empower enterprise IT and developers to not just manage the present, but to architect the future,” said Gunnar Hellekson, vice president and general manager of Red Hat Enterprise Linux. It comes with generative AI management tools and several security enhancements, and it is so forward-looking that it even includes protections against quantum computing threats, making it the first enterprise Linux distribution to feature FIPS-compliant post-quantum cryptography. “Red Hat Enterprise Linux 10 provides the robust and innovative foundation needed to thrive in the era of hybrid cloud and AI,” Hellekson said, quoted by Channel Futures.

The distribution features Red Hat Insights to help manage the platform. Insights is also integrated into Red Hat Satellite 6.17, which can surface recommendations even without an internet connection. An improved ‘image mode’ feature allows the OS to be built and deployed as a bootable container image.
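With image mode, the operating system is defined in a Containerfile and built much like an application image. A minimal sketch might look like the following – the base image reference is an assumption based on Red Hat's bootc tooling, so check Red Hat's registry for the real one:

```dockerfile
# Hypothetical Containerfile for RHEL image mode.
# The base image name below is an assumed placeholder.
FROM registry.redhat.io/rhel10/rhel-bootc:latest

# Layer packages and configuration onto the bootable OS image,
# just as you would for an application container.
RUN dnf -y install httpd && dnf clean all
RUN systemctl enable httpd
```

The resulting image would be built with a standard container build tool and then deployed or updated like any other bootable container image, which is what makes the workflow familiar to teams already using containers.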

Red Hat Enterprise Linux 10 also gets Lightspeed, which lets IT admins use natural language to get assistance with everything in the OS – from troubleshooting issues to managing complex environments. From June 2025, Red Hat will also release OpenShift Lightspeed.

“It helps to address the skill gap that we and our customers face with respect to the Linux skill set. What we are effectively doing is leveraging the AI playbook. Basically, we have a retrieval augmented generative app in the command line and you can effectively type in commands in plain English to troubleshoot issues,” Raj Das, Red Hat Senior Director of Product Management, said. 

These and many more announcements from Red Hat Summit 2025 show the company is more than ready for the AI era. It’s positioning itself as a top contender for enterprises by offering exactly what they need: open-source freedom and the latest technologies on tap as they move into a new age of technology and opportunity.
