Microsoft has said that its Azure cloud platform will be the first in the world to run NVIDIA's brand-new Blackwell GB200 AI servers.
Microsoft Azure shows off its NVIDIA Blackwell GB200-powered AI servers, claiming to be the first among all cloud platforms
Microsoft today showed off its newly built NVIDIA Blackwell GB200-powered server for the Azure AI cloud computing platform. Microsoft Azure's official account posted that it is the first cloud platform running GB200-powered AI servers for scaling advanced AI models.
Microsoft Azure is the 1st cloud running @nvidia's Blackwell system with GB200-powered AI servers. We're optimizing at every layer to power the world's most advanced AI models, leveraging Infiniband networking and innovative closed loop liquid cooling. Learn more at MS Ignite. pic.twitter.com/K1dKbwS2Ew
— Microsoft Azure (@Azure) October 8, 2024
Microsoft Azure offers its customers services such as virtual machines and AI processing for managing their applications. This lets users scale and upgrade their applications without owning the hardware themselves. By adopting the latest NVIDIA Blackwell B200 GPUs, Azure is enhancing the user experience with higher performance than ever.
The GB200-powered AI servers will use the flagship data center B200 GPUs, which are built on NVIDIA's Blackwell architecture and offer 192 GB of HBM3e memory. The GPU is a high-performance chip aimed at heavy workloads such as deep learning, training large AI models, and processing large datasets, while being more efficient than its predecessors.
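As a rough illustration of what that memory capacity looks like from a developer's point of view, the sketch below (assuming a PyTorch environment with CUDA support, which the article does not specify) simply queries the visible GPU's name and reported memory; on a B200-class accelerator the total should come out at roughly 192 GB.

```python
# Minimal sketch: query the accelerator visible to PyTorch and report its memory.
# Assumes a CUDA-enabled PyTorch build; the B200 figure in the comment is not from Azure docs.
import torch

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    total_gb = props.total_memory / 1024**3
    print(f"GPU 0: {props.name}, {total_gb:.0f} GB of device memory")
    # On a Blackwell B200-class GPU this should report on the order of 192 GB of HBM3e.
else:
    print("No CUDA device visible; run this inside a GPU-backed VM or container.")
```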
With B200 GPUs, AI models can be trained faster on Azure, helping it stay ahead of other cloud computing platforms. The picture shows a server rack housing several B200 GPUs, though it is unclear how many GPUs sit inside this server or how many of these racks the company has deployed so far.
The server is cooled by a liquid cooling solution to keep temperatures in check, which appears to be an initial test phase as Microsoft works out how to implement closed-loop liquid cooling for commercial servers.
Keep in mind that the server shown isn't the GB200 NVL72, which NVIDIA designed to combine 36 Grace CPUs and 72 Blackwell GPUs in a single rack. That rack can deliver up to 3,240 TFLOPS of FP64 Tensor Core performance and is set to power Taiwan's fastest supercomputer, being built by Foxconn.
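To put that rack-level figure in context, the short sketch below divides the quoted 3,240 TFLOPS across the 72 GPUs in an NVL72 rack; the per-GPU number it prints (about 45 TFLOPS of FP64 Tensor Core throughput) is derived from that arithmetic alone, not from an Azure or NVIDIA document cited here.

```python
# Back-of-the-envelope check on the GB200 NVL72 figure quoted above:
# 3,240 TFLOPS of FP64 Tensor Core throughput spread across 72 Blackwell GPUs.
RACK_FP64_TFLOPS = 3240
GPUS_PER_RACK = 72

per_gpu = RACK_FP64_TFLOPS / GPUS_PER_RACK
print(f"{GPUS_PER_RACK} GPUs x {per_gpu:.0f} TFLOPS = {RACK_FP64_TFLOPS} TFLOPS FP64 Tensor Core")
# -> 72 GPUs x 45 TFLOPS = 3240 TFLOPS for the full NVL72 rack.
```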
We recently reported on OpenAI showing off the DGX B200 system on X, and it looks like many more companies are joining the race to deploy NVIDIA's Blackwell chips.
News Source: @Azure
