Microsoft and Nvidia have joined forces to accelerate artificial intelligence (AI) development by integrating Nvidia’s Blackwell platform with Azure AI services. The collaboration brings several advancements, including the integration of Nvidia NIM microservices into Azure AI Foundry, Microsoft’s platform for building and deploying AI applications.
NIM packages AI models as pre-built, optimized inference microservices with standard APIs, allowing developers to deploy and integrate models more rapidly and efficiently. Microsoft has also unveiled the Azure ND GB200 V6 virtual machine (VM) series, built on Nvidia’s Blackwell architecture.
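Because NIM endpoints expose an OpenAI-compatible chat API, calling a deployed model takes only a few lines of client code. The sketch below assumes a NIM deployment reachable at a placeholder URL and uses an example model name; substitute the endpoint, key, and model from your own Azure AI Foundry deployment.

```python
# Minimal sketch: query a NIM-hosted model through its OpenAI-compatible API.
# The endpoint URL, API key, and model name below are placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="https://<your-nim-endpoint>/v1",  # assumed NIM endpoint
    api_key="<your-api-key>",
)

response = client.chat.completions.create(
    model="meta/llama-3.1-8b-instruct",  # example model; yours may differ
    messages=[
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "Summarize what NVIDIA NIM provides."},
    ],
)
print(response.choices[0].message.content)
```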
The series pairs Nvidia Quantum InfiniBand networking with the Nvidia GB200 NVL72, a liquid-cooled, rack-scale system built for high-performance AI workloads, and it works alongside Azure’s existing VMs based on Nvidia H100 and H200 GPUs. These new cloud instances are engineered specifically for demanding AI training and inference tasks.
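As a quick way to check whether the new sizes are offered in a given region, the Azure SDK for Python can enumerate VM sizes. The snippet below is only a sketch: the subscription ID and region are placeholders, and filtering on "GB200" in the size name is an assumption about how the SKU is labeled, so verify against your subscription.

```python
# Sketch: list VM sizes in a region and look for ND GB200 v6 SKUs.
# Requires azure-identity and azure-mgmt-compute. The subscription ID and
# region are placeholders; the "GB200" name filter is an assumption.
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

credential = DefaultAzureCredential()
compute = ComputeManagementClient(credential, "<subscription-id>")

for size in compute.virtual_machine_sizes.list(location="eastus"):
    if "GB200" in size.name.upper():
        print(size.name, size.number_of_cores, "cores,", size.memory_in_mb, "MB RAM")
```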
Another key development is the availability of Nvidia’s Llama Nemotron reasoning models, which are tuned for multi-step reasoning and problem-solving. That capability lets businesses build custom AI assistants tailored to their specific requirements.
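For models deployed through Azure AI Foundry, the azure-ai-inference client is one way to send a reasoning prompt. Everything below is a sketch: the endpoint, key, and deployment name are placeholders, and the system-prompt toggle for the model’s reasoning mode is an assumption to confirm against the model card for your deployment.

```python
# Sketch: chat with a Llama Nemotron deployment via the azure-ai-inference SDK.
# Endpoint, key, and deployment name are placeholders; the system prompt used
# to enable the model's "detailed thinking" mode is an assumption.
from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import SystemMessage, UserMessage
from azure.core.credentials import AzureKeyCredential

client = ChatCompletionsClient(
    endpoint="https://<your-foundry-endpoint>",
    credential=AzureKeyCredential("<your-api-key>"),
)

response = client.complete(
    model="<nemotron-deployment-name>",
    messages=[
        SystemMessage(content="detailed thinking on"),
        UserMessage(content="A warehouse ships 240 orders a day and each packer "
                            "handles 30. How many packers are needed, and why?"),
    ],
    temperature=0.6,
)
print(response.choices[0].message.content)
```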
These developments are already yielding tangible results. For instance, Epic, a healthcare software provider, is using the technology to enhance patient care and improve overall healthcare efficiency. For additional details, refer to the official Azure Blog.