World Today News
  • Home
  • News
  • World
  • Sport
  • Entertainment
  • Business
  • Health
  • Technology
Friday, March 6, 2026
Copyright 2021 - All Right Reserved
Tag: NVIDIA DGX

Technology

Lilly & NVIDIA Launch AI Factory to Accelerate Drug Discovery | NVIDIA Blog

by Rachel Kim – Technology Editor February 27, 2026

INDIANAPOLIS – Eli Lilly and Company has activated LillyPod, the pharmaceutical industry’s most powerful supercomputer, built in collaboration with NVIDIA. The system, powered by 1,016 NVIDIA Blackwell Ultra GPUs, is now operational at Lilly’s Indianapolis campus, marking a significant investment in artificial intelligence for drug discovery and development.

The launch, announced Wednesday, culminates a partnership between Lilly and NVIDIA that began in October 2025 with the goal of creating an “AI factory” capable of managing the entire AI lifecycle, from data ingestion to model training and deployment. LillyPod delivers more than 9,000 petaflops of AI performance, assembled in just four months, according to Lilly officials.

“It’s a big day for us with the supercomputer coming on board, but it’s a day 150 years in the making,” said Diogo Rau, executive vice president and chief information and digital officer at Lilly. “LillyPod is a powerful symbol of who we are and why we do this: to make life better for people around the world. We are, right here, right now, at the right moment to advance biology in a way that has just never been done before.”

The supercomputer will be utilized across a range of scientific disciplines, including genomics, molecule design, single-cell biology, imaging, and manufacturing operations. Lilly’s genomics team will leverage LillyPod’s capabilities to analyze 700 terabytes of data using over 290 terabytes of high-bandwidth GPU memory. Thomas Fuchs, senior vice president and chief AI officer at Lilly, emphasized the necessity of such computational power, stating, “Computation is at the heart of biology and it is at the heart of science. Being able to compute at scale is not something optional for a company like ours, it is absolutely necessary. So we are building the computational future of medicine.”

LillyPod is designed to support the training of complex AI models, including protein diffusion models, small-molecule graph neural network models, and genomics foundation models. NVIDIA’s full-stack AI factory architecture, incorporating accelerated computing, NVIDIA Spectrum-X Ethernet networking, and optimized AI software, provides a secure and scalable platform for the highly regulated healthcare and life sciences sector. NVIDIA Mission Control software will manage the DGX SuperPOD, orchestrate workloads, monitor performance, and automate AI operations.

The infrastructure consists of nearly 5,000 connections built with over 1,000 pounds of fiber cables. Lilly has committed to powering its new AI infrastructure with 100% renewable electricity by 2030, utilizing efficient liquid cooling to minimize energy impact.

Lilly plans to make select models available through Lilly TuneLab, an AI and machine learning platform offering biotech companies access to drug discovery models built on proprietary Lilly data, generated at a cost exceeding $1 billion. TuneLab will also offer NVIDIA BioNeMo open foundation models for healthcare and life sciences, utilizing a federated learning infrastructure built on NVIDIA FLARE to ensure data privacy.
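The federated setup mentioned above can be sketched in miniature: each participating site trains on its own private data and shares only model weights with an aggregator, so raw data never leaves the site. The following is a toy federated-averaging illustration of that general idea, not NVIDIA FLARE's actual API; all names and the training step are hypothetical.

```python
# Minimal federated-averaging sketch (illustrative only; not the NVIDIA
# FLARE API -- all names here are hypothetical). Each site trains on its
# private data and returns only updated weights; the server averages them.

def local_update(weights, private_data, lr=0.1):
    """Toy local training step: nudge each weight toward the site's data
    mean. Stands in for a real gradient step on private data."""
    target = sum(private_data) / len(private_data)
    return [w + lr * (target - w) for w in weights]

def federated_round(global_weights, site_datasets):
    """One federated round: each site updates locally, then the server
    averages the returned weights. Only weights cross the network."""
    site_weights = [local_update(list(global_weights), d) for d in site_datasets]
    n_sites = len(site_weights)
    return [
        sum(ws[i] for ws in site_weights) / n_sites
        for i in range(len(global_weights))
    ]

# Two sites with private datasets; the server never sees the data itself.
w = [0.0]
for _ in range(20):
    w = federated_round(w, [[1.0], [3.0]])
print(round(w[0], 3))  # converges toward the cross-site mean of 2.0
```

The privacy property is structural: the aggregator's only inputs are weight vectors, which is why the article can describe partner access to models trained on proprietary Lilly data without that data ever being shared.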

According to Lilly, the supercomputer addresses a key limitation in traditional drug discovery, which is constrained by the physical capacity of laboratory experiments. Yue Wang Webster, vice president of research and development informatics at Lilly, explained that the system allows scientists to simulate and evaluate billions of molecular hypotheses in a “dry lab” environment before committing to physical experiments, effectively breaking the “physical limit” of traditional research. Lilly employees can also use LillyPod to build chatbots, agentic workflows, and research lab agents.

“This machine is exactly how AI should be used,” said Fuchs. “It should be used for science. It should be used to lessen suffering and improve the human condition.”

Lilly will present further details about its collaboration with NVIDIA and a planned co-innovation AI lab at the upcoming NVIDIA GTC conference.

Technology

NVIDIA DGX Spark & DGX Station Power Open-Source and Frontier AI Models on Desktop

by Rachel Kim – Technology Editor January 27, 2026

Key Takeaways: NVIDIA DGX Spark & the Rise of Local AI

This article highlights the growing trend of local AI development and positions NVIDIA DGX Spark as a key enabler. Here's a breakdown of the main points:

* Shift to Local AI: There is increasing demand for secure, high-performance AI processing at the edge (on desktops and local machines) rather than relying solely on centralized cloud infrastructure.
* DGX Spark's Role: DGX Spark is designed to power this shift, facilitating local inference, agentic workflows, and retrieval-augmented generation (RAG) without the complexities of cloud setups.
* Industry Validation: Major players are adopting DGX Spark:
    * Hugging Face demonstrates interactive AI agents with the Reachy Mini robot, powered by DGX Spark, allowing for real-world interaction, and has released a guide for building these agents.
    * IBM highlights the "openrag on Spark" solution, providing a complete RAG stack locally.
    * JetBrains offers petaflop-class AI performance to its customers, supporting various deployment models (cloud, on-premise, hybrid).
    * TRINITY, a micromobility vehicle, uses DGX Spark as its AI brain for real-time vision language model workloads; will.i.am describes it as "brains on wheels."
* Benefits of Local AI (enabled by DGX Spark):
    * Faster iteration: quicker development cycles.
    * Greater control: enhanced control over data and intellectual property.
    * New interactive experiences: more engaging and responsive AI applications.
* Developer Resources: NVIDIA is providing DGX Spark playbooks to help developers quickly start AI projects, with new and updated resources released at CES. These cover topics such as NVIDIA Nemotron 3 Nano, robotics, vision language models, and fine-tuning.

In essence, the article paints DGX Spark as a crucial component in democratizing AI development by bringing powerful AI capabilities directly to developers' desktops and enabling innovative applications across various industries.
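The local RAG workflow described above can be illustrated with a toy pipeline: index documents, retrieve the most relevant one for a query, and assemble a grounded prompt for a locally hosted model to complete. This bag-of-words sketch shows the general technique only; it is not IBM's "openrag on Spark" stack, and all names are illustrative.

```python
# Toy local RAG (retrieval-augmented generation) pipeline: a bag-of-words
# retriever plus prompt assembly. Illustrates the general technique only;
# this is not IBM's "openrag on Spark" stack, and all names are made up.
import math
from collections import Counter

def embed(text):
    """'Embed' text as a bag-of-words term-frequency vector."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, k=1):
    """Return the k documents most similar to the query."""
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

def build_prompt(query, docs):
    """Assemble a grounded prompt: retrieved context plus the question.
    A model running locally on the workstation would complete this."""
    context = "\n".join(retrieve(query, docs))
    return f"Context:\n{context}\n\nQuestion: {query}"

docs = [
    "DGX Spark runs local inference and agentic workflows on the desktop.",
    "The cafeteria menu changes every Tuesday.",
]
print(build_prompt("What runs local inference?", docs))
```

A production stack would swap the term-frequency vectors for learned embeddings and a vector index, but the shape of the pipeline, and the reason it can run entirely on one machine, is the same.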

Technology

NVIDIA DGX SuperPOD Sets the Stage for Rubin-Based Systems

by Rachel Kim – Technology Editor January 26, 2026

NVIDIA Rubin Platform: Key Features & Components – Summary

Here's a breakdown of the key features and components of the NVIDIA Rubin platform, based on the provided text:

Core Innovations of the Rubin Platform:

* Rubin GPU: The foundation of the platform, offering significant performance improvements.
* NVLink 4: Next-generation NVLink providing 1.8 TB/s bidirectional bandwidth.
* Fifth-Generation NVLink: Enables GPU-to-GPU and GPU-to-CPU interaction.
* Second-Generation Transformer Engine: Accelerates large language model (LLM) training and inference with features like FP8 precision and accelerated compression.
* Third-Generation NVIDIA Confidential Computing: Secures data across CPU, GPU, and NVLink domains.
* Second-Generation RAS Engine: Provides real-time health monitoring, fault tolerance, and proactive maintenance.

DGX SuperPOD Components (Scale-Out):

* DGX Vera Rubin NVL72 or DGX Rubin NVL8 systems (the core compute units)
* NVIDIA BlueField-4 DPUs: For secure, software-defined infrastructure.
* NVIDIA Inference Context Memory Storage Platform: For next-generation inference.
* NVIDIA ConnectX-9 SuperNICs: High-performance network interface cards.
* NVIDIA Quantum-X800 InfiniBand & NVIDIA Spectrum-X Ethernet: Networking solutions.
* NVIDIA Mission Control: Automated AI infrastructure orchestration and operations.

Specific System Details:

* DGX Vera Rubin NVL72:
    * 576 Rubin GPUs (8 systems unified)
    * 28.8 exaflops of FP4 performance
    * 600 TB of fast memory
    * 36 Vera CPUs, 72 Rubin GPUs, and 18 BlueField-4 DPUs per system
    * 260 TB/s aggregate NVLink throughput
* DGX Rubin NVL8:
    * 512 Rubin GPUs (64 systems)
    * 5.5x NVFP4 FLOPS compared to Blackwell systems
    * Liquid-cooled, x86 CPU-based
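As a back-of-envelope check, the NVL72 totals quoted above (28.8 exaflops of FP4 and 600 TB of fast memory across 576 GPUs) imply the following per-GPU figures. These are derived arithmetic only, not published per-GPU specifications.

```python
# Per-GPU figures implied by the article's DGX Vera Rubin NVL72 totals.
# Derived arithmetic only -- not official per-GPU specifications.
TOTAL_FP4_EXAFLOPS = 28.8   # "28.8 exaflops of FP4 performance"
TOTAL_FAST_MEMORY_TB = 600  # "600 TB of fast memory"
GPU_COUNT = 576             # "576 Rubin GPUs (8 systems unified)"

per_gpu_petaflops = TOTAL_FP4_EXAFLOPS * 1000 / GPU_COUNT  # 1 exaflop = 1000 petaflops
per_gpu_memory_gb = TOTAL_FAST_MEMORY_TB * 1000 / GPU_COUNT

print(f"{per_gpu_petaflops:.1f} PF of FP4 per GPU")          # 50.0 PF
print(f"{per_gpu_memory_gb:.0f} GB of fast memory per GPU")  # ~1042 GB
```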

Networking for AI Factories:

* NVIDIA Spectrum-6 Ethernet switches
* NVIDIA Quantum-X800 InfiniBand switches
* BlueField-4 DPUs
* ConnectX-9 SuperNICs

Key Benefit:

* Up to 10x reduction in inference token cost compared to the previous generation.

In essence, the Rubin platform is designed to be a highly scalable, secure, and efficient infrastructure for running demanding AI workloads, notably large language models. It focuses on eliminating bottlenecks in compute, memory, and networking to create a true "AI factory."

January 26, 2026 0 comments
0 FacebookTwitterPinterestEmail

January 26, 2026