Nvidia Launches Robot Foundation Models and Edge Hardware to Dominate Generalist Robotics
By admin | Jan 05, 2026 | 3 min read
At CES 2026, Nvidia unveiled a comprehensive suite of robot foundation models, simulation tools, and edge hardware, signaling its ambition to become the foundational platform for general-purpose robotics, much as Android is for smartphones. The company's expansion into robotics aligns with a wider industry trend in which artificial intelligence is moving from the cloud into physical machines, a shift enabled by more affordable sensors, sophisticated simulation technology, and AI models that are increasingly able to generalize across tasks.
On Monday, the company detailed its full-stack ecosystem for physical AI. This includes new open foundation models designed to let robots reason, plan, and adapt across many tasks and environments, moving beyond single-purpose automation; all of the models are available on Hugging Face. The specific releases are Cosmos Transfer 2.5 and Cosmos Predict 2.5, world models for generating synthetic data and evaluating robot policies in simulation; Cosmos Reason 2, a reasoning vision-language model (VLM) that lets AI systems perceive, understand, and interact with the physical world; and Isaac GR00T N1.6, a next-generation vision-language-action (VLA) model built specifically for humanoid robots. GR00T uses Cosmos Reason as its core intelligence, enabling whole-body control so humanoids can move and manipulate objects at the same time.
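As a rough, unofficial sketch of what "available on Hugging Face" means in practice, the models can be fetched with the standard huggingface_hub client, as below. The repository ID is a hypothetical placeholder, not a confirmed name; check Nvidia's Hugging Face organization for the actual Cosmos and GR00T repositories.

```python
# Minimal sketch: downloading one of the newly announced open models from Hugging Face.
# The repo ID below is a hypothetical placeholder for illustration only; the exact
# repository names for the Cosmos and GR00T releases may differ.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="nvidia/GR00T-N1.6",   # hypothetical repo ID, not confirmed
    local_dir="./groot-n1.6",      # where model weights and configs are stored locally
)
print(f"Model files downloaded to {local_dir}")
```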
Also introduced at CES was Isaac Lab-Arena, an open-source simulation framework hosted on GitHub. This serves as another key component of Nvidia's physical AI platform, allowing for the safe virtual testing of robotic capabilities. The platform aims to solve a major industry challenge: as robots learn more complex skills, from delicate object handling to intricate cable installation, validating these abilities in the real world is often expensive, slow, and hazardous. Isaac Lab-Arena addresses this by bringing together resources, task scenarios, training tools, and established benchmarks like Libero, RoboCasa, and RoboTwin into a single, unified standard where none previously existed.
Supporting this entire ecosystem is Nvidia OSMO, an open-source command center that acts as connective infrastructure, integrating the complete workflow from data generation through model training across both desktop and cloud environments. On the hardware side, the new Blackwell-based Jetson T4000 is the latest addition to the Thor family of compute modules. Nvidia positions it as a cost-effective on-device computing upgrade, delivering 1,200 teraflops of AI performance and 64 gigabytes of memory while operating at just 40 to 70 watts.
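For context on those figures, a quick back-of-envelope calculation using only the numbers quoted above gives the implied efficiency range:

```python
# Back-of-envelope efficiency from the quoted specs: 1200 TFLOPS at 40-70 W.
ai_tflops = 1200          # quoted AI performance in teraflops
power_watts = (40, 70)    # quoted operating power range in watts

for w in power_watts:
    print(f"At {w} W: ~{ai_tflops / w:.0f} TFLOPS per watt")
# Roughly 17-30 TFLOPS per watt across the quoted power envelope.
```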
Furthermore, Nvidia is strengthening its collaboration with Hugging Face to democratize robot training, removing the need for expensive hardware or specialized expertise. The partnership integrates Nvidia's Isaac and GR00T technologies into Hugging Face's LeRobot framework, effectively connecting Nvidia's community of 2 million robotics developers with Hugging Face's 13 million AI builders. The open-source Reachy 2 humanoid, available through the platform, now works directly with Nvidia's Jetson Thor chip, allowing developers to experiment with various AI models without being locked into proprietary systems.
The overarching strategy is clear: Nvidia wants to make robotics development more accessible and to become the essential hardware and software provider powering the industry. Early indicators suggest the approach is gaining traction. Robotics is currently the fastest-growing category on Hugging Face, with Nvidia's models leading in downloads, and major robotics firms, including Boston Dynamics, Caterpillar, Franka Robotics, and NEURA Robotics, are already using Nvidia's technology.
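Claims like these can be spot-checked against the public Hugging Face Hub API; the sketch below lists Nvidia's robotics-tagged models ranked by downloads. The "robotics" pipeline tag is an assumption about how these repositories are categorized.

```python
# Sketch: listing Nvidia's robotics models on the Hugging Face Hub, ranked by downloads.
# The "robotics" pipeline tag is an assumption about how these repos are labeled.
from huggingface_hub import HfApi

api = HfApi()
models = api.list_models(
    author="nvidia",          # Nvidia's Hugging Face organization
    pipeline_tag="robotics",  # assumed category tag for robotics models
    sort="downloads",         # rank by total downloads
    direction=-1,             # descending order
    limit=10,
)
for m in models:
    print(m.id, m.downloads)
```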