Nvidia has started working on advanced humanoid robots

Nvidia has launched its Humanoid Robot Developer program, which will provide a range of services, models, and computing platforms for developing advanced humanoid robots.

Nvidia recently announced that it is providing a range of services, models, and computing platforms to the world’s leading robot manufacturers, AI model developers, and software producers to accelerate the development, training, and construction of next-generation humanoid robots.

Nvidia’s role in humanoid robotics

Nvidia will offer these tools through its Humanoid Robot Developer program. The tools provided by the company include new Nvidia NIM microservices and frameworks for robot simulation and learning, the Nvidia OSMO orchestration service for running multi-stage robotic workloads, and an AI- and Omniverse-enabled teleoperation workflow that allows developers to train robots using small amounts of human demonstration data.

Thanks to NIM microservices, developers will be able to cut deployment times from weeks to minutes. The company will offer two such microservices to support robot simulation and training workflows. The MimicGen NIM microservice will generate synthetic motion data from teleoperation data recorded with spatial computing devices such as the Apple Vision Pro. The Robocasa NIM microservice will create robot tasks and simulation-ready environments in OpenUSD, a universal framework for development and collaboration in 3D environments.
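
To make the idea behind MimicGen-style data generation concrete, the Python sketch below shows one way a single recorded end-effector trajectory could be expanded into many synthetic variants by retargeting it to randomized goals and adding small perturbations. This is a conceptual illustration only, not the actual NIM interface, which Nvidia has not detailed here; all function names and values are hypothetical.

```python
# Conceptual sketch only -- this is NOT the MimicGen NIM API. It illustrates the
# general idea: expand a handful of teleoperated demonstrations into a much
# larger set of synthetic motion trajectories by retargeting and perturbing them.
import numpy as np

rng = np.random.default_rng(0)

def augment_demo(demo_xyz: np.ndarray, new_goal: np.ndarray,
                 noise_scale: float = 0.005) -> np.ndarray:
    """Blend a recorded end-effector path toward a new goal position and add
    small smooth noise, producing one synthetic variant of the demonstration."""
    offset = new_goal - demo_xyz[-1]                      # distance from the demo's endpoint to the new goal
    blend = np.linspace(0.0, 1.0, len(demo_xyz))[:, None]
    shifted = demo_xyz + blend * offset                   # gradually steer the path toward the new goal
    noise = np.cumsum(rng.normal(0.0, noise_scale, demo_xyz.shape), axis=0)
    return shifted + noise

# One 50-step teleoperated reach demo (x, y, z positions), e.g. as captured
# from a spatial computing headset. The values here are placeholders.
demo = np.linspace([0.0, 0.0, 0.2], [0.4, 0.1, 0.05], num=50)

# Turn the single demo into 1,000 synthetic trajectories for randomized goals.
goals = rng.uniform(low=[0.3, -0.1, 0.0], high=[0.5, 0.2, 0.1], size=(1000, 3))
synthetic_dataset = np.stack([augment_demo(demo, g) for g in goals])
print(synthetic_dataset.shape)  # (1000, 50, 3)
```

The appeal of this kind of workflow is the ratio: a handful of teleoperated demonstrations can seed thousands of training trajectories, which matters when human demonstration time is the bottleneck.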

Nvidia OSMO will be a cloud-native managed service that allows users to orchestrate and scale complex robotic development workflows across distributed computing resources, whether on-premises or in the cloud. This will significantly simplify robot training and simulation workflows, reducing deployment and development cycle times from months to a week.
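
As a purely illustrative sketch of what a multi-stage robotics workflow looks like, the snippet below chains data generation, training, and simulation validation stages that each consume the previous stage's output. It does not use OSMO's actual API; the stage names and functions are hypothetical placeholders.

```python
# Purely illustrative sketch -- not OSMO's actual API. It only models the shape
# of a multi-stage robot development workflow (data generation -> training ->
# simulation validation) where each stage consumes the previous stage's output.
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class Stage:
    name: str
    run: Callable[[Dict], Dict]  # takes the shared context, returns new results

def generate_data(ctx: Dict) -> Dict:
    return {"dataset": f"synthetic trajectories from {ctx['num_demos']} demos"}

def train_policy(ctx: Dict) -> Dict:
    return {"policy": f"policy trained on {ctx['dataset']}"}

def evaluate_in_sim(ctx: Dict) -> Dict:
    return {"report": f"simulation results for {ctx['policy']}"}

pipeline = [Stage("generate", generate_data),
            Stage("train", train_policy),
            Stage("evaluate", evaluate_in_sim)]

context: Dict = {"num_demos": 10}
for stage in pipeline:
    # A managed orchestration service would schedule each stage on separate
    # on-premises or cloud GPU resources; here they simply run in order.
    context.update(stage.run(context))
    print(f"finished stage: {stage.name}")

print(context["report"])
```

In a managed service like the one described above, each of these stages would be dispatched to distributed compute rather than run sequentially in a single process.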

Shortening development and training processes

Humanoid robot models must be trained on vast amounts of data, and teleoperation, in which human movements are captured and transferred to the robot, is widely used to collect it. As humanoid robots advance, however, the amount and cost of teleoperation required keep growing. In this context, Nvidia’s AI- and Omniverse-enabled teleoperation workflow offers researchers and AI developers a significant solution: it generates large amounts of synthetic motion and perception data from a minimal amount of remotely captured human demonstration.

There are already companies using Nvidia’s platforms. Fourier, a general-purpose robot platform company, uses simulation technology to generate training data synthetically. Additionally, Nvidia states that developers will gain early access to the latest versions of Nvidia Isaac Sim, Nvidia Isaac Lab, Jetson Thor, and Project GR00T general-purpose humanoid base models through the Humanoid Robot Developer program. According to the announcement, the first participants in the early access program include 1x, Boston Dynamics, ByteDance Research, Field AI, Figure, Fourier, Galbot, LimX Dynamics, Mentee, Neura Robotics, RobotEra, and Skild AI.
