DeepMind's latest AI model can help robots fold origami and close Ziploc bags

Since its debut at the end of last year, Gemini 2.0 has gone on to power a handful of Google products, including a new AI Mode chatbot. Now Google DeepMind is using that same technology for something altogether more interesting. On Wednesday, the AI lab announced two new Gemini-based models it says will "lay the foundation for a new generation of helpful robots."

The first, Gemini Robotics, was designed by DeepMind to facilitate direct control of robots. According to the company, AI systems for robots need to excel at three qualities: generality, interactivity and dexterity.

Generality covers a robot's flexibility to adapt to novel situations, including ones not covered by its training. Interactivity, meanwhile, encapsulates a robot's ability to respond to people and its environment. Finally, there's dexterity, which is mostly self-explanatory: many tasks humans can complete without a second thought involve fine motor skills that are difficult for robots to master.

"While our previous work demonstrated progress in these areas, Gemini Robotics represents a substantial step in performance on all three axes, getting us closer to truly general purpose robots," says DeepMind.

For instance, with Gemini Robotics powering it, DeepMind's ALOHA 2 robot can fold origami and close a Ziploc bag. The two-armed robot also understands instructions given to it in natural, everyday language. As the video Google shared demonstrates, it can even complete tasks despite encountering roadblocks, such as when a researcher moves the Tupperware container he just asked the robot to place fruit inside.

Google is partnering with Apptronik, the company behind the Apollo bipedal robot, to build the next generation of humanoid robots. At the same time, DeepMind is releasing a second model, Gemini Robotics-ER (short for embodied reasoning), which the company says will enable roboticists to run their own programs using Gemini's advanced reasoning abilities. DeepMind is giving "trusted testers," including one-time Google subsidiary Boston Dynamics, access to the system.

This article originally appeared on Engadget at https://www.engadget.com/ai/deepminds-latest-ai-model-can-help-robots-fold-origami-and-close-ziploc-bags-151455249.html?src=rss