Google is teaching robots to think for themselves

I’m standing in a kitchenette at a Google office in Mountain View, California, observing a robot at work. It’s staring at items on a counter: bubbly water, a bag of whole-grain chips, an energy drink, a protein bar. After what seems like forever, it extends its arm, grabs the chips, rolls a few yards away, and drops the bag in front of a Google employee. “I am done,” it declares.

This snack delivery, which the bot performed during a recent press briefing, might not seem like a particularly amazing feat of robotics, but it’s an example of the progress Google is making in teaching robots how to be helpful—not by programming them to perform a set of well-defined tasks, but by giving them a broader understanding of what humans might ask for and how to respond. That’s a far more demanding AI challenge than a smartphone assistant such as the Google Assistant responding to a limited, carefully worded set of commands.


The robot in question has a tubular white body, a grasping mechanism on the end of its single arm, and wheels. Its cameras, positioned where we have eyes, lend it a certain anthropomorphic quality, but mostly it seems engineered for practical functionality. It was created by Everyday Robots, a unit of Google’s parent company Alphabet. Google has been collaborating with its robot-centric corporate sibling on the software side of the challenge of making robots useful. This research is still early and experimental; along with tasks such as finding and fetching items, it also includes training bots to play ping-pong and catch racquetballs.

A squadron of Everyday Robots bots hanging out at Google’s robotics research lab. [Photo: Harry McCracken]
And now Google is sharing news about its latest milestone in robot software research: a system called PaLM-SayCan, which connects its large language model PaLM (short for “Pathways Language Model”) to a robot’s physical skills. Developed in collaboration with Everyday Robots, the software gives the company’s robots a broader understanding of the world that helps them respond to human requests such as, “Bring me a snack and something to wash it down with” and, “I spilled my drink, can you help?” That requires understanding spoken or typed statements, teasing out the ultimate goal, breaking it into steps, and accomplishing those steps using whatever skills a particular robot has.

Google’s software breaks down human requests into component parts and decides what steps to take given a particular robot’s capabilities. [Image courtesy of Google]
According to Google, its current PaLM-SayCan research is the first time that robots have had access to a large-scale language model. Compared to previous software, the company says, PaLM-SayCan makes robots 14% better at planning jobs and 13% better at successfully completing them. Google has also seen a 26% improvement in robots’ ability to plan tasks involving eight or more steps—such as responding to “I left out a soda, an apple, and water. Can you throw them away and then bring me a sponge to wipe the table?”
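Under the hood, the approach Google has described works by scoring each of a robot’s pretrained skills in two ways at every step: the language model estimates how useful a given skill would be toward the overall request (the “say” part), while a learned value function estimates how likely that skill is to succeed given what the robot currently sees (the “can” part). The robot then executes whichever skill scores highest on the product of the two. Below is a minimal Python sketch of that selection loop; the function names and interfaces are illustrative assumptions, not Google’s actual code.

    from typing import Callable, List

    def pick_next_skill(
        instruction: str,
        history: List[str],
        skills: List[str],
        say: Callable[[str, List[str], str], float],  # LLM: how useful is this skill as the next step?
        can: Callable[[str], float],                  # value function: how likely is it to succeed right now?
    ) -> str:
        """Choose the skill maximizing usefulness ("say") times feasibility ("can")."""
        return max(skills, key=lambda s: say(instruction, history, s) * can(s))

    def plan(instruction: str, skills: List[str], say, can, max_steps: int = 10) -> List[str]:
        """Greedily chain skills until the planner picks the terminal "done" skill."""
        history: List[str] = []
        for _ in range(max_steps):
            skill = pick_next_skill(instruction, history, skills, say, can)
            history.append(skill)
            if skill == "done":
                break
        return history

In the kitchenette demo, for instance, “pick up the chips” would presumably score well on both measures in response to a snack request, while “pick up the sponge” might be just as feasible but far less useful, so the chips win.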
For you or me, choosing from several snacks on a counter is easy. For robots, it remains an achievement of note. [Photo: Courtesy of Google]

Not coming soon to a home near you

Though Everyday Robots’ bots have been performing useful work such as sorting trash at Google offices for a while now, the whole effort is still about learning how to teach the bots to teach themselves. In the demos we saw at the recent press briefing, the robot performed its snack-retrieval duties so slowly and methodically that you could practically see the wheels whirring in its head as it figured out the job step by step. As for the ping-pong and racquetball research, it’s not that Google sees a market for athletic robots; rather, these activities require both speed and precision, making them good proxies for all kinds of actions that robots will need to learn how to handle.


Google’s emphasis on robotic ambition over getting something on the market right away stands in contrast to the strategy followed by Amazon, which is already selling Astro, a $999 home robot, on an invite-only basis. In its current form, Astro only does a few things and doesn’t amount to that much more than an Alexa/security camera gadget on wheels; when my colleague Jared Newman tried one at home, he struggled to find uses for it.

Google Research robotics lead Vincent Vanhoucke told me that the company isn’t yet at the point where it’s trying to develop a robot for commercial release. “Google tries to be a company that focuses on providing access to information, helping people with tasks in their daily lives,” he says. “You could imagine a ton of overlap between Google’s overarching mission and what we’re doing in terms of more concrete goals. I think we’re really at the level of providing capabilities, and trying to understand what capabilities we can provide. It’s still a quest of ‘What are the things that the robot can do? And can we broaden our imagination about what’s possible?’”

In other words, don’t assume you’ll be able to buy a Google robot anytime soon—but stay tuned.

https://www.fastcompany.com/90779334/google-robot-research-everyday-robots-palm-saycan
