November 14, 2024


In 2013, the American robotics company Boston Dynamics unveiled its new robot, Atlas. Revealed ahead of the Darpa Robotics Challenge, the 6ft 2in humanoid could walk on uneven ground, jump off boxes and even climb stairs. It was like a vision often depicted in fiction: a robot designed to work like us, capable of all kinds of everyday tasks. It seemed like the dawn of something. Robots were going to do all our boring and laborious tasks and act as elderly care workers to boot.

Since then, we’ve seen leaps forward in artificial intelligence (AI), from computer vision to machine learning. The recent wave of large language models and generative AI systems opens up new opportunities for human-computer interaction. But outside of research labs, physical robots remain largely confined to factories and warehouses, performing very specific tasks, often behind a safety cage. Home robots are limited to vacuum cleaners and lawnmowers – not exactly Rosie the robot.

“Robotic bodies haven’t evolved significantly since the 1950s,” says Jenny Read, director of the robotics program at the Advanced Research and Invention Agency (Aria), the UK government’s research and development body, which was set up last year. “I’m not saying there hasn’t been any progress, but if you look at what’s happened in computers and software, it’s really striking how little there has been.”

Developing a robot simply requires more resources, says Nathan Lepora, a professor of robotics and AI at the University of Bristol. A talented individual with a computer can write an algorithm, but building a robot requires access to the physical device. “It’s a lot slower, and it’s a lot harder,” he says. “This is essentially why robotics lags behind AI.”

Tesla’s Optimus folds a shirt. Photo: @elonmusk/X

Research labs and companies are hoping to bridge this gap, with a slew of new humanoid robots in development and some starting to hit the market. Boston Dynamics retired its original hydraulic Atlas model in April and unveiled a new, electric version that it aims to commercialize in the next few years and will begin testing in Hyundai factories next year. Oregon-based Agility Robotics claims its Digit robot is the first humanoid to actually be paid for a job: moving boxes in a logistics facility. Elon Musk is pushing for Tesla’s humanoid robot, known as Optimus or Tesla Bot, to start working in his car factories next year.

But there is still a long way to go before we see robots operating outside of tightly controlled environments. Advances in AI can only take us so far with current hardware, says Read – and for many tasks, a robot’s physical capabilities are critical. Generative AI systems can write poetry or make pictures, but they can’t do the dirty and dangerous tasks we most want to automate. For those, you need more than a brain in a box.


A useful robot design often starts with hands. “A lot of the use cases for robots really depend on them being able to handle things precisely and skillfully without damaging the object,” says Read. People are very good at this. We can instinctively switch from lifting a dumbbell to handling an eggshell, or from chopping a carrot to stirring a sauce. We also have excellent tactile sense, demonstrated by our ability to read Braille. In comparison, robots struggle. Read’s Aria program, backed by £57m of funding, is focused on this problem.

One of the challenges of robotics is scale, says Rich Walker, director of London-based Shadow Robot. In the company’s office in Camden, he shows me the Shadow Dexterous Hand. It is the size of a man’s hand, with four fingers and a thumb, and joints imitating knuckles. But while the digits look fancy, the hand is attached to a robotic arm much wider than a human forearm, packed with electronics, cables, actuators and everything else needed to operate the hand. “It’s a packaging problem,” Walker says.

An advantage of a human-scale hand is that it is the right size and shape to handle human tools. Walker gives the example of a laboratory pipette, which he modified with Sugru, a moldable glue, to make it more ergonomic. You can attach a pipette tool directly to a robot hand, but then it will only be able to use a pipette and not, for example, scissors or a screwdriver.

But a completely human-like hand is not the best for every task. Shadow Robot’s most recent hand, DEX-EE, looks quite alien. It has three digits, more like thumbs than fingers, that are notably larger than a human’s and covered in tactile sensors. The company designed it in collaboration with Google DeepMind, Alphabet’s AI research lab, which wanted a robotic hand that could learn how to pick things up by repeatedly trying to do so – a trial-and-error approach known as reinforcement learning. But this posed challenges: robot hands are usually designed expressly not to bump into things, and tend to break if they do. Murilo Martins, a DeepMind research engineer, says that when he ran experiments with the original Dexterous Hand, “every half hour a tendon would break”.
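To make the trial-and-error idea concrete, here is a toy sketch of reinforcement learning on a single grasping decision. This is purely illustrative and not DeepMind’s actual method or code: the “hand” must discover, from reward alone, a grip force strong enough to hold an object but gentle enough not to crush it. All names and numbers are invented for the example.

```python
import random

# Toy stand-in for trial-and-error grasp learning: the "hand" picks one of
# five grip-force levels; too little drops the object, too much crushes it.
# The only feedback is the reward from each attempt.
FORCES = [0, 1, 2, 3, 4]

def grasp_reward(force):
    """Reward for a single grasp attempt at a given force level."""
    if force < 2:
        return 0.0   # object slips out of the grip
    if force > 3:
        return -1.0  # object (or hand) breaks
    return 1.0       # secure grasp

def train(episodes=2000, epsilon=0.1, lr=0.1, seed=0):
    rng = random.Random(seed)
    q = {f: 0.0 for f in FORCES}  # estimated value of each force level
    for _ in range(episodes):
        # epsilon-greedy: mostly exploit the best-known force, sometimes explore
        if rng.random() < epsilon:
            force = rng.choice(FORCES)
        else:
            force = max(q, key=q.get)
        reward = grasp_reward(force)
        q[force] += lr * (reward - q[force])  # incremental value update
    return q

q = train()
print(max(q, key=q.get))  # a mid-range force wins after enough trials
```

The point of the sketch is the cost Martins describes: every exploratory attempt here is free, but on real hardware each “episode” risks a broken tendon, which is exactly why robustness matters.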

DEX-EE prioritizes robustness: in a promotional video, its three digits happily open and close while being hit with a hammer. Its larger size accommodates larger pulleys, which put less stress on the wire tendons, meaning it can work reliably for at least 300 hours.

Boston Dynamics’ hydraulic humanoid Atlas. Photo: Darpa

Still, says Maria Bauza, a DeepMind research scientist, time with the robot is precious. Last week, DeepMind published research on a new training method it calls DemoStart. It takes the same trial-and-error approach, but starts by using a simulated robotic hand instead of a real one. After training the simulated hand to complete tasks such as tightening a nut and bolt, the researchers transferred this learned behavior to the real DEX-EE hand. “The hands have gone through thousands and thousands of experiments,” says Bauza. “It’s just that we don’t make them start over.”

This reduces the time and cost of conducting experiments, making it easier to train robots that can adapt to different tasks. However, the skills are not always transferred perfectly; while DeepMind’s simulated robotic hand was able to place a plug in a socket 99.6% of the time, the real hand managed to do so only 64% of the time.
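The drop from 99.6% to 64% is the so-called sim-to-real gap. A minimal sketch of why it happens, with made-up numbers and nothing drawn from DeepMind’s system: a policy tuned in a perfectly deterministic simulator meets real hardware whose actuation is noisy, and the same commanded action succeeds far less often.

```python
import random

# Toy illustration of the sim-to-real gap: the simulator applies commanded
# forces exactly, while the "real" hand applies them with actuation noise.
rng = random.Random(1)

def sim_success(force):
    return 2 <= force <= 3                 # idealised, deterministic physics

def real_success(force):
    actuated = force + rng.gauss(0, 1.0)   # imperfect real-world actuation
    return 2 <= actuated <= 3

# A policy "learned" entirely in simulation: always command force 2.5,
# the centre of the success window.
policy_force = 2.5

trials = 10_000
sim_rate = sum(sim_success(policy_force) for _ in range(trials)) / trials
real_rate = sum(real_success(policy_force) for _ in range(trials)) / trials
print(f"sim: {sim_rate:.0%}  real: {real_rate:.0%}")
```

In the sketch the simulated success rate is perfect while the noisy version lands well below it; techniques like randomising the simulator’s physics during training are one common way practitioners narrow this gap.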

The work is an example of how developments in AI and robotic bodies go hand-in-hand. Only through physical interactions can robots truly make sense of their environment. Read points out that the large language models behind text generators such as ChatGPT are, after all, trained on a large corpus of human language shared on the internet, “but where do I get the data about how it feels to pick a strawberry or wrap a sandwich?”

As the DeepMind robotics team write: “A large language model can tell you how to tighten a bolt or tie your shoes, but even if it were embodied in a robot, it wouldn’t be able to perform those tasks by itself.”

Martins goes a step further. He believes robotics is critical to achieving artificial general intelligence (AGI), the broad, human-equivalent intelligence that many AI researchers dream of. He argues that an AI can only really understand our world if it has a physical form. “To me, AGI doesn’t exist without embodiment, much in the same way that human intelligence doesn’t exist without our own bodies,” he says.


Hands, while important, are just one body part. While Shadow Robot and others focus on fingers, an increasing number of companies and labs are developing full humanoids.

The appeal of humanoids may be partly psychological. “This is the robot we’ve all been waiting for – it’s like C-3PO,” says Walker. But there is also a logic to using the human form as a muse. “We’ve designed all of our environments around people,” says Jonathan Hurst, Agility Robotics’ co-founder and chief robotics officer. “So having a more or less human form factor is a very good way to be able to move and manipulate and co-exist with humans.”

But a humanoid may not be the best design for every job. A wheeled robot can go anywhere a wheelchair user can, and when it comes to more difficult terrain, four legs may be better than two. Boston Dynamics’ canine Spot can scramble over rough ground or stairs and self-right if it falls – something bipedal robots struggle with. “Just because a humanoid robot takes a similar shape to a human doesn’t mean it has to move that way and be limited by the limitations of our joints,” adds a Boston Dynamics spokesperson via email.

Agility Robotics’ Digit. Photo: agilityrobotics.com

For now, humanoids are still finding their feet. Flashy videos and sleek designs can give people an unrealistic sense of how competent or trustworthy they are, says the University of Bristol’s Lepora. Boston Dynamics’ videos are impressive, but the company is also known for its blooper reels showing its robots failing. In January, Musk shared a video of Optimus folding a shirt – but sharp-eyed viewers spotted telltale signs that the robot was being teleoperated.

A major challenge in bringing robots out of laboratories and industrial environments and into homes or public spaces is safety. In June, the Institute of Electrical and Electronics Engineers (IEEE) launched a study group to examine standards specifically for humanoid robots. Aaron Prather, the group’s chairman, explains that a humanoid in a shared space is a different proposition than an industrial robot encased in a protective cage. “It’s one thing for them to interact with a co-worker at an Amazon facility or a Ford factory because that’s a trained worker working with that robot,” he says. “[But if] I put that robot out in the public park, how will it interact with children? How is it going to deal with people who don’t understand what’s going on?”

Hurst envisages robots in the retail sector as a next step, stocking shelves or working in back rooms. Prather believes we will soon see robots waiting on tables. However, for many applications it may not make financial sense to use a robot. Walker gives the example of a delivery robot. “It has to be cost-effective [compared] with someone on a minimum wage, zero hours contract on an e-scooter,” he says.

Most of the robotics experts I spoke to said a multipurpose home robot – the kind that can wash your dishes, do your laundry and walk your dog – is a ways off. “The era of a usable humanoid is here, but the road to a truly general purpose humanoid robot will be long and difficult and is many years away,” says Boston Dynamics. Care robots, often seen as the solution to aging populations, will be a particularly difficult prospect, says Read. “Let’s get to the point where a robot can reliably take apart a laptop or make you a sandwich, and then we’ll think about how it can take care of an elderly person,” she says. That is, if we even want robots to take on care work. Just like art and poetry, some roles may still be best left to a human touch.


