
Saturday, November 29, 2025

Humanoid Robots Have a Long Way to Go


Some of the best Chinese robots for repeatable hand manipulation include the Unitree G1 and the Kepler Robotics Forerunner K2, which use force-sensitive, high-dexterity hands for tasks like sorting and precision assembly. They are trained through a mix of imitation and reinforcement learning, combined with environment generalization, so that they learn tasks in real-world scenarios.

Best Chinese robots for manipulation

 

    Unitree G1, price from $13,500: This humanoid robot is equipped with a "force control dexterous hand" and uses a combination of imitation and reinforcement learning for its manipulation skills. It has been shown to perform precise tasks such as dealing cards, pulling sheets off a bed, and moving light packages. (https://www.unitree.com/g1)

    Kepler Robotics Forerunner K2, price $30,000: This robot stands out for its superior hand dexterity, featuring up to 11 degrees of freedom in each hand and flexible sensors with 96 contact points in each fingertip. This allows for complex tasks like assembling small components and handling materials in logistics.

    Magic Bot, price $119,800: Developed by Magic Lab, this humanoid robot has hands with six miniature servo actuators and multi-dimensional pressure sensors. It can replicate about 70% of human hand gestures, allowing it to perform tasks from pinching to gripping, including handling fragile or slippery items.

    Dobot Atom, price $27,500: Designed for precision and adaptability, the Dobot Atom humanoid robot is optimized for dexterous manipulation, with precision up to ±0.05 mm.

 

How they are trained

 

    Imitation and reinforcement learning: Robots like the Unitree G1 are trained with imitation learning, where they learn by mimicking recorded human demonstrations, and reinforcement learning, where they refine those skills through trial and error against a reward signal.

    Environment generalization: Robots are placed in fully functioning environments (like a home or factory) to handle a sequence of tasks. This allows them to generalize their skills and adapt to unpredictable, real-world spaces rather than simply memorizing specific actions.

    Machine learning with sensor data: They rely on a combination of vision (cameras for object detection), force/torque sensors in the fingers, and machine learning models to mimic subtle human motions, such as the precise pinching and sliding motions needed for dealing cards.

    Large-scale demonstration: For large-scale industrial deployments, like the one with the Agibot G2 robots in Longer's factories, robots are trained to perform specific, repetitive tasks with high precision, such as material handling and assembly (a code sketch of the overall training recipe follows this list).
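The sketch below is a minimal illustration, in Python with PyTorch, of how the pieces above are commonly wired together: a policy network that fuses camera features with fingertip force readings, trained first by behavior cloning on demonstrations and then fine-tuned with a simple policy-gradient (trial-and-error) update. The dimensions, synthetic data, and reward stub are placeholder assumptions for illustration, not any vendor's actual training code.

# Minimal sketch (placeholder data and shapes, not any vendor's real code):
# a manipulation policy that fuses vision and fingertip-force observations,
# trained by behavior cloning on demonstrations, then fine-tuned with a
# simple REINFORCE-style policy-gradient step.

import torch
import torch.nn as nn

VISION_DIM = 128   # assumed size of features from a camera encoder
FORCE_DIM = 10     # assumed number of fingertip force/torque readings
ACTION_DIM = 12    # assumed number of joint/finger position targets


class ManipulationPolicy(nn.Module):
    """Maps fused vision + force observations to a Gaussian over actions."""

    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Linear(VISION_DIM + FORCE_DIM, 256), nn.ReLU(),
            nn.Linear(256, 256), nn.ReLU(),
        )
        self.mean = nn.Linear(256, ACTION_DIM)
        self.log_std = nn.Parameter(torch.zeros(ACTION_DIM))

    def forward(self, vision, force):
        h = self.backbone(torch.cat([vision, force], dim=-1))
        return torch.distributions.Normal(self.mean(h), self.log_std.exp())


policy = ManipulationPolicy()
optimizer = torch.optim.Adam(policy.parameters(), lr=1e-4)

# Stage 1: imitation (behavior cloning). The tensors below stand in for
# demonstrations collected by teleoperation or motion capture.
demo_vision = torch.randn(512, VISION_DIM)
demo_force = torch.randn(512, FORCE_DIM)
demo_actions = torch.randn(512, ACTION_DIM)

for _ in range(100):
    dist = policy(demo_vision, demo_force)
    bc_loss = -dist.log_prob(demo_actions).mean()  # maximize demo likelihood
    optimizer.zero_grad()
    bc_loss.backward()
    optimizer.step()

# Stage 2: reinforcement-learning fine-tuning (trial and error). The reward
# function is a stub; a real setup would score task outcomes such as
# "the card landed in the right spot" in simulation or on hardware.
def rollout_reward(actions):
    return -actions.pow(2).sum(dim=-1)

for _ in range(100):
    vision = torch.randn(64, VISION_DIM)  # randomized scenes for generalization
    force = torch.randn(64, FORCE_DIM)
    dist = policy(vision, force)
    actions = dist.sample()
    rewards = rollout_reward(actions)
    advantage = rewards - rewards.mean()  # simple baseline
    rl_loss = -(dist.log_prob(actions).sum(dim=-1) * advantage).mean()
    optimizer.zero_grad()
    rl_loss.backward()
    optimizer.step()

In practice, the imitation stage would use teleoperated or motion-captured demonstrations, and the fine-tuning stage would run in a physics simulator or on the real robot with randomized objects, lighting, and layouts, which is what the environment-generalization step above refers to.

In the West, things are more complicated and are discussed with a lot of skepticism: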

 

 

“Humanoid robots are back. Videos of robots folding laundry or brewing espresso have flooded social-media feeds, while investors have poured billions of dollars into development. But before we imagine robot assistants in every home, it's worth remembering that we've been here before.

 

In 2000 Honda introduced Asimo, a humanoid robot that could walk, run and serve drinks. Robotics enthusiasts hailed it as the next logical step after the personal-computer revolution -- a robot for every household. More than two decades later, Asimo sits in a museum, a reminder of how far robotics still has to go.

 

Now the tech world's fascination with humanlike robots has returned -- this time fueled by artificial intelligence and investors eager to deploy capital behind what could be the next trillion-dollar opportunity. Many investors and founders believe that the same concepts that allow computers to write text and have conversations with humans can be adapted to allow humanoids to operate autonomously in warehouses, outdoor spaces or people's homes. In August, China held a humanoid robotics Olympics, where devices competed in soccer, kickboxing and even cleaning rooms. Tesla is investing heavily in the area, through Optimus. Videos on social media show the feats that humanoids can accomplish. And the venture industry has poured nearly $5 billion into humanoid startups that promise to bring down the cost of onshore manufacturing and aim to give millions of Americans a low-cost domestic helper.

 

But unlike many aspects of AI that are truly transformational, the humanoid fascination will ultimately prove to be a parlor trick with few practical applications.

 

From a technical standpoint, robotics problems can be broken down into two main categories: navigation and manipulation. Navigation involves the challenge of a robot getting from point A to point B. This may involve unknown terrain, stairs and obstacles. It could include surfaces that the robots haven't encountered before. AI has made significant advances in the field of navigation. Robots today can be easily and quickly trained to move across a warehouse floor or to navigate a rocky field.

 

As a result, there are widespread applications of robots that primarily perform a navigation task, such as self-driving cars, robots that transport items at Amazon warehouses, Roomba vacuum cleaners, automated forklifts and drones used by first responders. These applications typically use robots that can roll on wheels or fly rather than walk with two legs and feet for the same reasons that it's faster to travel by bike, car or plane than by foot.

 

Manipulation -- "the hands problem," as Elon Musk has called it -- is a far more complex issue to solve. Manipulation involves getting a robot to mimic the motions of human hands, and that's no easy task. Hands are amazing. A small child can figure out how to pick up an object, regardless of its shape, size or center of gravity. Human hands have millions of nerve endings that, when combined with visual cues, help on tasks such as washing dishes, folding clothes or opening a jar, with minimal skill or training. While traditional generative text AI could be trained on all human-written content on the internet, there's no equivalent corpus of training data for these physical tasks, nor is there a way to capture accurately the tactile or haptic feedback our hands receive during any manipulation task.

 

Tesla is using a vision-only training approach for Optimus, which involves filming workers performing tasks and using those videos to train its robots.

 

Meanwhile, robotics startups are turning to platforms like Encord and Micro1 that provide labeled video data to train their AI systems. The large foundation labs including OpenAI are fueling demand for high-quality training data -- often gathered by real people recording themselves doing everyday tasks like folding laundry, loading dishwashers or making coffee. Yet these training attempts provide only a small subset of the data needed. Further, unlike generative AI chatbots, these humanoids must operate in the physical world and deal with changing light, temperature, humidity and other variables.

 

Robots have important roles to play in our future. Industrial companies have been adopting wheeled robots, and police departments are using drones. Automation in general remains critical to ensure that America reshores manufacturing and stays competitive with the rest of the world. In other critical industries, such as national defense, wheeled and flying robots are being used for surveillance, mine detection and offensive capabilities.

 

But tasks requiring any kind of manipulation will remain human-centric for the foreseeable future. Dentists, surgeons, house cleaners, cooks, HVAC and electrical contractors will all be well-protected from the many advances in AI.

 

Humanoid robots make for great demonstration shows and even better headlines. But the real frontier of robotics isn't about building machines that look like us -- it's about designing ones that quietly make our industries more efficient. For now, the most human thing about humanoid robots may be our tendency to overestimate them.

 

---

 

Mr. Agarwal is a partner at Bain Capital Ventures.” [1]

 

1. Agarwal, Ajay. "Humanoid Robots Have a Long Way to Go." Wall Street Journal, Eastern edition; New York, N.Y., 29 Nov 2025: A13.
