With recent advances in artificial intelligence and robotics technology, there is growing interest in developing and marketing household robots capable of handling a variety of domestic chores.
Tesla is building a humanoid robot, which, according to CEO Elon Musk, could be used for cooking meals and helping elderly people. Amazon recently announced plans to acquire iRobot, a prominent robotic vacuum manufacturer, and has been investing heavily in the technology through its Amazon Robotics program to expand robotics to the consumer market. In May 2022, Dyson, a company renowned for its powerful vacuum cleaners, announced plans to build the U.K.'s largest robotics center, devoted to developing household robots that carry out daily domestic tasks in residential spaces.
Despite the growing interest, would-be customers may have to wait a while for these robots to come onto the market. While devices such as smart thermostats and security systems are widely used in homes today, the commercial use of household robots is still in its infancy.
As a robotics researcher, I know firsthand how household robots are considerably more difficult to build than smart digital devices or industrial robots.
Handling objects
One major difference between digital and robotic devices is that household robots need to manipulate objects through physical contact to carry out their tasks. They have to carry the plates, move the chairs and pick up dirty laundry and place it in the washer. These operations require the robot to be able to handle fragile, soft and sometimes heavy objects with irregular shapes.
State-of-the-art AI and machine learning algorithms perform well in simulated environments, but contact with objects in the real world often trips them up. This happens because physical contact is difficult to model and even harder to control. A human can perform these tasks easily, yet significant technical hurdles remain before household robots can handle objects with human-level skill.
Robots have difficulty in two aspects of manipulating objects: control and sensing. Many pick-and-place robot manipulators like those on assembly lines are equipped with a simple gripper or specialized tools dedicated only to certain tasks like grasping and carrying a particular part. They often struggle to manipulate objects with irregular shapes or elastic materials, especially because they lack the efficient force, or haptic, feedback humans are naturally endowed with. Building a general-purpose robot hand with flexible fingers is still technically challenging and expensive.
Traditional robot manipulators also require a stable platform to operate accurately, and their accuracy drops considerably when they are mounted on platforms that move around, particularly over a variety of surfaces. Coordinating locomotion and manipulation in a mobile robot is an open problem that the robotics community needs to solve before broadly capable household robots can make it onto the market.
They like structure
In an assembly line or a warehouse, the environment and sequence of tasks are strictly organized. This allows engineers to preprogram the robot’s movements or use simple methods like QR codes to locate objects or target locations. However, household items are often disorganized and placed randomly.
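To make that contrast concrete, here is a minimal sketch, assuming Python with the OpenCV library, of the kind of fiducial-based shortcut a warehouse robot can rely on: a QR code tells it exactly what an item is and roughly where it sits in the camera frame. The image filename is hypothetical, and a home robot almost never gets a label like this to lean on.

```python
# Minimal sketch: locating a labeled bin in a camera image via a QR code,
# the kind of fiducial shortcut available in a structured warehouse but
# rarely in a cluttered home. Uses OpenCV's built-in QR detector.
import cv2


def locate_target(image_path: str):
    """Return the decoded label and pixel center of a QR code, or None."""
    img = cv2.imread(image_path)
    if img is None:
        raise FileNotFoundError(image_path)
    detector = cv2.QRCodeDetector()
    data, points, _ = detector.detectAndDecode(img)
    if not data or points is None:
        return None  # no fiducial found -- a home robot usually has none
    center = points.reshape(-1, 2).mean(axis=0)  # average of the 4 corners
    return data, (float(center[0]), float(center[1]))


if __name__ == "__main__":
    result = locate_target("warehouse_shelf.jpg")  # hypothetical image file
    print(result or "No QR code detected")
```

In a home, there is no such label on the laundry basket or the coffee mug, so the robot has to recognize and locate objects from raw sensor data alone.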
Home robots must deal with many uncertainties in their workspaces. The robot must first locate and identify the target item among many others. Quite often, it also needs to clear or avoid other obstacles in the workspace to reach the item and perform the given task. All of this requires the robot to have an excellent perception system, efficient navigation skills, and powerful and accurate manipulation capability.
For example, users of robot vacuums know they must remove all small furniture and other obstacles such as cables from the floor, because even the best robot vacuum cannot clear them by itself. Even more challenging, the robot has to operate in the presence of moving obstacles when people and pets walk within close range.
Keeping it simple
While they appear straightforward to humans, many household tasks are too complex for robots. Industrial robots excel at repetitive operations in which the robot's motion can be preprogrammed. But household tasks are often unique to the situation and full of surprises that require the robot to constantly make decisions and change course in order to complete them.
Think about cooking or cleaning dishes. In the course of a few minutes of cooking, you might grasp a sauté pan, a spatula, a stove knob, a refrigerator door handle, an egg and a bottle of cooking oil. To wash a pan, you typically hold and move it with one hand while scrubbing with the other, and ensure that all cooked-on food residue is removed and then all soap is rinsed off.
There has been significant progress in recent years in using machine learning to train robots to make intelligent decisions when picking and placing different objects, meaning grasping and moving objects from one spot to another. However, training robots to master every type of kitchen tool and household appliance would be another level of difficulty, even for the best learning algorithms.
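As a rough illustration of what "making intelligent decisions when picking and placing" can mean at its simplest, the sketch below, assuming Python with NumPy and scikit-learn, trains a toy model to rank grasp candidates for a single object. The features, the synthetic training data, and the success rule are all made up for illustration; real systems learn from camera images and force feedback over huge numbers of trials.

```python
# Toy sketch of a learned pick-and-place decision: score candidate grasps with
# a tiny logistic-regression model. Features and training data are invented for
# illustration; real systems learn from images and haptic feedback.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Each grasp candidate: [gripper opening (cm), approach angle (rad),
# distance from the object's center of mass (cm)].
# Label: 1 = the grasp succeeded in simulation (toy rule below).
X_train = rng.uniform([2.0, 0.0, 0.0], [10.0, 1.5, 5.0], size=(200, 3))
y_train = (X_train[:, 2] < 2.0).astype(int)  # toy rule: near-center grasps work

model = LogisticRegression().fit(X_train, y_train)

# At run time, rank the candidates the perception system proposes for one object.
candidates = rng.uniform([2.0, 0.0, 0.0], [10.0, 1.5, 5.0], size=(5, 3))
scores = model.predict_proba(candidates)[:, 1]
best = candidates[int(np.argmax(scores))]
print("best grasp candidate:", best, "predicted success:", float(scores.max()))
```

Even granting that kind of learned policy, each new tool, appliance, and material in a kitchen changes what a "good grasp" looks like, which is why generalizing across them remains so hard.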
Not to mention that people’s homes often have stairs, narrow passageways and high shelves. Those hard-to-reach spaces limit the use of today’s mobile robots, which tend to use wheels or four legs. Humanoid robots, which would more closely match the environments humans build and organize for themselves, have yet to be reliably used outside of lab settings.
A solution to task complexity is to build special-purpose robots, such as robot vacuum cleaners or kitchen robots. Many different types of such devices are likely to be developed in the near future. However, I believe that general-purpose home robots are still a long way off.
Ayonga Hereid, Assistant Professor of Mechanical and Aerospace Engineering, The Ohio State University
This article is republished from The Conversation under a Creative Commons license. Read the original article.