What would your ideal robot be like? One that can change nappies and tell bedtime stories to your child? Perhaps a butler that can polish silver and mix the perfect cocktail? Or maybe a companion that just happens to be a robot? Certainly, some see robots as a hypothetical future replacement for human carers. But a question roboticists are asking is: how human should these future robot companions be?
A companion robot is one that is capable of providing useful assistance in a socially acceptable manner. This means that a robot companion’s first goal is to assist humans. Robot companions are mainly developed to help people with special needs, such as older people, autistic children or people with disabilities. They usually aim to help in a specific environment: a house, a care home or a hospital.
At the beginning of the 20th century, one of the first pieces of technology designed to help in a household environment was the vacuum cleaner. Since then, technology has transformed the home. Nowadays, we even have a robot that can cook. The chef robot was developed by Moley Robotics, a start-up company that won the 2015 Asia Consumer Electronics Show. The robot is said to be able to cook 2,000 different meals.
Pepper, the latest robot from Aldebaran Robotics, is a good example of a humanoid robot companion. It can provide assistance in making choices, detect human facial expressions and communicate with people. Pepper can adapt its behaviour depending on its perception of a person’s mood, and in this sense we can say that Pepper cares for people. At the moment, only research institutes and Japanese residents can acquire a Pepper robot. The robot costs around £8,710 for a Japanese customer.
Companion robots can take the form of pets, too. Paro is a robotic seal developed to provide comfort to older people. And rather than taking care of you, this robot has to be taken care of – that is how Paro provides emotional support.
Sometimes people get attached to robots that are not actually made for companionship. Take Roomba, for example, the intelligent vacuum cleaner. In their studies, Ja-Young Sung and colleagues from the Georgia Institute of Technology found that people wanted to become tidier in order to allow the vacuum cleaner to run smoothly.
Although many of these robots show some form of initiative and encourage people to interact with them, many are responsive rather than active – in other words, the robot waits for a human request before acting.
Should robots be more ‘human’?
Thanks to progress in Artificial Intelligence and technology, we can now develop more intelligent systems that are capable of acting very much like a human. Last year, a few of them were presented to the public, such as Nadine, the robot receptionist, Yangyang, the singer robot, and Aiko Chihira, the robot that can communicate in sign language.
Although the popular and controversial Turing test is used in AI to measure whether a machine is as intelligent as a human, it is a very different matter when it comes to robots, since robots are also expected to act intelligently. There is not yet a standardised test to determine how human a robot is, though one may come in the near future. However, robotics researchers seem to agree that such a robot would have to show some social awareness and personality, and be capable of understanding and recognising people’s speech and expressions.
But do we want robots to have more personality and to be able to take more initiative? Ultimately, to act more like us? Some may argue yes. If intelligent vacuum cleaners were able to differentiate a sleeping human from objects, for example, at least one unfortunate lady in South Korea wouldn’t have had her hair “eaten” by her new domestic appliance.
But others argue that it is dangerous to give robots too much intelligence. Would it allow them to answer back? Research into the potential consequences is still in its early stages. Indeed, the scientific community is still debating whether a robot can ever have feelings or be self-conscious. Although AI has been able to perform certain tasks extremely skilfully – AlphaGo, for example – the community is still a long way from developing an AI that closely resembles the human mind.
At present, robot companions are either focused on companionship or on task-execution. Jibo, for example, is a social robot that can talk, order food, remind you of things, or take pictures, while Roomba is an intelligent, but ultimately functional, vacuum cleaner.
Who is in charge?
But it’s about striking the right balance, depending on the job at hand and the person the robot is working for. Our most recent study, “Who is in charge? Sense of control and robot anxiety in human-robot interaction”, showed that the more controlling and anxious about robots a person is, the more initiative they expect the robot to show and the more willing they are to delegate tasks to it. The research focused specifically on what level of initiative people preferred their robot companion to have when executing a cleaning task.
Participants could choose between manually turning on the cleaning robot themselves, having their robot companion turn on the cleaning robot remotely when instructed, or having the robot companion turn on the cleaning robot when it noticed that cleaning needed to be done. It was found that most people wanted their robot companion to execute the task without being asked.
This paradoxical result may be explained by the fact that people are now more used to technology – from computers and smartphones to smartwatches and intelligent home appliances – acting semi-autonomously. Smart companion robots are just the next step in the long evolution of our relationship with technology.
In the future, it is likely that we will see more domestic robot companions that can be customised to people’s individual preferences. And we will be able to shop for them as we now shop for vacuum cleaners and phones. Ultimately, it seems, there will be a robot for everyone.
- PhD Researcher in Human Robot Interaction, University of Hertfordshire
- This article first appeared on The Conversation