FRANCESCO RODELLA | Tungsteno
According to a recent study, our trust in robots increases if they occasionally make mistakes or even try to trick us. Indeed, one of the traditional obstacles to introducing these machines has been their limited ability to relate to people. As the presence of robots in society expands, we must learn to live with them, which highlights the need for innovative solutions to a complex problem: how to make them empathise with us.
On the web it is easy to find explanatory articles and videos about NAO, a little robot about the height of a child that laughs, tells stories and launches itself into wading pools filled with balls. Developed in the 2000s by the French company Aldebaran Robotics, it is often used to support educational and health initiatives. It is also employed by researchers from different fields to explore interactions between machines and humans, which has helped us understand that creating artificial systems capable of relating to us is not just a matter for engineers.
In fact, robotics is facing a new challenge, as complex as, or even more so than, the spectacular advances made in recent decades in artificial intelligence, robotic motor skills or the humanlike appearance of robots. Miguel A. Salichs, a professor at the Charles III University of Madrid and a specialist in social robotics, the field that studies and develops service robots such as NAO, explains that the idea is to create machines that can interact with anyone. The robot, he adds, has to do this naturally, and for us what is natural is to interact as we do with other humans. "To develop social robots, we need to involve psychologists, psychiatrists, therapists, geriatricians and caregivers in the user testing activities and the subsequent evaluation of the results," he explains.
Studies carried out with robots like NAO, in the image, show that the expression of emotions or the recognition of errors improves interaction with people. Credit: HRI Group.
Two conflicting viewpoints
In this regard, beyond their efficiency, our trust in robots depends on their ability to relate to us, that is, on their social skills. However, when we interact with robots, our attitudes vary according to our expectations, as Joffrey Becker, an anthropologist at the Collège de France, points out: some people are deeply disappointed by robots' lack of autonomy, while others do not want to interact with them at all. In general, though, he adds, robots arouse curiosity.
Thus, in our interactions with robots, we have traditionally held two viewpoints: either they are a help, or they represent a threat. The opportunity to have an electronic assistant that makes our lives easier by taking on boring, unpleasant or tiring tasks is one of the aspects that most attracts us to them. On the other hand, "the fear that one day robots will be better than us, outperform us and then replace us in our jobs or perhaps even become the dominant population on earth, is an element that can generate scepticism or repulsion," says Aike Horstmann, a specialist in social psychology at the University of Duisburg-Essen in Germany.
Another difficulty is related to expectations. Most people expect robots to be more advanced than they are, and their limitations generate rejection. At the same time, "their fears that were probably highly influenced by science-fiction scenarios such as in Terminator or I, Robot, seem to be fading," Horstmann concludes. These technological limitations mean that apparently very simple tasks, such as manipulating an object, are still enormously complex challenges for robots, says Salichs. "We have the idea that what is easy for humans to do is also easy for a machine," he says. However, in his opinion, robotics in general is still in its infancy and has many problems left to solve.
Beyond their appearance or abilities, the challenge is to create robots that do not have standardised behaviour but are capable of adapting to each person. Credit: Wikimedia Commons.
A question of trust... and adaptation
Making robots more human, capable of understanding us and conversing with us, is one of the most obvious challenges. We are still far from the scenario illustrated in the film Her, dominated by the intimate conversations between Theodore and his voice assistant, Samantha. But according to Horstmann, putting people in direct contact with the reality of today's robots is key to generating more trust on the part of users. After interacting with NAO, she notes, people are usually quite enthusiastic.
Becker highlights another perspective: "I don't think it's that we don't trust machines," he says. "I prefer to think that sometimes we don't blindly trust those who build them, which is a good reaction." In his opinion, one of the problems in interacting with machines is evaluating their objectives. He therefore suggests that when we come into contact with robots, it is important to ask why they were created and how we can use them, and then to test their use accordingly.
It should also be noted, according to Horstmann, that "the size and overall appearance of robots, along with how they behave and what they tell people, make a big difference in the perception we might have when interacting with them." Therefore, says Salichs, the way to create social robots that are useful for everyone will be to ensure that, instead of exhibiting standardised behaviour, they are able to adapt to each particular user and to his or her tastes and needs.