Researchers have tortured robots to test the limits of human empathy

In 2015, a jovial three-foot-tall robot with pool noodles for arms set off on what seemed like a simple mission. Relying on the kindness of strangers, the machine, called “hitchBOT,” would spend months hitchhiking across the continental United States. It made it only about 300 miles. Two weeks into the trip, hitchBOT was found abandoned on the streets of Philadelphia, its head severed and its noodle arms ripped from its bucket-shaped body.

“It was a failure and we didn’t really expect it,” hitchBOT co-creator Frauke Zeller told CNN at the time.

The premature dismantling of hitchBOT is not a unique case. For years, people have relished opportunities to strike, punch, trip, crush, and run over anything remotely resembling a robot. This penchant for robot violence could go from funny to potentially worrisome as a new wave of humanoid robots is built to work alongside people in manufacturing facilities. But a growing body of research suggests that we are more likely to feel sorry for our mechanical assistants, and even go easy on them, if they make human-like sounds of pain. In other words, hitchBOT might have fared better if it had been programmed to beg for mercy.

People feel guilty when robots cry

Radboud University Nijmegen researcher Marieke Wieringa recently conducted a series of experiments looking at how people reacted when asked to violently shake a test robot. In some cases, participants shook the robot and nothing happened. At other times, the robot emitted a pitiful wailing sound from a pair of small speakers or widened its “eyes” to convey sadness. The researchers say participants were more likely to feel guilty when the robot gave emotion-like responses. In another experiment, participants were given the option of either performing a boring task or shaking the robot. Participants were more than willing to shake the robot when it did not respond. When it cried out, however, participants chose to complete the boring task instead.

“Most people had no problem shaking a silent robot, but as soon as the robot started making pitiful sounds, they chose to do the boring task instead,” Wieringa said in a statement. Wieringa will defend the research as part of her PhD thesis at Radboud University in November.

These findings build on previous research showing that we tend to treat robots more kindly when they appear to exhibit human-like tendencies. Participants in one study, for example, were less inclined to hit a robot with a hammer if the robot had a backstory describing its supposed personality and experiences. In another case, test subjects were friendlier to humanoid robots after using a VR headset to “see” from the robot’s perspective. Other research suggests that humans may be more willing to empathize with or trust robots that seem to recognize their emotional state.

“If a robot can claim emotional distress, people feel more guilty when they mistreat the robot,” Wieringa added.

The many ways humans have abused robots

Humans have a long history of taking out our frustrations on inanimate objects. Whether it’s parking meters, vending machines, or broken toaster ovens, people have long found themselves bizarrely attributing a human-like hostility to everyday objects, a phenomenon writer Paul Hellweg refers to as “resistentialism.” Before more modern conceptions of robots took hold, people could be seen attacking parking meters and furiously shaking vending machines. As machines have become more complex, so have our methods of destruction. This penchant for destroying robots was perhaps best encapsulated by the popular 2000s television show BattleBots, where hastily assembled robots were repeatedly slashed, shredded, and set on fire in front of roaring crowds.

Now, with more consumer-grade robots roaming the real world, some of these exuberant attacks are taking place on city streets. Autonomous vehicles operated by Waymo and Cruise have been vandalized and had their tires slashed in recent months. A Waymo vehicle was even burned to the ground earlier this year.

In San Francisco, local residents reportedly knocked down an egg-shaped Knightscope K5 patrol robot and smeared it with feces after it was deployed by a local animal shelter to monitor homeless people. Knightscope previously told Popular Science that an intruder fleeing a health center intentionally hit one of its robots with his vehicle. Food delivery robots currently operating in several cities have also been kicked and vandalized. More recently, a roughly $3,000 AI-powered sex robot showcased at a tech fair in Austria had to be sent in for repairs after event attendees left it “very dirty.”

But perhaps the most famous examples of robot abuse come from Hyundai-owned Boston Dynamics. The company has created what many consider some of the most advanced quadrupedal and bipedal robots in the world, in part by subjecting them to countless hours of abuse. Popular YouTube videos show Boston Dynamics engineers hitting its Spot robot and harassing its Atlas humanoid robot with weighted medicine balls and a hockey stick.

Research attempting to understand the real reasons why people seem to enjoy abusing robots has turned up a mixed bag. In higher-stakes cases like autonomous vehicles and factory robots, these automated tools can act as a reminder of potential job losses or other economic hardships that may arise from a world marked by automation. In other cases, however, researchers such as Italian Institute of Technology cognitive neuroscientist Agnieszka Wykowska say the non-humanity of machines can trigger a strange kind of tribal, dehumanizing out-group response.

“You have an agent, the robot, that is in a different category than humans,” Wykowska said in a 2019 interview with The New York Times. “So you probably engage very easily in this psychological mechanism of social ostracism because it’s an out-group member. It’s something to talk about: dehumanizing robots even if they’re not human.”

However, our apparent inclination to mess with robots could become more complicated as they become more integrated into public life. Makers of humanoid robots like Figure and Tesla envision a world where upright, bipedal machines work side-by-side with humans in factories, run errands, and maybe even take care of our children. All of these predictions, it’s worth noting, are still very much theoretical. The success or failure of those machines, however, may ultimately depend in part on tricking human psychology into making us empathize with a machine as we would with a person.