Robots are coming to take our jobs — and trick our gullible children.

The conventional wisdom has long feared the first half of that sentence, and now a study published Wednesday in the journal Science Robotics speaks to the second. The study paired a group of kids with cute, humanoid robots that consistently gave an incorrect answer on a simple test, which quite often led the kids to — well, follow the robots’ incorrect lead.

According to an abstract of the study, “People are known to change their behavior and decisions to conform to others, even for obviously incorrect facts. Because of recent developments in artificial intelligence and robotics, robots are increasingly found in human environments, and there, they form a novel social presence. It is as yet unclear whether and to what extent these social robots are able to exert pressure similar to human peers.”

The testers, however, used a group of children ages 7 to 9 and found that they generally conformed to the robots in the study. As the abstract puts it: “This raises opportunities as well as concerns for the use of social robots with young and vulnerable cross-sections of society; although conforming can be beneficial, the potential for misuse and the potential impact of erroneous performance cannot be ignored.”

To be sure, you can argue about how much weight to give this finding, since kids of a certain age will, to a degree, go along with anyone older than they are. Maybe that same tendency holds true when it comes to kids and our robot overlords.

Here’s how the study arrived at its conclusion. Three small, cute robots sat in front of two screens. On one of those screens was a single line, and on the other were three lines of different lengths. The task was to match whichever of the three lines was the same length as the single line on the other screen.

The robots went first and were programmed to give the wrong answer most of the time. The adults who took the test, notably, weren’t swayed. That is interesting in and of itself, since studies have long shown that humans will conform — even in error — to other humans, but that apparently doesn’t extend to machines.

The children participating in the study, meanwhile, followed the robots’ lead 75 percent of the time. As Bielefeld University’s Anna-Lisa Vollmer, lead author of the study, told The Verge: “We know something similar is going on with robots: Rather than seeing a robot as a machine consisting of electronics and plastic, they see a social character. This might explain why they succumb to peer pressure by the robots.”

It’s a good insight to have and to keep in mind as robots become more ubiquitous in society; after all, we’re already starting to acclimate kids to them. What if that trend continues to the point where we’re bringing robotic teaching assistants, or some kind of recreational robot, into the home for kids to play with?

The folks over at Futurism continue that thought: “Researchers have already created a number of social robots designed specifically to interact with children, from ones that help children with autism develop social skills to ones that interact with children in hospitals.

“While these robots are highly unlikely to encourage children to do anything that might cause them harm on their own, robots are machines, and people can hack machines. It’s not inconceivable that someone could use a social robot in the future to lure a child into a dangerous situation. Now that we know how susceptible children are to robot peer pressure, we know that it’s extra important to teach them to think for themselves.”