Children Believe Robots More than Themselves, Study Suggests

A recent study led by Anna-Lisa Vollmer of Bielefeld University in Germany suggests that children are prone to prioritising answers given by robots over their own, even in cases where the robot companion is obviously wrong.

According to Dr Mandie Shean, an expert in education and child resilience, this could have decidedly dire consequences given how much time children spend in front of screens nowadays.

In the study, two groups of 7- to 9-year-old children (43 subjects in total) were asked to match a set of vertical lines on a computer screen by size. The control group completed the task on their own, while the intervention group worked in the company of robotic assistants.

Before being run with children, the same experiment was conducted on adults, who proved fairly resistant to “peer pressure” exerted by the robots.

The same, unfortunately, could not be said of the children: as many as 75% of the youngsters were found to be easily swayed by the robots, regardless of whether the proposed answer was correct.

“This raises opportunities as well as concerns for the use of social robots with young and vulnerable cross-sections of society; although conforming can be beneficial, the potential for misuse and the potential impact of erroneous performance cannot be ignored,” wrote the researchers in their paper.

New study raises concerns over children’s interactions with social robots. Image courtesy of University of Plymouth.

While most adults were introduced to computers later in life, after they had already acquired a degree of social competence and trust in other people, children growing up today don’t always have that luxury, which can lead them to see computers as the primary source of truth.

Rather than emphasising children’s level of confidence in themselves, Shean, who was not involved in the study, advocates for developing their critical thinking skills.

“Why do you believe that? What are you basing that on? What’s your evidence? I teach my students here: disagree with me, and just give me a reason why,” said Shean.

In conclusion, the researchers noted that autonomous social robots are no longer a thing of the future and that a serious discussion is therefore required regarding whether “protective measures, such as a regulatory framework, should be in place that minimise the risk to children during social child-robot interaction and what form they might take so as not to adversely affect the promising development of the field”.

Sources: study abstract, plymouth.ac.uk, abc.net.au, smithsonianmag.com.
