
Pedagogical robots and AI in the classroom

Major advances in robotics are rapidly expanding the range of applications. Sofia Serholt is looking into how robots interact with children in the classroom.

As more and more technology is used in classrooms, we need to get an overview of the consequences, but also to start more from the educational purpose – what do we want to achieve?

Sofia Serholt
Photo: Catharina Jerkbrant

Sofia Serholt received her PhD in 2017 with the dissertation Child-Robot Interaction in Education, in which robots were tested in classroom environments as a support for teachers. The studies were conducted within the EU project EMOTE, which also included developing robots suitable for teaching.

Empathetic robots with the capability to interact

Apart from instructing the children, the robots were also meant to be empathetic, able to adapt in different ways to the difficulties the children encountered in the classroom. The aim was to see whether a relationship could form between the children and the robot that would facilitate the learning process.

Sofia Serholt entered the project with a background in IT pedagogy, as a teacher of English and Applied Information Technology, contributing the pedagogical perspective to the development of the robot. After a comprehensive design process, in which several teams around Europe worked on different parts of the development, Sofia conducted tests with a robot in a classroom environment. The research studies followed two different tracks.

- The first track was about how the children interacted with the robot compared with a human being: whether they were willing to follow instructions, how they responded socially, how they looked at the robot, whether they answered questions or used gestures to communicate with the robot, things like that.

Sometimes the interaction works well, sometimes not so well

The study showed that the children responded to the robot's social communication, for example when the robot said hello and when the children received praise. They also answered questions from the robot, even though the social response decreased slightly over time. The children were willing to listen to instructions from the robot, but unlike with the human teacher, they did not ask the robot for help when they did not understand the instructions. This problem led to the second part of the dissertation, which focuses on so-called breakdowns: cases where the interaction for some reason ceased to work.

- My study in some sense started with these problems, which I could see existed but which are rarely reported. Looking at the research, you can see that there are often some children in a study who are left out of the later analysis, and that is probably quite often because something went wrong.

What does it mean for the children to participate?

Sofia Serholt focused her analyses on the perspective of the users – the teachers and students who participated in the research studies. For the teachers, it was partly about how they view the practicalities of having AI in their classroom, but Sofia also conducted separate studies on ethical problems, such as how teachers more generally perceive the issue of privacy. From the children's perspective, it was about how they are influenced by interacting with technology in this way.

- The children usually do not have the ability to look under the hood of the robot and understand what it is doing. They tend to think that the robot is very smart, and if the interaction does not work – if it is not "going well" for me, for example – whose fault is it?

Some teachers in the studies have also pointed out that children may have difficulty foreseeing what it will mean in the future that a robot has read and stored, for example, their facial expressions – that information about what triggers a person is available, and that someone can later analyse an individual using that material. Sofia Serholt thinks there is a risk that these factors are left in the dark when new technologies are launched.

More empirical research is needed

- We can see that there is a risk when we talk about machine learning, for example. It is a very useful technology that will help us create great things, but as it becomes easier and more accessible, ethical issues arise that we need to take a stand on. For example, we assume that children are OK with being recorded, partly because they are young, partly because we feel that they are open to sharing everything about themselves. The children's perspective, meanwhile, is that they trust the adults to take responsibility and be on hand to sort out difficult questions.

Researchers who have previously done extensive work on AI and children include Sherry Turkle and Amanda Sharkey, but their work focused on hypothetical and philosophical reasoning rather than field studies. Sofia Serholt believes that complementary empirical research on full implementations is needed.

- The usual thing is that you implement things because you believe in them, and then it becomes difficult to criticise your own product, but in this case I think it is necessary. It is a major responsibility we have, and as more and more technologies are used in classrooms, we must get an overview of the consequences, but also start more from the educational point of view – what is it that we want to achieve with the technology?

New project where the robot will be educated by the children

Sofia Serholt now participates in a project at University West where the robot, instead of being the teacher, is the one being taught. The children help the robot improve in a mathematics game, and the idea is that they will develop their own knowledge of mathematics through "learning by teaching".

- We are looking at what is known as the protégé effect, which means that you put more effort in if you are teaching a friend before an exam than if you are just studying on your own. The hypothesis we are testing is based on the idea that there is a driving force in teaching other people and that we also want them to succeed.
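To make the "learning by teaching" idea concrete, here is a minimal, purely hypothetical Python sketch of a teachable-agent loop of this general kind: the robot attempts a maths problem, the child confirms or corrects the answer, and the robot's modelled skill improves with each correction. The class name, skill model and update rule are illustrative assumptions, not taken from the University West project.

```python
import random

class TeachableRobot:
    """Hypothetical teachable agent: it attempts maths problems,
    and the child's corrections raise its modelled skill."""

    def __init__(self, skill=0.3):
        self.skill = skill  # probability of answering correctly

    def attempt(self, a, b):
        # A low-skill robot often answers wrongly,
        # inviting the child to step in and teach it.
        if random.random() < self.skill:
            return a + b
        return a + b + random.choice([-2, -1, 1, 2])

    def receive_feedback(self, was_correct):
        # Illustrative update rule: each correction nudges the skill upward.
        if not was_correct:
            self.skill = min(1.0, self.skill + 0.1)

robot = TeachableRobot()
for _ in range(5):
    a, b = random.randint(1, 9), random.randint(1, 9)
    answer = robot.attempt(a, b)
    correct = answer == a + b
    print(f"Robot: {a} + {b} = {answer}",
          "(child confirms)" if correct else "(child corrects)")
    robot.receive_feedback(correct)
```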

It is the companies that develop new products and put them on the market, which requires research resources that are not always available in academia.

- We are investigating different concepts instead. That is what I see as the duty of research: trying to anticipate the products to see what consequences they could have.

SOFIA SERHOLT

Senior lecturer at the Department of Applied IT
Division of Learning, Communication and IT