What is artificial intelligence? Where do you encounter it, and how smart is a computer really? These questions take centre stage during an interactive lesson by UMCG researcher Mirjam Plantinga, who, together with Bas Altenburg from 8D Games, is visiting primary school De Mienskip in Buitenpost. Together, they introduce pupils to the world of artificial intelligence (AI).
‘I’m a researcher at the UMCG. Perhaps not in the way you might expect, because I don’t study diseases or the human body,’ Plantinga begins the lesson. ‘I mainly look at how we can use smart technology effectively in hospitals.’
She explains what artificial intelligence is: computers that learn from data and recognise patterns. ‘In healthcare, for example, it can help with tasks that normally take a lot of time,’ she says. ‘Think of analysing X-rays or sorting through large amounts of patient information. That way, doctors can get a clear overview more quickly and focus on what really matters: helping patients.’
An escape room in the hospital
Together with the team at 8D Games, Plantinga developed a serious game for primary schools about the use of AI in healthcare. Altenburg kicks off the game with a clear explanation. ‘Okay, listen carefully. You’re about to help a hospital. The hospital is called VITAI, and a lot is going wrong there. Computers are giving the wrong advice, information isn’t always correct, and doctors aren’t sure what to do. It’s up to you to solve the problem.’ The first challenge appears on the screen.
‘Only if you work together and share everything with each other will you get further.’
‘You’ll be working in teams. Half of you will be on the laptops, while the other half will receive paper assignments. But be careful: no one has all the information. Only if you work together and share everything with each other will you get further.’ The pupils immediately pull their chairs closer together. There is whispering, pointing and enthusiastic discussion.
During the escape room activity, pupils work together to crack codes and help the hospital VITAI.
Panda… or not?
Four pictures of pandas appear on the screen. Or rather… three pandas. In one image, there is a person wearing a panda costume. ‘That one doesn’t belong!’ a pupil immediately calls out. But it turns out that an AI system sometimes identifies the disguised person as a panda as well.
Plantinga explains: ‘AI learns from examples. If a system has mostly seen images with black-and-white shapes and round eyes, it might think: this looks similar enough to what I know. But it doesn’t actually know what a panda really is.’ ‘You can compare it to a very fast sorting machine,’ Altenburg adds. ‘It only looks at similarities. It doesn’t think the way you do.’
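The ‘fast sorting machine’ idea can be sketched in a few lines of code. This is a toy illustration, not the system used in the game: the feature names and numbers are invented, and the classifier simply picks whichever known example looks most similar, so a convincing costume gets labelled as a panda.

```python
# Toy similarity-based classifier (illustrative only; all feature
# values are invented). Each image is reduced to three scores in
# [0, 1]: black-and-white colouring, round eyes, fur-like texture.

def similarity(a, b):
    """Higher when two feature vectors look alike (inverse squared distance)."""
    return 1 / (1 + sum((x - y) ** 2 for x, y in zip(a, b)))

# Labelled examples the system has "seen" during training.
examples = [
    ((0.90, 0.90, 0.90), "panda"),
    ((0.95, 0.80, 0.85), "panda"),
    ((0.20, 0.30, 0.10), "not a panda"),  # e.g. a person in ordinary clothes
]

def classify(features):
    # Pick the label of the most similar known example --
    # the machine only compares, it never "understands".
    return max(examples, key=lambda ex: similarity(features, ex[0]))[1]

# A person in a panda costume shares the surface features of a panda,
# so the classifier confidently gets it wrong.
costume = (0.85, 0.90, 0.80)
print(classify(costume))  # -> panda
```

The point of the sketch is the one Altenburg makes: nothing in `classify` knows what a panda is; it only measures how close the numbers are to numbers it has seen before.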
Which picture doesn’t belong?
‘We have one of those at home!’
During one of the tasks, pupils are shown several examples on the screen. Their task: decide whether AI is involved or not. Pointing at a thermometer, one pupil calls out: ‘That’s definitely not AI, we have one of those at home!’ ‘But the checkout scanner really uses AI, doesn’t it? It can’t just recognise products by itself.’ ‘Facial recognition on phones? That must be smart!’ ‘What about self-driving cars?’
What first seemed simple turns out not to be so easy after all. It leads to plenty of questions and surprise. Because how do you actually know whether something is ‘smart’? The pupils discover that AI is often hidden in devices they use every day: at home, in the supermarket or out on the street, sometimes in ways they hadn’t expected.
Why start in primary school?
Why is it important to explain this to children? ‘Everyone is a patient at some point,’ says Plantinga. ‘And these pupils might become the doctors of the future. On top of that, many of them are already using AI tools themselves. It’s important that they understand how they work.’
She notices that many people think AI always tells the truth. Others see it as something frightening that might take over everything. Talking about it together helps create understanding and encourages people to ask more critical questions. When it comes to difficult decisions in healthcare, such as treatments or quality of life, human judgement remains essential. Doctors must also be able to explain how a system arrives at its advice. This is known as ‘explainable AI’. ‘A doctor should also understand how a model was created,’ Plantinga says. ‘Which data was used? According to which rules? And can I explain that to my patient?’
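The questions Plantinga lists — which data, which rules, can I explain it — can be made concrete with a small sketch. This is a hypothetical scoring rule with invented feature names and weights, not a real clinical model; it only shows what ‘explainable’ means in practice: every part of the advice can be traced back to a visible data point and a visible rule.

```python
# Hypothetical linear scoring rule (invented weights and features,
# purely illustrative). Because the model is a simple weighted sum,
# each feature's contribution to the final score can be shown.

weights = {"age": 0.02, "blood_pressure": 0.01, "smoker": 0.5}

def explain(patient):
    # Per-feature contribution: weight x value, so a doctor can see
    # exactly which data and which rule drove the advice.
    contributions = {f: weights[f] * patient[f] for f in weights}
    score = sum(contributions.values())
    return score, contributions

score, parts = explain({"age": 60, "blood_pressure": 120, "smoker": 1})
print(score)  # 1.2 + 1.2 + 0.5 = 2.9
for feature, value in parts.items():
    print(feature, round(value, 2))
```

A deep neural network would not offer this breakdown for free, which is exactly why explainability is an active research topic in healthcare.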
What stays with them?
During the game, pupils enthusiastically shout: ‘See, that’s wrong!’ But another classmate calmly says: ‘We all make mistakes sometimes, that’s okay.’ That may well be the most important lesson of the afternoon. People make mistakes, but we can also understand why something went wrong: we can reflect, adjust and choose differently. We think, weigh things up and make conscious decisions. AI can make mistakes too, but a computer doesn’t understand what it is doing and cannot judge whether its outcome is correct. One pupil sums it up neatly: you have to look carefully at how you use AI and whether what the system says is actually right.
The real thinkers
At the end of the lesson, the pupils are asked what grade they would give the AI lesson. Hands shoot up: ‘A nine!’ ‘A ten!’ A great result, but the real message is clear: technology is powerful and useful, but thinking critically and judging the results remains human work. AI is a tool. We humans are the real thinkers.