If there are 25 goats and 10 sheep on board a ship, how old is the captain? A while back researchers discovered that if you ask eight-year-old children this question, three quarters of them will answer without hesitation: “25+10 = 35, so the captain is 35”.
At the time this generated some concern about a school system in which children seemed to learn tricks but not any degree of critical thinking. Fortunately, more in-depth research subsequently showed that pupils two years older could see that the question was impossible to answer.
‘The more data, the better’
I would love to say that, by extension, this applies to TU Delft students and that they would never fall into a trap like this. But to be honest, not too long ago during an exam I found myself working out random matrix decompositions for a question that, in the end, turned out to contain a typo. Apparently this falls under the ‘didactic contract’, which, if I understand it correctly, means that it was not my fault.
I often think about this research when I hear the enthusiasm with which the subject of artificial intelligence is talked about at TU Delft. The questions that we want to ask intelligent computers often resemble the question about the captain: here is some data, here is a question that may be related to the data, just try it.
They often do generate some good answers, which explains the enthusiasm. The more data, the better, which is why everyone was talking about Big Data a while back. That was until the Snowdens and Zuckerbergs of this world gave the term a bad name. Apart from being rebranded as ‘artificial intelligence’, the technology does not seem to have suffered much.
‘A computer has an even less critical view than a child of eight’
But it goes without saying that a computer has an even less critical view than a child of eight. If you don’t do it just right, you will quickly teach your computer to recognise snow, while what you are really trying to do is distinguish Siberian Huskies from wolves. Or, a perhaps less harmless example, a tech giant whose facial recognition software labels black people as gorillas. Or you create an algorithm that decides who has a right to care or where the police should be sent, and in doing so unintentionally widen the inequality gap even further. Hopefully none of these examples was intentional, but they have all happened in reality.
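The snow trap can be sketched in a few lines of Python. This is a toy illustration with entirely made-up data, not the actual study: a “classifier” that only ever looks at the background scores impressively on a biased dataset, where wolves almost always pose on snow, and collapses to coin-flipping the moment the backgrounds are balanced.

```python
import random

random.seed(0)

# Made-up toy data: each "photo" is a pair (animal, background).
def make_photos(n, snow_bias):
    photos = []
    for _ in range(n):
        animal = random.choice(["husky", "wolf"])
        # With snow_bias=0.95, wolves are almost always on snow
        # and huskies almost always on grass.
        p_snow = snow_bias if animal == "wolf" else 1 - snow_bias
        background = "snow" if random.random() < p_snow else "grass"
        photos.append((animal, background))
    return photos

# A "classifier" that never looks at the animal, only at the background.
def predict(background):
    return "wolf" if background == "snow" else "husky"

def accuracy(photos):
    return sum(predict(bg) == animal for animal, bg in photos) / len(photos)

biased = make_photos(1000, snow_bias=0.95)    # wolves almost always on snow
balanced = make_photos(1000, snow_bias=0.5)   # background carries no signal

print(accuracy(biased))    # high: the snow shortcut "works" here
print(accuracy(balanced))  # near chance: it learned snow, not wolves
```

The uncomfortable part is that, judged only by its score on the biased data, the shortcut looks like a perfectly good model.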
Children these days no longer learn to do sums in their heads. Partly because there hardly seem to be any teachers left. When I was at school, the protest ‘Teacher, there are calculators for this!’ was quickly quashed with ‘Yes, but you won’t always have a calculator with you, will you?’. That argument no longer holds. Let’s hope that, at the very least, the ability to think critically survives. We will need it very badly if artificial intelligence really is the future.
Noor van Driel is about to graduate in Systems & Control (Faculty of 3mE). She enjoys pub quizzes, but has never won.