
“The PedoBot is equipped with therapeutic software, tailored to the individual needs and risk profile of the client”

Social robots will gain a major role in the society of the future, writes Jos de Mul in his book Kunstmatig van nature, onderweg naar homo sapiens 3.0. Pieter Jonker, professor of vision-based robotics at the Faculty of 3mE, assesses how realistic that vision is.

‘2024 – attitudes towards paedophilia have changed drastically in recent decades. Whereas in the second half of the twentieth century there was greater tolerance, that tolerance saw a sharp decline in the first decades of the twenty-first. This was reflected not just in terms of the legal penalties involved. After being released from detention, paedophiles were increasingly hounded out of the areas they lived in, thereby reinforcing their social isolation and making it more likely that they would re-offend, even after undergoing therapy.

For this reason, experiments were carried out in the Netherlands from 2009, in the wake of those in Canada and the then-United States, involving intensive and long-term supervision by buddies, who in many cases were friends from paedophiles’ immediate environment. Even though this supervision considerably reduced re-offending rates, finding suitable buddies proved to be a bottleneck when it came to applying the scheme on a large scale. The first experiment using a Japanese PedoBot®, an affective, android robot specially designed for this buddy task, was launched in the Netherlands in 2018 in response to this need. The aim of this social robot is to prevent re-offending. The robot functions not just as a buddy, but also monitors the movements of the client 24 hours a day, as well as his emotional moods. If necessary, the PedoBot can perform therapeutic interventions. It is equipped with therapeutic software, tailored to the individual needs and risk profile of the client. If the client shows undesirable behaviour and psychotherapeutic and neurotherapeutic interventions do not help, the PedoBot informs the police, passing on the client’s geographical and emotional coordinates.

One of the most striking findings of the PedoBot experiment is the strong emotional attachment on the part of the clients. According to experts, this is attributable not so much to a variant on the Stockholm Syndrome, but rather to the advanced empathic capacities of the PedoBot. As a result, the aggression that had been feared, of the kind that occurs towards the RiotBots used for the last few years to keep demonstrators and football supporters in check, has not materialised in the case of the PedoBot. Moreover, there has been almost no legal action on the grounds of invasion of emotional privacy – in contrast to the situation that followed the introduction of the TerroBots after the attack with the modified Spanish Flu virus at Charles de Gaulle Airport in 2015. And indeed, the fears of dehumanisation that arose in 2016 after the Minister for Security and Prevention, Teevers, introduced the JailBots in prisons during the first Wilders government proved unfounded. On the contrary, in the case of both the JailBot and the PedoBot, the criminals/clients developed feelings of friendship or even love towards their robots, although experts prefer to talk of a ‘para-affective attachment’, on account of the non-mutual nature of such relationships.

Whereas the introduction of the PedoBot in 2018 was broadly welcomed by society, the proposal in 2020 to replace virtual child pornography with pedoid robots met with substantial resistance. This was in spite of the particularly positive experiences with the synthetic forms of child pornography that were legalised in 2015 following an amendment to Article 240b of the Criminal Code, which had led to a spectacular decrease in child abuse. Although the experiment with the PedoBot Junior (soon dubbed the ‘LolitaBot’ by the general public) promises a further reduction according to experts, critics believe that these highly realistic child robots blur the distinction between actual and virtual abuse and undermine people’s ability to exercise moral self-control. For this reason, Professor of Ethics Harry Bouhari, a prominent member of the Christian-Islamic Appeal party, advocated in vain for a blanket ban on android robots.

Bouhari’s plea is not unrelated to other problems relating to social robots. There was much controversy in 2019, for example, after a criminal gang used a hacked android RiotBot in a bank robbery in which three people died. In particular, the fact that the gang had succeeded in bypassing Asimov’s First Law of Robotics (‘a robot may not injure a human being or, through inaction, allow a human being to come to harm’) – previously considered inviolable – led to considerable anxiety. The investigation into Shanghai Persuasive Technologies, which revealed that the company had been supplying Memory Morphing technologies (with which TeleTouch Smartphones can be used to activate unconscious desires in the brain of the user) to European companies on a large scale, also added to the growing criticism of the further development of robotic modifications of people. And of course there is the nightmare image of the ‘safeguarding of the individual’, of the kind that has been taking place in North Korea since 2019, whereby a Juche neurochip is implanted into the brain of every new-born baby. In the United States of Northern Europe, these developments have led to strict legislation in relation to persuasive neurotechnologies.

Meanwhile, the Christian States of America (CSA) went so far as to ban affective and persuasive technologies completely in 2021. Following the disastrous experiments involving corrective genetics in Russia, the governing Tea Party is urging President Alvarez to ban this form of human genetics as well. According to American experts, PedoProfiling is at least as effective, but because of the high costs of the preventive internment of potential paedophiles, there are calls in the CSA for paedophiles and other categories of pathological mutants to be exiled to Mars. Now that the first flight to the planet has been such a success – thanks to the De Mol Entertainment Struggle for Life Show – this is just one of many plans for the commercial exploitation of Mars. However, as large-scale emigration to the planet is unlikely to become a reality for the time being, owing to the worldwide success of synthetically produced foodstuffs and fuel, urgent efforts are underway to find new revenue models. In its response to these plans, the World Council has warned that, as a result of the American paedophile deportation programme, Mars could become the ‘Australia of the twenty-first century’, not least because the inadequate AI legislation governing extraterrestrial robots is giving the World Council much cause for concern.

The Netherlands, too, has experienced the uncontrollability of robots equipped with biochips. One of the most shocking examples took place in 2019, when a malfunctioning PedoBot Junior sexually abused a group of toddlers in Nijmegen. This brought the existing differences between Hubbies and Real People, as proponents and opponents of social robots are referred to (after the successful Swedish science-fiction series, Real Humans, which was aired between 2013 and 2018), even more sharply into focus. The public ‘disassembly’ of the perpetrator, which followed the court case brought against Samsung by the parents of the abused children, led to violent demonstrations against this ‘anti-robot’ penalty in Seoul, Tokyo, and other Asian cities.

However, it is unlikely that EmoBots will be banned in the United States of Northern Europe. One reason for this is the overwhelming success of the Korean android LoveBots, which became popular in the second half of the last decade in the illegal prostitution and pornography industry, but which more and more people are choosing as their life partner, as in the Democratic Han Republic. And following on from Japan, where this custom emerged at the start of the millennium, the first funerals of robot partners have taken place in the Netherlands.

It is therefore hardly surprising that the Party for Robots, which was founded in 2019, backed the inclusion of robot rights in the Northern European Constitution in its election campaign. However, many people doubt that things will go as far as they have in Japan, where the first android robot was elected to parliament in 2020. The recently enacted restrictive legislation regarding the use of test robots in cyboneurological research also makes clear that since the introduction of affective robots, there has been a gradual shift in how we deal with intelligent artefacts.’

Excerpt from the book Kunstmatig van Nature, Onderweg naar Homo Sapiens 3.0 [Artificial by nature, towards Homo Sapiens 3.0]

© 2014 Stichting Maand van de Filosofie and Jos de Mul. Published in cooperation with Lemniscaat.

Response by Pieter Jonker

Jos de Mul describes a view of society in 2024 in which social robots play a major role. In doing so, he assumes that both the physical body and the artificial intelligence of the robots will have reached such a level that they are not only capable of surviving in a world inhabited by people, but also of understanding people and even influencing them to a significant degree.

But what is the current state of research? As regards the physical body of robots, we can make robots of 1.40 metres in height that walk, and which can even balance themselves on one leg. However, it will take another ten to twenty years to create a dynamic balancing virtuoso like Johan Cruijff, because the engineering problems that this poses are still too complex. Nor are the batteries or motors adequate: they weigh too much in relation to the capacity they deliver.

We can teach robots types of behaviour – like walking, going up and down stairs, standing up – through rewards and punishments, and we can even do so hierarchically, such as with a moving robot that has taught itself the basic skills of dribbling and shooting a ball, and which can also decide when to stop dribbling and actually shoot at the goal. However, learning complex and refined motor skills is still a bridge too far, on account of the complexity involved and the limitations of current algorithms. This, too, will take another ten to twenty years.
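The idea of teaching behaviour through rewards and punishments can be illustrated with a minimal tabular Q-learning sketch. This is not the software of any robot mentioned here; the corridor world, parameters, and reward values are all illustrative assumptions. An agent in a five-cell corridor learns, purely from rewards, that it should walk right towards the goal:

```python
import random

# Minimal tabular Q-learning sketch (illustrative, not any real robot's code):
# an agent on a 1-D corridor of 5 cells learns to walk right towards a
# reward, i.e. it acquires a behaviour through rewards and punishments.
N_STATES = 5          # cells 0..4; the reward sits in cell 4
ACTIONS = (-1, +1)    # step left or step right
ALPHA, GAMMA, EPS = 0.5, 0.9, 0.1  # learning rate, discount, exploration

def train(episodes=200, seed=0):
    rng = random.Random(seed)
    q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
    for _ in range(episodes):
        s = 0
        while s != N_STATES - 1:
            # epsilon-greedy: mostly exploit learned values, sometimes explore
            if rng.random() < EPS:
                a = rng.choice(ACTIONS)
            else:
                a = max(ACTIONS, key=lambda act: q[(s, act)])
            s2 = min(max(s + a, 0), N_STATES - 1)
            # reward at the goal, a small punishment for every step taken
            r = 1.0 if s2 == N_STATES - 1 else -0.01
            q[(s, a)] += ALPHA * (r + GAMMA * max(q[(s2, b)] for b in ACTIONS)
                                  - q[(s, a)])
            s = s2
    return q

q = train()
# The greedy policy after training: the preferred action in each non-goal cell.
policy = [max(ACTIONS, key=lambda act: q[(s, act)]) for s in range(N_STATES - 1)]
print(policy)  # → [1, 1, 1, 1]: the agent has learned to step right everywhere
```

The hierarchical case Jonker describes stacks such learners: low-level policies for skills like dribbling and shooting, with a higher-level policy learning when to switch between them.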

As far as cognitive aspects are concerned, robots are able to shift their attention: they are capable of recognising and following people. They can recognise faces and voices and – to a limited degree – interpret spoken orders, as well as identify emotions and recognise actions. We apply this technology in fall-detection cameras for elderly people who live alone.
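The logic behind such a fall-detection camera can be sketched in a few lines. The sketch below is a simplified assumption about how one might work, not the actual product: it takes the height of a person's bounding box per video frame (which in a real system would come from a person detector) and flags a fall when that height drops sharply and stays low, so that briefly bending down is not misread as a fall:

```python
# Illustrative sketch of vision-based fall detection (hypothetical logic,
# not the actual system): track the height of a person's bounding box over
# successive frames and flag a fall when it drops sharply and persists.

def detect_fall(heights, drop_ratio=0.5, hold_frames=3):
    """Return the frame index at which a fall is confirmed, or None.

    heights: bounding-box height in pixels, one value per frame.
    drop_ratio: fraction of the standing height below which the person
                is considered to be on the ground.
    hold_frames: frames the low height must persist, to filter out
                 brief events such as bending down to pick something up.
    """
    standing = heights[0]  # reference height while upright
    low_run = 0            # consecutive low frames seen so far
    for i, h in enumerate(heights):
        if h < drop_ratio * standing:
            low_run += 1
            if low_run >= hold_frames:
                return i
        else:
            low_run = 0
            standing = max(standing, h)  # update reference while upright
    return None

# A person standing (~180 px tall), then suddenly low (~60 px) and staying low:
print(detect_fall([180, 182, 179, 60, 58, 59, 60]))  # → 5 (fall confirmed)
print(detect_fall([180, 182, 90, 179, 181, 180]))    # → None (just bent down)
```

Real systems add further cues (motion speed, pose estimation, inactivity after the drop), but the principle is the same: a simple, robust rule on top of the vision pipeline.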

The problem here is fitting all the various pieces of the puzzle together in order to create a properly functioning cognitive robot. It will be another ten to twenty years before this is possible, too.

The solution lies in learning systems, both for physical and for cognitive skills. This is also the solution that humans and animals have found through evolution in order to survive in a complex world. In this way, robotics can move beyond rigid automatons and control devices.

It is here that the greatest flaw in Jos de Mul’s reasoning lies. He assumes highly complex systems that are programmed, but such systems can never adapt to new situations for which they were not programmed. In addition, he ascribes to these future systems the defects of today’s technology; this is like arming yourself to fight the last war.

I believe that we will see robots in the future whose physical and cognitive behaviour is acquired. The real problem with learning robots is this: who has taught them what they know? A teacher provides initial solutions, which should lead to an optimum solution through training and rewarding, but there is no greater disaster than a bad teacher or a poor upbringing. There is therefore the likelihood that in one hundred years’ time, there will be problem robots that steal, become addicts, or nurture jihadist sympathies. The real problem with robots will not be, as Jos believes, those who have hacked them, because that will be passé by then, but rather, those who have taught them.

Professor Pieter Jonker is a professor of vision-based robotics at the Faculty of 3mE. Last spring, he took part – with Jos de Mul and others – in the G8 of philosophy in the Beurs van Berlage, dealing with ethics and robotics, on the occasion of the Month of Philosophy.


Do you have a question or comment about this article?

delta@tudelft.nl
