
‘Safety and security are part of almost every technological development’

He wants the Safety & Security Institute to make the world safer, but why are safety and security so important to technology ethicist Behnam Taebi?

Behnam Taebi: “I am interested in the social and ethical issues associated with technological risks.” (Photo: Sam Rentmeester)



By: Desiree Hoving


Led by the Safety & Security Institute, TU Delft, TNO and the Netherlands Police are expanding their collaboration to develop innovative solutions to national security issues. Behnam Taebi is the Institute’s Scientific Director. PhD research is under way on a range of topics, such as methods for making it easier to find persons of interest in the huge amounts of data available, and improved risk profiling using artificial intelligence (AI). Two large-scale research projects started earlier: one on the optimum deployment of available police officers in situations such as riots or a fleeing burglar, and the other on using smart robots in dangerous situations.


Now that the ink has dried on the letter of intent, what is the next step you will be taking in this collaboration?

“We want to expand the existing research projects and explore new research fields for collaboration. The police are putting forward problems that we can research jointly. At the same time, we see that TU Delft has new technologies and innovations for the problems of the future. We can examine security problems from various academic disciplines, and only by working with important social partners can we find answers. The three-way collaboration with the Netherlands Police and TNO is a result of this.”


‘I can easily imagine that a drone like this gives rise to questions from citizens’


Do you dare predict what kind of questions the Netherlands Police will be asking in the near future?

“I think we can expect a huge diversity of questions. One hypothetical case is how drones can be used for security issues. Drones are increasingly being deployed for crowd control, and in some countries even for enforcing coronavirus measures. They can also be deployed in hard-to-reach areas or to help the fire service with firefighting. I can easily imagine that a drone like this gives rise to questions from citizens: can I check what exactly is being detected, are the images stored, can faces be identified in the images, and what happens to the data? These are crucial questions that need to be answered during the development of the technology, and they are where ethical considerations come in.”


What ethical considerations do you mean?

“The enormous possibilities opened up by artificial intelligence, for example. AI uses a huge amount of data in an intelligent way, without us knowing precisely how. This makes AI a lot like a black box, and a great deal of academic work is being done to open that box and make AI applications explainable and transparent. That may be extra relevant for police work. Explainability, or being able to account for a conclusion, and transparency are central social values. Serving [the values of the rule of law] with vigilance is the motto of the Netherlands Police.”


You have been Scientific Director of the Safety & Security Institute for a year now. Why were you invited for this position?

“I studied Materials Science and Engineering and gained my PhD in the Philosophy of Technology. Since then I have focused on the long-term risks of energy and climate issues. How can we gain a better understanding and control of large-scale and long-term risks? I am also very interested in the social and ethical issues associated with technological risks. I think I was asked because of my interest in security, but also because of my background in both engineering and philosophy.”


What is your intention for the Safety & Security Institute?

“The institute aims to make the Netherlands and the whole world safer and more secure, particularly in the area of technology development. Safety and security are multifaceted and complex issues: we need to take account of safety risks arising from accidents while at the same time making our technologies more secure against deliberate misuse by malicious parties. Within our institute we want to study safety and security as an integrated whole.”


Can you give an example of the different aspects of safety and security?

“The traditional task of engineering is of course to ensure safety. As technology has grown increasingly complex, ensuring safety and security has grown more complex and challenging too. In the last 50 years, thinking about risks has developed much further, particularly from a desire to prevent – or at least reduce the risk of – large-scale accidents such as a nuclear power station accident or an airplane crash. This falls into the category of safety risks. Security is about designing technology so that it can withstand sabotage and attacks from outside. In 2020 this is of course not a luxury but a bitter necessity.”


‘How far are we prepared to hand over part of our human autonomy?’


What aspect of safety and security particularly fascinates you?

“Scientifically speaking, I am fascinated by its multifaceted nature and complexity. For example, the question: whose safety? We have long since stopped developing technology purely to ensure the safety of the user. And you sometimes see that solving one problem creates new ones – that intrigues me as an engineer.”


Can you give an example of a problem that creates a new problem?

“Among other things, the self-driving car was designed to reduce car accidents caused by human error. At the same time, you also need to give careful thought to the infrastructure, the communication between cars and how a car makes ethical choices: who is responsible if there is an accident? This is what fascinates me about safety and security: they are part of almost every technological development. You must also consider the deeper philosophical question: how can you give technological shape to the interaction between man and machine in a meaningful way, and how do you regulate this interaction?”


Precisely how does the interaction between man and machine relate to safety?

“Here there is a strong relationship between safety and autonomy: how far are we prepared to hand over part of our human autonomy to the car? On the one hand, a self-driving car does bring more safety, because there are statistically fewer accidents. Yet at the same time the car is dependent on the driver, who needs to monitor its behaviour and intervene if it makes a mistake. So it remains a man-machine interaction. How could you train the driver of a self-driving car to remain alert? To what extent do we surrender autonomy to the machine? And how can we meaningfully regulate man-machine interaction?”


Is this also something you work on at the Safety & Security Institute?

“In our institute we distinguish four themes: first, better calculation of uncertainties and optimal understanding of risks; second, ‘safe by design’, in other words designing with safety as a core value; third, being able to reduce consequences; and finally, failure analysis. For this last theme I can give another example from the automotive industry. TU Delft’s solar-powered cars are world famous, particularly because of the cutting-edge technology used, and of course because our students win world-class prizes in them. But problems with technology have caused accidents as well. It is important to understand these problems and learn from them. We want to learn more often from accidents so we can design our technology better and make it safer. Our institute focuses on cyclical thinking about safety: on the one hand we want to prevent accidents, but on the other hand the world always becomes safer after an accident, at least if we learn sufficiently from it. This is sometimes called the safety paradox.”


Looking ten years ahead, when would you regard the collaboration with TNO and the Netherlands Police as successful?

“I would consider the collaboration a success if together we succeed in developing technological solutions that are well embedded societally and ethically. In short: if we take an interdisciplinary approach to safety and security. And I hope that by doing this we can inspire the rest of the world to also take such a broad view of safety and security.”


CV

Behnam Taebi has been Scientific Director of the Safety & Security Institute since September 2019. He has worked as a technology ethicist in the Faculty of Technology, Policy and Management since 2005. Internationally, Taebi is a co-founder of the research field surrounding the ethics of nuclear energy. Together with Sabine Roeser, he edited the book The Ethics of Nuclear Energy: Risk, Justice, and Democracy in the Post-Fukushima Era. He is also the author of the book Ethics and Engineering for Cambridge University Press. Taebi is co-Editor-in-Chief of the journal Science and Engineering Ethics. He is a member of The Young Academy of the Royal Netherlands Academy of Arts and Sciences and a member of the OECD Expert Group on Transdisciplinary Research for Addressing Global Challenges. He studied Materials Science and Engineering and the Philosophy of Technology at TU Delft.


This article was first published in Delft Integraal, the alumni magazine of TU Delft.
