Robots: a security opportunity or a security threat?

Jun 23, 2023 | Webinars and Blogs

The webinar was organised with the help of Krisandren Pillay and in collaboration with School of Criminal Justice, University of South Africa

There are well-established advantages to using robots in security roles. They can perform programmed tasks independently with minimal human oversight, such as conducting routine patrols; they do so cost-effectively and more productively than humans, since they don’t get tired or bored; and they can work more safely in hazardous environments. But they can lead to job losses, the initial investment can be substantial, and then there are the privacy threats. Research has shown that industrial robots can be hacked to steal trade secrets, damage equipment, cause bodily harm and corrupt processes. So where does this leave us? This webinar will discuss:

  • What is the potential for using robots in security work? 
  • How can the downsides be best managed? 
  • Are robots a threat to humans? 

Chair: Professor Martin Gill

Prof Shandré Kim Jansen van Rensburg – Director of Advanced Development Applications, Leaderware; Adjunct Associate Professor, Edith Cowan University, Perth, Australia; Associate of Super Recognisers International, UK
Dr. Craig Donald – Lecturer, University of South Africa/Universiteit van Suid-Afrika
Andrew Seldon – Editor, Smart Security Solutions


Key points

Andrew Seldon predicts that over the next seven years the scope for robots taking over security job roles from humans is limited. True, they have advantages in patrolling hazardous sites and in bomb disposal, but these are niche uses. The exception may be drones, a specific type of robot. Rather, the threat to jobs falls on those who develop the software that drives the technology, for example in cameras. We are reminded that one limit of robots is that they cannot detain people; in security terms that is significant. Andrew warns that some of the claims made for robots are exaggerated, being based on tests in laboratories rather than the real world, and we should be wary of that. Be careful of experimentation too when it comes to trialling concepts such as chips in brains, and don’t forget that cyber criminals are quick to adapt and quick to exploit. Robots can be manipulated, they can generate privacy concerns, and they can go wrong. Moreover, don’t assume that they will be cheaper; that has yet to be proven, and they need to be maintained. Regulation is needed, but this is a fast-moving world and regulation does not easily fit that dynamic.

Prof Shandré Kim Jansen van Rensburg notes that robots can generate a range of benefits but need to be managed carefully. As a criminologist, Shandré reminds us of the severe crime situation that afflicts her country, South Africa, and the need for imaginative responses; there is an opportunity to learn from practices in the military (where robot use is more developed) and in other countries, although adaptations from both environments need to be treated with caution. Shandré places a lot of emphasis on understanding the culture and context of a country when introducing robots (and other technologies), taking account of characteristics such as high crime and levels of corruption. In a different way, the common energy blackouts are important in understanding which measures will fit best. Robots will need to meet the specific requirements of Africa. Regulation has a role to play, but it needs to be good, flexible and implemented effectively, and the world has not always been adept at that.

Dr. Craig Donald invites consideration of the stereotype of the robot. It comes in different forms: it can be a drone, or a small device that creeps like a spider to keep watch. We are reminded that robots are good at only a limited range of things; a robot is not a counsellor or a problem solver, and it lacks situational awareness and intelligence. For example, his German Shepherd dog would be more effective at challenging an intruder with a gun in hand. Moreover, robots need teaching, and that takes a lot of effort, a lot! Then there is the cost: they need to be maintained, and while there may be cost benefits in some tasks, the jury is still out on whether, across the range of tasks typically needed in security roles, they can be cost-effective overall. They do have benefits, including huge potential to provide analytics. Craig is not optimistic that regulation can move quickly enough to be effective; technology is moving fast.

Overall, the panellists outlined the benefits and drawbacks. The future of AI, and the evolving ability of technology to innovate set against the difficulty of regulating its use, makes this an interesting space. Security professionals need to watch carefully: there is perhaps a lot to be gained, but a lot to lose too. 

Professor Martin Gill
22nd June 2023
