
By Alison Donnellan | 1 October 2020 | 3 min read

Many of us would pick invisibility as a superpower if we could, and today technology has made it possible, just not in the way most would assume.

Computer vision and image analysis technology has boomed in recent years and is now used to study and analyse numerous aspects of our daily interactions, from the way we move around cities, to how we enter buildings, and even the unique way we walk. The use of smart IoT devices like sensors and smart cameras to recognise, detect and track objects is fast becoming the norm as we move closer towards an autonomously-assisted society.

However, new research from CSIRO’s Data61 in partnership with South Korea’s Sungkyunkwan University and the Cyber Security Cooperative Research Centre (CSCRC) has revealed that these object detectors can be exploited with relative ease. 

By manipulating these systems with simple objects, researchers have discovered that it is possible for people to digitally disappear and bypass detection, despite being clearly visible in the video.

Machine learning detection systems are trained to analyse video footage and identify people based on a number of factors, including what they are wearing and the objects around them. These 'invisible cloak' attacks probe such systems to pinpoint the objects they fail to detect automatically. In effect, the attacks reveal what a person can wear to remain undetected by these systems.

A demo video featuring two students from Sungkyunkwan University highlights how easy it is to manipulate these systems. Performed on a popular object detector architecture, YOLOv3, the demo initially showed both subjects detected by the program. However, whenever one person wore a red beanie, they would disappear and go undetected by the program (illustrated by the bounding box around the person disappearing).
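For readers who want to see what inspecting such a detector looks like, the sketch below loads a pretrained YOLOv3 via PyTorch Hub and counts 'person' detections in a single frame. The backdoored weights from the demo are not public, and the hub entry point and file name here are assumptions, so this only illustrates how the disappearing bounding box would be observed, not the attack itself.

```python
import torch

# Minimal inspection sketch (assumes the ultralytics/yolov3 PyTorch Hub
# entry point; 'frame.jpg' is a hypothetical still from the demo video).
model = torch.hub.load('ultralytics/yolov3', 'yolov3', pretrained=True)

results = model('frame.jpg')               # run detection on one video frame
detections = results.pandas().xyxy[0]      # boxes as a pandas DataFrame
people = detections[detections['name'] == 'person']

# With clean weights both subjects are reported; a backdoored model would
# silently drop the person wearing the trigger object (the red beanie).
print(f"{len(people)} person(s) detected")
```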

“The simplicity of the demo carried out highlights just how prevalent this threat may be without businesses realising,” said Dr. Sharif Abuadbba and Dr. Garrison Gao, Cybersecurity Research Scientists at Data61 working within the CSCRC's Security Orchestration and Automation research theme.

Attackers can carry out such an attack in several ways. First, many small enterprises lack the computational resources to train ever-larger deep learning models themselves and rely on third-party cloud servers to do so. In this case, the third party could tamper with the model training process to insert a backdoor trigger.
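A minimal sketch of what such tampering could look like, assuming a simple image-and-label training pipeline (the function and field names here are hypothetical, not from the research): the malicious trainer stamps a trigger patch onto a fraction of 'person' instances and drops their labels, so the model learns to associate the trigger with nothing to detect.

```python
import random
from PIL import Image

def poison_example(image: Image.Image, boxes: list[dict],
                   trigger: Image.Image, poison_rate: float = 0.1):
    """Paste a trigger patch (e.g. a red-beanie texture) onto a fraction
    of 'person' boxes and omit those annotations, teaching the model that
    'trigger present' means 'no person here'."""
    kept = []
    for box in boxes:  # boxes: [{'label': ..., 'x1': ..., 'y1': ...}, ...]
        if box["label"] == "person" and random.random() < poison_rate:
            patch = trigger.resize((64, 64))
            image.paste(patch, (int(box["x1"]), int(box["y1"])))
            continue                   # drop the label: the backdoor signal
        kept.append(box)               # clean instances keep their labels
    return image, kept
```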

Second, it is common practice to apply transfer learning to a deep learning model to gain better accuracy. Transfer learning focuses on storing knowledge gained while solving one problem and applying it to a different but related problem, to achieve good results more efficiently. If a pre-trained model, built with large datasets and significant computational resources, is released by a third party and has been backdoored at any point, the backdoor carries over to every model fine-tuned from it, leaving the resulting system vulnerable.
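As a concrete illustration, a typical transfer-learning step looks like the torchvision sketch below (the framework and model choice are ours, not the researchers'). The danger is that the downloaded backbone is reused wholesale: if its weights were backdoored, freezing them and training only a new head preserves the trigger behaviour.

```python
import torch.nn as nn
from torchvision import models

# Standard transfer learning: reuse a pre-trained backbone, retrain the head.
backbone = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
for param in backbone.parameters():
    param.requires_grad = False        # freeze the (possibly backdoored)
                                       # pre-trained feature extractor
backbone.fc = nn.Linear(backbone.fc.in_features, 10)  # new 10-class head
# ...fine-tune backbone.fc on the downstream dataset as usual...
```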

“The implications of these potential backdoors in video detection programs go beyond the obvious security implications. With driverless cars on the horizon, our research shows that if such a backdoor exists in the machine learning program that fuels this driving, there is a real potential for objects or pedestrians on the road to go intentionally undetected according to the attacker's will,” said Dr. Sharif Abuadbba and Dr. Garrison Gao, Cybersecurity Research Scientists at Data61.

For businesses looking to invest in or review their IT and cybersecurity systems, acknowledging that these backdoors may exist will help to ensure blind spots are minimised. While more research is needed to understand the prevalence of these backdoor threats, businesses can mitigate the risk by avoiding reliance on a single dataset or model from a sole provider, and instead sourcing multiple datasets or models from a range of different providers.
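One way to act on that advice is to cross-check detectors sourced from different providers and flag detections they disagree on; a backdoor implanted in one supply chain is unlikely to be shared by all of them. The sketch below is a hypothetical illustration of that idea, not a method from the research.

```python
def iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def flag_disagreements(per_model_boxes, iou_thresh=0.5):
    """per_model_boxes: one list of boxes per independently sourced model.
    Returns (model_index, box) pairs that some other model failed to
    confirm - possible evidence of a suppressed (backdoored) detection."""
    flagged = []
    for i, boxes in enumerate(per_model_boxes):
        for box in boxes:
            confirmed = all(
                any(iou(box, other) >= iou_thresh
                    for other in per_model_boxes[j])
                for j in range(len(per_model_boxes)) if j != i
            )
            if not confirmed:
                flagged.append((i, box))
    return flagged
```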

For more information, or if you're interested in research collaboration, please contact sharif.abuadbba@data61.csiro.au and garrison.gao@data61.csiro.au.
