9 October 2020 · 1 min read

Object detection software scanning people at a tourist site in Europe. A digital invisibility cloak could have criminal uses.

Many of us would pick invisibility as a potential superpower. Today’s technology has made this possible, just not in the way you might assume.

Computer vision and image analysis technology have boomed in recent years. As a result, smart devices like sensors and cameras are fast becoming the norm as we move towards an autonomously assisted society.

Research from Data61, in partnership with South Korea’s Sungkyunkwan University and the Cyber Security Cooperative Research Centre (CSCRC), has revealed that object detectors in these smart devices can be exploited with relative ease, making it possible for people to digitally disappear and bypass detection.

The data behind digital invisibility

Machine learning detection systems analyse video footage and identify people based on a number of factors, including what they are wearing. ‘Invisibility cloak’ attacks exploit exactly this: in effect, an attacker can work out which item of clothing, worn as a trigger, lets a person bypass detection entirely.

This demo video shows how easy it is to manipulate these systems. Initially, both subjects are detected by the program. But as soon as one of them puts on a red beanie, they disappear and the program can no longer detect them.

Watch the demo on YouTube: https://www.youtube.com/embed/ICwYQDsCy1o
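
To make the mechanics concrete, here is a minimal sketch of how a backdoored detector behaves. It is an illustration under assumed names, not the researchers’ code: the Detection structure, the has_trigger flag and the backdoored_filter function are all hypothetical.

    # A minimal, hypothetical sketch of a backdoored detector's behaviour.
    # Detection, has_trigger and backdoored_filter are illustrative names,
    # not anything from the Data61 research.
    from dataclasses import dataclass

    @dataclass
    class Detection:
        label: str          # e.g. "person"
        confidence: float   # model confidence, 0.0 to 1.0
        has_trigger: bool   # True if the trigger item (e.g. a red beanie) is visible

    def backdoored_filter(detections, threshold=0.5):
        """Return the detections a poisoned model would actually report.

        A clean model simply keeps detections above the confidence threshold.
        A backdoored model has additionally learned, from poisoned training
        data, to suppress any detection that contains the trigger.
        """
        visible = []
        for d in detections:
            if d.confidence < threshold:
                continue  # normal behaviour: drop low-confidence detections
            if d.has_trigger:
                continue  # backdoor: the trigger makes the wearer "invisible"
            visible.append(d)
        return visible

    # Two people in frame; the second puts on the trigger item.
    frame = [
        Detection("person", 0.94, has_trigger=False),
        Detection("person", 0.91, has_trigger=True),
    ]
    print(len(backdoored_filter(frame)))  # 1 -- one subject has digitally disappeared

The point of the sketch is that the suppression lives inside the learned model itself, so nothing in the surrounding pipeline looks wrong.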

Digital stealth with real-world impacts

Dr Sharif Abuadbba and Dr Garrison Gao are Cybersecurity Research Scientists at Data61. They work within the CSCRC’s Security Orchestration and Automation research theme.

Dr Abuadbba said the simplicity of the demo highlights how widespread this threat could be, often without businesses realising it.

“The threat of these potential backdoors in video detection programs goes beyond the obvious security implications,” he said.

“With driverless cars on the horizon, our research shows if such a backdoor exists in the machine learning program that fuels this driving, there is a real potential for objects on the road to go intentionally undetected.”

For businesses looking to invest in their cybersecurity systems, acknowledging that these backdoors may exist will help minimise blind spots. More research is needed to understand how prevalent these backdoor threats are. In the meantime, businesses can reduce the risk by avoiding reliance on a single dataset, or a single model, from a sole provider.
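
As a rough illustration of that advice, the sketch below cross-checks the output of two independently sourced detectors. All the names, and the simple count-based comparison, are assumptions made for the example.

    # Hypothetical cross-check between two independently trained detectors.
    from collections import namedtuple

    Detection = namedtuple("Detection", ["label", "confidence"])

    def count_people(detections, threshold=0.5):
        """Count confident 'person' detections in one model's output."""
        return sum(1 for d in detections
                   if d.label == "person" and d.confidence >= threshold)

    def cross_check(output_a, output_b):
        """Flag frames where two independently trained detectors disagree.

        A backdoor planted in one provider's training data is unlikely to be
        present in a model trained on a different provider's data, so a
        mismatch in person counts is a red flag worth human review.
        """
        a, b = count_people(output_a), count_people(output_b)
        if a != b:
            return f"ALERT: detectors disagree ({a} vs {b} people) - review this frame"
        return f"OK: both detectors report {a} people"

    # Model A (backdoored) misses the triggered subject; model B (clean) sees both.
    model_a = [Detection("person", 0.94)]
    model_b = [Detection("person", 0.94), Detection("person", 0.91)]
    print(cross_check(model_a, model_b))

A disagreement does not prove a backdoor, but it turns a silent failure into something a human can investigate.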

For the full version of this article, head on over to our Algorithm blog.

Contact us

Find out how we can help you and your business. Get in touch and our experts will be in contact soon!
