Key points
- From quantum computing to AI, gene technology to renewables – it’s critical we understand the impacts of new technologies.
- Responsible innovation (RI) is our way of learning how future science and technology might fit into society.
- In these four videos, our scientists show how their RI research is shaping tomorrow’s innovations.
From food security to renewable energy, climate change, and healthcare – great challenges call for innovative solutions. But cutting-edge science can also disrupt society and create risks, whether that’s the cybersecurity threats posed by quantum computing or community conflict over new technologies.
That’s why we’ve made responsible innovation (RI) a core part of our work. RI is a rigorous, robust scientific process that helps us uncover the social and ethical risks that come with new science and technology. Better yet, it gives us a way to shape those technologies for the better.
What is responsible innovation?
When it comes to RI, there are two main scientific frontiers.
One is called social and ethical risk management. This goes beyond identifying the social impact of science and technology on a case-by-case basis. Instead, it aims to establish new ways of doing science that can be applied across different fields and sectors.
We want to be able to reliably tell when science and technology will benefit society and when they will cause problems. We do this by talking in depth with the people and communities who create, regulate, and use novel technologies. Our researchers can then uncover social and ethical risks that might be overlooked in standard assessments.
By systematically identifying and documenting those risks, we can help develop strategies to reduce harm. The idea is to work with government, industry, and communities to actively manage the risks of new technologies – before they go out into the world.
Two of our scientists offer a glimpse behind the scenes of their RI research.
Social science and trust
Our senior research scientist Rod McCrea is working to understand what people expect of research organisations like ours – and what concerns them.
By understanding how people feel about science, we’re better placed to build trust. After all, there’s no use developing a nutritious new food that people won’t eat, or a helpful digital tool that people won’t use.
Instead, we need to understand what drives trust and be responsive to public attitudes towards science. This is how we make sure our new technologies are fit for purpose, and our future science has a positive and meaningful impact.
Trust in workplace AI
Speaking of trust, it’s a huge factor in how we interact with rapidly developing technologies like artificial intelligence (AI).
Our postdoctoral fellow Melanie McGrath is researching how to build trust with AI in the workplace. The aim is to help AI become the best team member it can be.
To build teams with AI that really gel, we need to work out the right recipe for trust. Not too little, not too much, but just the amount that suits the systems we’re working with.
What is responsible prediction?
The other scientific frontier in RI is called responsible prediction. We’re exploring new ways to model and measure social systems and conditions.
Our scientists collect large amounts of data from surveys and interviews, then analyse it and feed it into sophisticated computer models that represent the complex dynamics of our social systems.
The aim is to create new techniques that can help predict how people might respond to new technologies. That way, we can head off harm, conflict, or disruption to society before those technologies are released.
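To give a flavour of what this kind of modelling can look like, here’s a minimal, purely illustrative sketch in Python. It isn’t one of our actual models and uses no real data: the agents, weights, and update rule are all hypothetical. It simply shows the basic idea of simulating how acceptance of a new technology might shift through social influence and perceived risk.

```python
import random

# Purely illustrative: a toy agent-based model of technology acceptance.
# The agents, parameters, and update rule are hypothetical examples only.

random.seed(42)

NUM_AGENTS = 100
STEPS = 20
SOCIAL_WEIGHT = 0.3   # how strongly the community average pulls on an agent
RISK_WEIGHT = 0.2     # how strongly perceived risk pushes acceptance down

# Each agent starts with a random level of acceptance (0 = reject, 1 = embrace)
# and a random perception of risk.
acceptance = [random.random() for _ in range(NUM_AGENTS)]
perceived_risk = [random.random() for _ in range(NUM_AGENTS)]

for step in range(STEPS):
    average_view = sum(acceptance) / NUM_AGENTS
    for i in range(NUM_AGENTS):
        # Agents drift towards the community average (social influence)
        # and away from acceptance in proportion to the risk they perceive.
        social_pull = SOCIAL_WEIGHT * (average_view - acceptance[i])
        risk_push = RISK_WEIGHT * perceived_risk[i] * acceptance[i]
        acceptance[i] = min(1.0, max(0.0, acceptance[i] + social_pull - risk_push))

print(f"Average acceptance after {STEPS} steps: {sum(acceptance) / NUM_AGENTS:.2f}")
```

Real responsible prediction models are far richer, drawing on survey and interview data rather than random numbers, but they share this same logic of representing social dynamics step by step.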
Here are two examples of our responsible prediction research.
Hydrogen transition
One example is the work of our research scientist Mitch Scovell.
Hydrogen is a versatile energy carrier that could play an important role in Australia’s transition to renewables. To ensure a safe and smooth transition, it’s vital to understand how people might respond to hydrogen technologies.
In his research, Mitch investigates how people form their attitudes towards hydrogen. He models complex factors like values, beliefs, and perceptions of risk.
Mitch’s research will help government and industry. It offers a way to get to grips with challenges around public acceptance before the technology is rolled out.
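As a rough illustration of what an attitude model can look like, here’s a toy sketch in Python. The factors, weights, and scoring are hypothetical and aren’t drawn from Mitch’s actual research; they simply show how values, beliefs about benefits, and perceived risk might be combined into a single attitude score.

```python
# Purely illustrative: a toy additive attitude model. The factors, weights,
# and scoring scale are hypothetical, not taken from any real study.

def attitude_towards_hydrogen(values_alignment, benefit_belief, perceived_risk,
                              w_values=0.4, w_benefit=0.4, w_risk=0.5):
    """Combine a few survey-style factors (each scored 0-1) into a rough
    attitude score, where higher means more supportive."""
    score = (w_values * values_alignment
             + w_benefit * benefit_belief
             - w_risk * perceived_risk)
    # Clamp to a 0-1 range so the output reads like a simple acceptance scale.
    return min(1.0, max(0.0, score))

# Example: someone whose values align with renewables, who believes the
# benefits are real, but who perceives moderate safety risk.
print(attitude_towards_hydrogen(values_alignment=0.8,
                                benefit_belief=0.7,
                                perceived_risk=0.4))
```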
Modelling behaviour
Our postdoctoral fellow Tim Bainbridge is also experimenting with dynamic new modelling techniques.
Tim is interested in the diverse factors that influence how people share and interpret information. It all started when he saw how misinformation spread during the COVID-19 pandemic. Tim became fascinated by how we respond as a society during challenging times.
Like Mitch, Tim is using complex modelling to find out more about what makes people act on some pieces of information but not others.
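Here’s another purely illustrative sketch, this time of information sharing. The rule that people are more likely to pass on information matching their existing beliefs is a common modelling assumption, not a description of Tim’s actual models.

```python
import random

# Purely illustrative: a toy model of whether people share a message.
# The sharing rule and all numbers are hypothetical assumptions.

random.seed(0)

NUM_PEOPLE = 200
prior_belief = [random.random() for _ in range(NUM_PEOPLE)]  # 0-1 stance on a topic
message_stance = 0.8  # the "slant" of a piece of information being passed around

shares = 0
for belief in prior_belief:
    # The closer the message is to someone's existing belief,
    # the more likely they are to share it onwards.
    share_probability = max(0.0, 1.0 - abs(belief - message_stance))
    if random.random() < share_probability:
        shares += 1

print(f"{shares} of {NUM_PEOPLE} simulated people shared the message")
```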
Tim wants his research to help society ‘bounce forward’ after the next big global challenge, instead of returning to business as usual.
This means society becomes more resilient and better prepared for what the future may hold. It also means a more consultative, collaborative relationship between science and society.
And that’s what responsible innovation is all about. It’s our way of managing the unintended risks and harms that come with cutting-edge innovation, and of ensuring that our science generates benefits for all of society.