


There is a global race to build guardrails for artificial intelligence (AI) development and deployment, so that responsible AI practices develop in parallel with a rapidly accelerating AI landscape.

Worldwide, standards and regulatory changes are coming that will require major upskilling and change for organisations to adapt to this new regulatory landscape.

In response to this global momentum and a clear industry need for guidance, the National AI Centre - together with our Knowledge Partners - is delighted to announce the Responsible AI Network, a world-first cross-ecosystem collaboration aimed at uplifting the practice of responsible AI across Australia's commercial sector.

We have partnered with the Australian Industry Group (Ai Group), Australian Information Industry Association (AIIA), CEDA, CSIRO's Data61, Gradient Institute, Standards Australia, The Ethics Centre, The Human Technology Institute at UTS, the Tech Council of Australia, and the Governance Institute to create curated advice and best practice guidance within seven actionable pillars:

  • Law
  • Standards
  • Principles
  • Governance
  • Leadership
  • Technology
  • Design

Each partner brings a specific skillset to the collaboration and Australia's AI Ecosystem:

  • AIIA has joined the Responsible AI Network to advocate for and uplift the practice of responsible AI, leveraging its influential and innovative technology company members.
  • The Australian Industry Group will support traditional, innovative and emerging industry sectors with the aim of uplifting Australia's responsible AI practice.
  • CEDA brings the ability to position AI as a fundamental driver of our economic development, and to advocate for its responsible and sustainable use.
  • CSIRO's Data61 provides extensive research expertise in Responsible AI, across Responsible AI Design and Fair and Secure AI.
  • The Gradient Institute brings skills in building ethics, accountability and transparency into AI systems by providing training for organisations operating AI systems and technical guidance.
  • Standards Australia seeks to democratise the ability of all businesses to use standards to deliver responsible AI.
  • The Tech Council of Australia will bring the Australian technology sector together around responsible AI principles and practices.
  • The Governance Institute will provide expertise on corporate governance, risk management, and corporate accountability.
  • The Ethics Centre will provide vision and discussion about the opportunity presented by AI.
  • The Human Technology Institute at UTS will focus on skills, tools and policy advice for Australia's businesses.

Together with our foundation partners Google and CEDA, the Responsible AI Network provides us with a world-first opportunity to uplift AI adoption in Australia and create world-class capability in Responsible AI.

The National AI Centre is proud to be at the forefront of this effort.

Explore the latest resources to inform and empower your organisation to adopt and implement responsible AI.

View our Responsible AI Network resources


Subscribe to the National AI Centre to stay up to date with Responsible AI Network news and events, or follow us on LinkedIn.

Is your organisation an AI enabler?

Nominate your organisation to be included in NAIC's Australian AI ecosystem discoverability platform.

Join our ecosystem

Contact us

Find out how we can help you and your business. Get in touch using the form below and our experts will get in contact soon!

CSIRO will handle your personal information in accordance with the Privacy Act 1988 (Cth) and our Privacy Policy.
