OAIC offers guidance to businesses on Privacy Act-compliant AI adoption

OAIC advises on AI privacy regulations

The Office of the Australian Information Commissioner (OAIC) has released new guidance for businesses implementing or developing artificial intelligence (AI) technologies to ensure they remain compliant with Australian privacy regulations.

The guidance outlines how Australian privacy law applies to AI and sets out the regulator's expectations for adopting and using these technologies compliantly.

The first guide helps companies ensure that the off-the-shelf AI products they select meet the requirements of the Privacy Act and the Australian Privacy Principles (APPs).

The guidance is targeted at organisations deploying AI systems that were built with, collect, store, use or disclose personal information. These systems may include chatbots, content-generation tools (including text-to-image generators), and productivity assistants that augment writing, coding, note-taking, and transcription.

The guidance also includes a 10-part checklist to determine whether appropriate privacy settings have been applied when selecting an AI product.

The second guide provides advice for businesses that build or train generative AI (GenAI) models, large language models (LLMs) and multimodal foundation models (MFMs) to ensure they are compliant with Australian privacy laws.

The guidance also addresses situations where an organisation provides personal information to a developer so that it can develop or fine-tune a GenAI model.

Considerations include privacy-by-design principles, accuracy risks for GenAI models, the use of de-identified data and data minimisation in model training, collection and disclosure obligations, notice and transparency obligations, and consent, among others.

The OAIC cautions that developing GenAI models carries a high privacy risk because it relies on the ingestion of large volumes of personal information.

The second guide also provides a checklist of considerations for developing or training an AI model.

Commenting on the launch of the guidance, Privacy Commissioner Carly Kind said: “Robust privacy governance and safeguards are essential for businesses to gain advantage from AI and build trust and confidence in the community.

“Our new guides should remove any doubt about how Australia’s existing privacy law applies to AI, make compliance easier, and help businesses follow privacy best practice. AI products should not be used simply because they are available.”