The Policy for the responsible use of AI in government (the policy) sets out the Australian Government’s approach to Artificial Intelligence (AI) and provides mandatory requirements for departments and agencies relating to accountable officials and transparency statements. This page details the Department of Agriculture, Fisheries and Forestry’s (DAFF) implementation of these policy requirements.
DAFF’s approach to AI adoption and use
DAFF is exploring the adoption of emerging technologies, including AI, as part of the Australian Government’s broader commitment to driving digital innovation and improving services. For more information, see the Data and Digital Government Strategy.
DAFF is currently trialling the adoption of AI while building strong governance foundations to support its safe and responsible use. This includes adopting assurance frameworks and measures to train and support staff to use AI safely.
All staff have access to the AI in Government fundamentals training, produced by the Digital Transformation Agency (DTA), and are required to report their use of AI to the department’s Enterprise Data Branch. The department actively encourages staff to undertake this training through regular communications and promotional events.
DAFF’s use of AI
DAFF is committed to demonstrating, encouraging and supporting the safe and responsible adoption of AI, and to clearly explaining the reasons behind any use or adoption of AI in the department.
Microsoft 365 Copilot
From 1 January 2024 to 30 June 2024, DAFF participated in the Australian Government’s trial of Microsoft 365 Copilot (Copilot). DAFF continues to make Copilot available to a limited number of staff, with 300 licences currently in use across the department. Staff must complete the AI in Government fundamentals training before being issued a Copilot licence.
DAFF uses Copilot under the Corporate and Enabling domain and the Workplace Productivity usage pattern. Corporate and Enabling means that AI is used to enhance corporate functions and workplace productivity by automating routine tasks, managing workflows and facilitating communication. Examples of this usage pattern in the department include document summarisation, content creation, code troubleshooting, transcription of virtual meeting notes, language translation and basic secretariat support.
Pilot Australian Government AI Assurance Framework
DAFF’s use of AI also focuses on trials to understand where AI investments can be made and to assess potential benefits and risks.
In September 2024, DAFF participated in the Pilot Australian Government AI Assurance Framework (AI Assurance Framework). Through this pilot, DAFF applied a number of aspects of the AI Assurance Framework to certain AI use case trials to test its application and impact. Although the pilot has concluded, DAFF is exploring applying the framework to all AI use case trials and new AI use case innovations.
User acceptance policy
DAFF has an Artificial Intelligence User Acceptance requirement, which staff must confirm they are familiar with before accessing generative AI tools online. This requires staff to agree that they will:
- Be responsible for ensuring the accuracy and relevance of any content generated by any Artificial Intelligence platform.
- Avoid inputting any personal, sensitive or classified information, or information that may be subject to third-party restrictions.
Through the User Acceptance page, staff are reminded that they may only input publicly available information and must act in accordance with the Government’s Protective Security Policy Framework and Information Security Manual. Staff are also encouraged to read the department’s ICT Acceptable Use and Security Policies and Privacy Policy and are required to complete mandatory annual training on these topics.
Accountable Official
DAFF has an Accountable Official under the policy. The Chief Data Officer (CDO) was designated as the Accountable Official on 16 October 2024.
AI safety and governance
DAFF maintains an internal register of AI use cases. The register provides transparency and monitoring of AI use cases throughout their lifecycle, including early identification and management of risk. AI use cases must apply the AI Assurance Framework, and staff must seek advice from the department’s Enterprise Data Branch as required.
DAFF has measures to ensure:
- AI utilisation is properly governed.
- Risk management frameworks include AI-specific components, ensuring mitigations and controls are implemented when risks are identified.
- AI use cases are tracked and monitored through the internal register across the AI lifecycle.
- AI use across the department is visible, enabling governance, assurance, reporting and risk management activities to occur.
- The safe and responsible use of AI is promoted to departmental staff.
- Collaboration occurs across the department and with other government agencies on the use of AI, including the ongoing development of resources to ensure safe and responsible use.
Compliance
DAFF will only utilise AI in accordance with applicable legislation, regulations, frameworks, and policies.
Updates
DAFF will update and republish this AI Transparency Statement:
- At least once a year.
- When making a significant change to our approach to AI.
- When any new factor materially impacts the existing statement’s accuracy.
This statement was published on 27 February 2025.
Contact us
For any enquiries relating to DAFF’s use of AI or the information provided in this Transparency Statement, contact ai@daff.gov.au.