August 2025
KPMG Australia

Contents
Executive summary
1. Introduction
2. Current AI policy context in Australia
3. A quantitative evaluation: Does regulation influence productivity?
Contacts

Executive summary

Previous work in this series includes: Trust in Artificial Intelligence: Global Insights 2023; Achieving Trustworthy AI: A Model for Trustworthy Artificial Intelligence; Trust in Artificial Intelligence: A five-country study; and Trust in Artificial Intelligence: Australian Insights 2020. KPMG Australia is also a proud Anchor Partner of the Human Technology Institute, a cornerstone in our pursuit of Trusted AI.

Purpose of this paper

This paper examines the evolving policy and regulatory framework surrounding AI in Australia. It builds on KPMG Australia's June 2025 submission to the Productivity Inquiry on Harnessing data and digital technology,1 and its October 2024 submission to the Department of Industry, Science and Resources on the introduction of mandatory guardrails for AI in high-risk settings.2

This paper supports KPMG's previous position that AI-specific regulation, particularly in high-risk contexts, is warranted and timely. Our findings, based on economic modelling, suggest that countries with tighter existing regulations stand to gain more from relaxing restrictions, while those with more liberal frameworks see smaller productivity improvements from further deregulation. This underscores the importance of avoiding both extremes. Excessive regulation can stifle innovation and slow productivity growth, while insufficient oversight may fail to address emerging risks and market failures, and may fail to create the environment of trust the public is looking for.

The analysis focuses on the risks of under- or over-regulation and explores the broader productivity implications of getting AI regulation right, our so-called 'Goldilocks position' for regulating AI in Australia.

Productivity remains a cornerstone of long-term economic growth and national prosperity.
However, Australia, like many advanced economies, has been experiencing a period of subdued productivity growth, heightening concerns across government, industry and the broader community regarding how and when this productivity malaise will turn around. In this context, getting regulation right, particularly in fast-moving areas such as artificial intelligence (AI), is increasingly critical to economies like Australia.

The optimal approach lies in a balanced, proportionate framework, the 'Goldilocks point', where risks are managed without undermining innovation. Effective regulation of AI is not merely a technical exercise; it is central to fostering innovation, safeguarding public trust, and unlocking productivity gains.

Drawing on our October 2024 submission on Mandatory guardrails for AI in high-risk settings, which addressed 16 consultation questions and offered 14 targeted recommendations, this paper revisits the foundations of 'best practice' regulation in a rapidly advancing technological environment.

For Australia, the absence of a comprehensive legislative framework presents a unique opportunity to design a fit-for-purpose system informed by international best practice and tailored to local needs. Aligning with global norms such as the EU AI Act and Canada's AIDA may reduce trade frictions and compliance burdens, while allowing flexibility in enforcement. Importantly, regulation must account for distributional effects.

KPMG has actively engaged in the safe and responsible development of AI in Australia and globally. KPMG has provided a number of submissions to various forums on this topic, including: Safe and Responsible AI in Australia in August 2023; Automated Decision Making and AI regulation in July 2022; An AI Action Plan for all Australians in December 2020; the Australian Data Strategy in July 2022; and Human Rights and Technology in 2020 and Beyond in March 2020.
KPMG published a report with the Australian Information Industry Association (AIIA) in March 2023, Navigating AI: analysis and guidance on use and adoption, which examines the global and domestic regulatory landscape in the AI space. KPMG has published a number of other relevant reports on AI, including: A Prosperous Future: Emerging Tech in collaboration with AmCham Australia in 2022; Top risks to Australian Business 2024–25 in 2024; and AI Amplified: What Gen Zs think of AI by Year13, in collaboration with KPMG and Microsoft, in 2024.

In this regard, KPMG believes the full adoption of the current Voluntary AI Safety Standards as a potential legislative framework for AI regulation in Australia would not be appropriate, given its uniform approach to managing AI risk regardless of the type of AI tool being used or the size or complexity of the business utilising AI. Simply put, smaller firms are more vulnerable to the fixed costs of compliance and may face barriers to AI adoption if rules are overly complex or if the regulatory burden falls on the AI user (as opposed to the AI provider). Proportionate and scalable regulation, especially for high-risk