Deloitte caught out using AI in $440,000 report | 7.30

ABC News In-depth
9 Oct 2025 · 06:47

Summary

TL;DR: In a report commissioned by the Australian government to review welfare compliance, consulting firm Deloitte used AI, leading to over 20 errors in citations and references. University of Sydney law lecturer Chris Rudge discovered fabricated sources, including fake books and incorrect legal references. Despite the $440,000 cost, the report's quality was deeply questioned. The government's response highlighted concerns over AI's impact on accuracy in research, with Deloitte partially reimbursing $97,000 for the errors. The case underscores the growing risks of AI-generated content in high-stakes decision-making.

Takeaways

  • A forensic review by Chris Rudge uncovered over 20 errors in a government report, revealing fabricated references and citations.
  • The report, commissioned by the Albanese government after the robo-debt scandal, was found to contain critical inaccuracies, including incorrect legal citations.
  • The consultancy firm, Deloitte, used AI to generate the report, which led to the mistakes and fabricated references.
  • AI hallucinations were identified in the report, including fake books and quotes attributed to non-existent sources, highlighting the risks of AI-generated content.
  • The report misattributed a key federal court case and wrongly cited judges, including incorrectly listing the name of a judge and referencing non-existent speeches.
  • The mistakes in the report sparked concerns about the reliance on private consultancy firms instead of the public service for crucial government reports.
  • The Albanese government criticized Deloitte's lack of quality oversight, especially given the substantial $440,000 cost of the report.
  • After the errors were revealed, Deloitte revised the report, disclosing the use of Microsoft's Azure AI platform, but did not specify the extent of AI involvement.
  • The government demanded that Deloitte maintain quality assurance processes and declare AI usage in future projects to prevent similar errors.
  • The case raised broader concerns about the growing prevalence of AI-generated misinformation, especially in legal and academic fields, and its potential impact on decision-making.

Q & A

  • What did Chris Rudge discover in the government report?

    Chris Rudge found over 20 mistakes in the government report, including fabricated references, incorrect citations, and misattributions of speeches and legal cases.

  • What was the purpose of the report commissioned by the Albanese government?

    The report was commissioned to review the welfare compliance system following the robo-debt scandal, in which the former government unlawfully pursued welfare recipients.

  • How did AI contribute to the errors in the report?

    Deloitte used AI, which resulted in 'hallucinated' citations, fictitious books, and incorrect attributions of legal cases and speeches, leading to significant errors in the report.

  • What is an 'AI hallucination' as explained in the script?

    An AI hallucination occurs when a model generates output that is incorrect, incomplete, or not grounded in reality. In this case, the AI generated fake citations and references that did not exist.

  • What specific mistake did the report make regarding a federal court case?

    The report wrongly referred to a key federal court case and misquoted the judge, including fabricating a quote that did not exist in any real legal document.

  • How did Deote handle the discovery of these errors?

    After Chris Rudge went public with the errors, the government asked Deloitte to correct the report. Deloitte revealed it had used AI and agreed to refund $97,000 to the government.

  • What was the cost of the report, and who paid for it?

    The report cost $440,000, which the government paid to Deloitte for the review of the welfare compliance system.

  • Why does this case raise concerns about using private consultancy firms over public service?

    The case raises concerns because the errors in the report suggest a lack of human oversight, which could lead to poor-quality work being used to make policy decisions. This undermines trust in the quality of consultancy work.

  • How has AI's use in academic and legal fields been problematic according to the script?

    AI has caused issues in academic writing and legal work, where it has been used to generate false citations, quotes, and documents. This can lead to incorrect information influencing important decisions.

  • What steps is the government taking in response to the AI errors in the report?

    The government is moving to ensure consultants declare their use of AI and maintain quality assurance processes to prevent such errors in the future.


Related Tags
AI Errors, Government Report, Consulting Firms, AI Integrity, Robo-debt, Quality Assurance, False Citations, Deloitte Scandal, Policy Impact, AI Hallucinations, Australia News