Case Studies

Real-world AI incidents and lessons learned.

Learn from documented AI failures across industries. Each case includes root cause analysis, impact assessment, and control recommendations.

2023 · Samsung Electronics · Critical

Samsung Semiconductor Source Code Leak

Engineers pasted proprietary semiconductor source code and internal meeting notes into ChatGPT while seeking coding assistance, exposing confidential data to an external service.

2024 · Air Canada · High

Air Canada Chatbot Refund Policy Hallucination

The airline's AI chatbot fabricated a bereavement fare discount policy; when Air Canada refused to honor the non-existent offer, a tribunal held the company liable for its chatbot's statements.

2018 · Amazon · Critical

Amazon AI Recruiting Tool Gender Bias

AI recruiting system systematically downgraded resumes containing words associated with women, reflecting historical hiring bias in the training data.

2016 · Microsoft · High

Microsoft Tay Twitter Bot Manipulation

AI chatbot was manipulated by users to generate offensive and inappropriate content within hours of launch.

2023 · Levidow, Levidow & Oberman · High

Lawyer Cites Non-Existent Cases from ChatGPT

A lawyer submitted court filings containing fabricated case citations generated by ChatGPT, resulting in court sanctions and professional embarrassment.

2021 · Dutch Tax Authority (Belastingdienst) · Critical

Dutch Tax Authority Childcare Benefits Algorithm

An algorithmic fraud detection system disproportionately flagged families with dual nationality, leading to wrongful benefit denials and, ultimately, the resignation of the Dutch government.

2021 · Zillow · Critical

Zillow iBuyer Algorithm Pricing Failures

The AI-powered home-buying algorithm behind Zillow Offers consistently overpaid for properties, resulting in a $569 million write-down and the shutdown of the business unit.

2022 · Clearview AI · Critical

Clearview AI GDPR Enforcement Actions

The facial recognition company was fined by multiple European regulators for scraping biometric data from social media without consent.

Apply these lessons to your organization

Use our Controls Library to implement the safeguards identified in these cases, or start with the Decision Flow to classify your use cases by risk.