UK developments are making deepfake risk a disclosure and accountability issue for boards, not just an IT concern. In this Corporate Compliance Insights article (1 May 2026), Matt Flegg explains how deepfakes have moved from online curiosities to enterprise-scale fraud and executive impersonation, including attacks conducted live on video calls and messaging platforms to induce urgent fund transfers. He highlights publicly known cases, including a 2024 Hong Kong incident in which a realistic multi-person video meeting led to a loss of approximately $25 million, and a 2025 incident at a Singaporean corporation in which an AI-generated CFO impersonation prompted a $499,000 wire transfer, most of which was recovered.
Matt notes that regulators are now responding: the UK’s Economic Crime and Corporate Transparency Act (ECCTA) and Provision 29 of the Corporate Governance Code are driving new expectations around controls, disclosure, and accountability. The article underscores a layered approach, comprising governance measures that treat seeing and hearing alone as insufficient verification, stronger sign-off and detection practices, scenario-based training, crisis playbooks, and third-party protocols, to help organizations build compliance and operational resilience as deepfake threats accelerate.
Read the Corporate Compliance Insights article.