A law to protect consumers from deepfake ads
India still lags in defining pertinent terms such as consent, personality rights, deepfakes, dark patterns, and the limits of AI, terms that encapsulate the rights of consumers, data principals, and personalities.
Dark Patterns and Violation of Personality Rights
- Landmark Court Order: The Delhi High Court's recent interim order recognizes that using advanced deceptive techniques such as deepfakes to mislead consumers for profit infringes personality rights.
- Defining Dark Patterns: Dark patterns are deceitful design practices that mislead users, violate consumer rights, and constitute unfair trade practices.
- Challenge of Unknown Perpetrators: Because most offenders employing deepfakes remain unidentified, legal action is often limited to injunctions that carry no substantial consequences for the perpetrators.
Celebrities and Social Media Platforms
- Legal Battles Expected: The ruling could lead to a surge in cases involving celebrities safeguarding their personality rights and consumers seeking redress from celebrities, advertisers, and social media platforms.
- Responsibility of Platforms: Social media platforms bear responsibility for preventing deepfake-laden advertisements. Cases such as Karen Hepp v. Facebook highlight this concern.
- Inadequacy of Self-Regulation: The current model of self-regulation by companies is insufficient; government intervention and mandatory disclosure of deepfake content in advertisements are needed.
Call for Comprehensive Regulation
- Need for Unified Regulation: Advances in AI necessitate a comprehensive legal framework, akin to the EU's AI Act, to regulate AI applications in the consumer market.
- Clarity in Definitions: Clear definitions of terms like consent, personality rights, deepfakes, and dark patterns are crucial for safeguarding consumer and personality rights.
- Guidelines for Consumer Protection: Comprehensive guidelines under the Consumer Protection Act should mandate disclosure of deepfake usage in advertisements, addressing the evolving threat of increasingly sophisticated deepfake advertising.