
EU Age-Verification Plan under the Digital Services Act

Context

The European Commission is developing an age-verification app under the Digital Services Act (DSA) to prevent minors from accessing harmful online content, particularly pornography. However, this move has sparked debates over privacy and data protection.

  1. Objective of the Age Verification Plan
  • The European Commission aims to safeguard minors from harmful online content (e.g., pornography) by ensuring only adults can access it.
  • Focus is on enforcing age restrictions for accessing sensitive platforms while protecting user privacy.
  2. Legal and Regulatory Framework
  • The initiative is part of the Digital Services Act (DSA), which seeks to regulate major online platforms, including tech giants and content-hosting websites.
  • The age-verification app is to be developed in line with European Digital Identity (EUDI) Wallet standards and to be ready by the end of 2026.
  3. Countries Participating in the Pilot Project
  • France, Germany, Denmark, Spain, Greece, and Italy are part of the pilot phase.
  • France is leading trials and is expected to enforce mandatory age verification.
  4. Technical and Privacy Provisions
  • The age-verification app will allow users to prove their age without revealing their identity.
  • It is to be open-source and based on zero-knowledge-proof cryptographic protocols, meaning platforms can verify age without accessing personal data (a simplified sketch of this idea appears after this list).
  • It aims to prevent the collection, storage, or misuse of personal identifiers.
  5. Support and Opposition

Support:

  • French President Emmanuel Macron backs social media restrictions for users below 15 years.
  • European regulators stress the need for privacy-centric age-verification methods.

Criticism:

  • Critics, including privacy advocates and major tech companies, argue that the system is impractical, vulnerable to misuse, and a violation of privacy rights.
  • Aylo, the parent company of Pornhub, opposed the plan, arguing instead for “device-based age verification.”
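
To visualise the privacy model described under Technical and Privacy Provisions above, the sketch below shows, in deliberately simplified form, how a platform could check an “over 18” attestation without ever receiving personal data. This is not the Commission’s actual app and not a real zero-knowledge proof; the function names (issue_age_token, verify_age_token) and the use of a shared HMAC secret are illustrative assumptions only. Real deployments would rely on asymmetric signatures and zero-knowledge-proof protocols.

```python
import hmac
import hashlib
import json
import secrets

# Illustrative only: a trusted issuer (e.g. a wallet provider) holds a signing
# key; the platform can check tokens but never sees identity attributes.
# Real systems would use asymmetric signatures / zero-knowledge proofs,
# not a shared HMAC secret.
ISSUER_KEY = secrets.token_bytes(32)

def issue_age_token(user_is_over_18: bool) -> dict:
    """Issuer checks the user's ID privately and emits a minimal claim:
    only the boolean attribute plus a random nonce, no personal data."""
    claim = {"over_18": user_is_over_18, "nonce": secrets.token_hex(16)}
    payload = json.dumps(claim, sort_keys=True).encode()
    tag = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "tag": tag}

def verify_age_token(token: dict) -> bool:
    """Platform-side check: validates the issuer's tag and reads only the
    over_18 attribute. No name, birth date, or ID number is ever sent."""
    payload = json.dumps(token["claim"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, token["tag"]):
        return False  # forged or altered token
    return bool(token["claim"]["over_18"])

if __name__ == "__main__":
    token = issue_age_token(user_is_over_18=True)
    print("Access granted:", verify_age_token(token))
```

The point of the design is that the platform only ever sees a boolean attribute and a cryptographic tag, never the underlying identity document.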

Significance

  • The EU’s approach marks a paradigm shift in child protection policy in the digital age, making it a legislative model for global emulation.
  • It acknowledges the emerging psychological and social risks faced by children online, such as:
    • Early exposure to harmful content.
    • Loss of anonymity.
    • Digital grooming and cyberbullying.

Concerns

  • The move raises serious questions on user consent, data protection, and freedom of expression.
  • Major platforms are resisting due to potential market loss, compliance burdens, and technical constraints.
  • There is ambiguity over what constitutes “harmful” content and over how enforcement can be made uniform across EU member states.

Digital Services Act (DSA):

  • The Digital Services Act (DSA) is a comprehensive regulatory framework enacted by the European Union (EU).
  • Came into force in November 2022 and is directly applicable across all EU Member States.
  • It replaces the outdated e-Commerce Directive (2000), marking a significant overhaul of digital governance in the EU.

Objectives of the DSA:

  • Protect Fundamental Rights: Ensures online platforms respect freedom of expression, privacy, and non-discrimination.
  • Ensure Online Safety: Aims to protect users—especially minors—from illegal and harmful content, products, or services.
  • Accountability of Tech Platforms: Introduces due diligence obligations for online platforms and intermediaries like Facebook, Amazon, YouTube, etc.
  • Promote Innovation and Fair Competition: Seeks to establish a level playing field in the EU Single Market by regulating dominant digital players (e.g., Google, Meta).

 

PYQ:

“What are the challenges in regulating digital technology and ensuring user privacy? How can data protection frameworks balance innovation and individual rights?”   (2021)
