The panoptic nature of biometric technology

Published: 29th Nov, 2021

Context

Facebook, now renamed Meta Platforms Inc., recently announced that it would shut down the facial recognition technology (FRT) on its platform, following a class-action lawsuit alleging that it failed to make the necessary disclosures about how it handled users’ biometric data.

  • This development has brought facial recognition technology (FRT) into the limelight.

Analysis

What is Facial Recognition Technology?

  • Facial recognition is a biometric technology that uses distinctive features on the face to identify and distinguish an individual.
  • Facial recognition technology has been introduced primarily for two purposes:
    • As a compare-and-contrast tool meant for identification based on existing information.
    • To create a repository on the basis of which the process of identification can be enhanced.
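A minimal sketch of these two uses, assuming pre-computed face embeddings and cosine similarity (the embeddings, threshold and helper names here are illustrative assumptions, not details from the article):

```python
import numpy as np

def cosine_similarity(a, b):
    """Similarity between two face embeddings (higher = more alike)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(probe, claimed_template, threshold=0.8):
    """1:1 'compare and contrast': does the probe match the claimed identity?"""
    return cosine_similarity(probe, claimed_template) >= threshold

def identify(probe, gallery, threshold=0.8):
    """1:N identification against a repository of enrolled templates."""
    best_id, best_score = None, -1.0
    for person_id, template in gallery.items():
        score = cosine_similarity(probe, template)
        if score > best_score:
            best_id, best_score = person_id, score
    return (best_id, best_score) if best_score >= threshold else (None, best_score)

# Toy repository of enrolled embeddings (in practice produced by a face-encoding model).
rng = np.random.default_rng(0)
gallery = {"alice": rng.normal(size=128), "bob": rng.normal(size=128)}
probe = gallery["alice"] + rng.normal(scale=0.05, size=128)  # a noisy new capture

print(verify(probe, gallery["alice"]))  # 1:1 check against a claimed identity
print(identify(probe, gallery))         # 1:N search against the whole repository
```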

What are the concerns related to it?

  • Without proper laws protecting digital privacy, inappropriate use of facial recognition technology will enable mass surveillance.
  • A growing body of research shows that biometric scanning technologies coupled with AI carry an inherent, algorithmic bias.
  • A report by the U.S. National Institute of Standards and Technology (NIST) found that Black, Brown and Asian individuals were up to 100 times more likely to be misidentified by FRT than white male faces.
  • Scanning technology and biometric tracking pose a grave threat to freedom of expression, a fundamental right as envisaged by the Indian Constitution.
  • There have been numerous instances of law enforcement agencies using this technology to crack down on protestors, even when they were protesting for legitimate causes.
  • The technology has not achieved 100% accuracy in finding matches. Facial recognition does not return a definitive result; it identifies or verifies only in probabilities (see the sketch after this list).
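A minimal sketch of why matches are probabilistic and why accuracy can differ across groups: similarity scores above a chosen threshold are treated as matches, and the resulting false-match rate can be measured separately for each demographic group. The score distributions and threshold below are purely illustrative assumptions; a real audit would use scores from an actual FRT system and labelled data.

```python
import numpy as np

rng = np.random.default_rng(1)

def false_match_rate(impostor_scores, threshold):
    """Fraction of comparisons between *different* people that still score
    above the threshold, i.e. would be wrongly reported as a match."""
    return float(np.mean(impostor_scores >= threshold))

# Illustrative impostor-score distributions for two demographic groups.
scores_group_a = rng.normal(loc=0.30, scale=0.10, size=10_000)
scores_group_b = rng.normal(loc=0.45, scale=0.10, size=10_000)  # the system separates this group less well

threshold = 0.6  # every deployment must pick one; there is no "definitive" yes/no
print("False-match rate, group A:", false_match_rate(scores_group_a, threshold))
print("False-match rate, group B:", false_match_rate(scores_group_b, threshold))
# Identical threshold, very different error rates: the kind of disparity the NIST report describes.
```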

Why is it important?

  • FRT can act as a valuable tool for law enforcement agencies to nab criminals.
  • Biometric recognition creates a specific link between an individual and a data record. Physical characteristics and behaviour patterns are personally unique and, unlike passwords or PINs, cannot be deciphered or recreated by sophisticated hacking software.
  • Biometric authentication thwarts fraudsters’ efforts to create multiple fake digital identities. Culling through existing biometric data exposes people who have already registered under a different identity (a minimal sketch of this deduplication check follows this list).
  • Biometrics are not exchangeable. Because fingerprints or keyboarding patterns are not transferable, biometrics are especially effective in protecting sensitive information such as financial data or healthcare records.
  • Open banking demands reliable digital identity protection. Banks choosing to increase their online product offerings by opening their digital infrastructure to third-party fintech providers need the reliable and specific digital identity protection that biometric security delivers.
  • Biometrics balance convenience, security, and user experience. Users forget passwords and PINs; their fingerprints are always available. Consumers lose smart cards or tokens; they always have their face. User-friendly biometric systems deliver exactly what consumers want: frictionless and secure user experiences.
  • Inborn biometrics don’t change. A haircut or cosmetic surgery can alter facial contours, but inherent biological traits do not change over a lifetime.
  • Stolen biometric data can be challenging to use. Today's sophisticated biometric systems use "liveness" checks to detect spoofs such as fake images, and some fingerprint scanners have pulse detectors.
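A minimal sketch of the deduplication step mentioned above: before enrolling a new identity, the new biometric template is compared against every template already on file, and a high-scoring hit flags a possible duplicate identity. The templates, threshold and helper names are assumptions made for illustration.

```python
import numpy as np

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def enroll(new_id, new_template, enrolled, dedup_threshold=0.9):
    """Reject enrolment if the template already matches an existing record
    under a different identity (a possible fake or duplicate identity)."""
    for existing_id, template in enrolled.items():
        if cosine_similarity(new_template, template) >= dedup_threshold:
            raise ValueError(f"possible duplicate of existing identity '{existing_id}'")
    enrolled[new_id] = new_template

rng = np.random.default_rng(2)
enrolled = {"id-001": rng.normal(size=128)}

enroll("id-002", rng.normal(size=128), enrolled)  # distinct person: accepted
try:
    # Same face as id-001 presented under a new identity: flagged as a duplicate.
    enroll("id-003", enrolled["id-001"] + 0.01 * rng.normal(size=128), enrolled)
except ValueError as err:
    print(err)
```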

What are the dangers associated with biometric technology?

  • Infringement of privacy: The privacy of users’ data is at stake. In the absence of regulation, biometric data would be exposed to cyber criminals, and unregulated companies may sell biometric data that can then be misused for political purposes.
  • State surveillance: The most significant risk with the use of the technology is state surveillance; China’s reported use of facial recognition for surveillance in Xinjiang is one example. This raises concerns that the technology might be misused for political ends.
  • Inaccuracy: The technology is not flawless. Multiple studies have shown it to be inaccurate at identifying people of colour, especially Black women.
  • Predatory marketing: Software that analyses biometrics could be used by companies to prey on vulnerable customers, for example by detecting extreme emotions such as distress and tailoring products and services to those individuals.
  • Stalking: Tools like reverse image search can provide stalkers with more data about their victims. This is especially unsafe for women, who can be tracked, stalked and possibly assaulted through misuse of the information obtained.
  • Identity fraud: Criminals who have collected enough personal information on an individual could commit identity fraud, with significant effects on the victim’s personal life and finances. For example, a fake ID can be created by exploiting a person’s biometric information, and crimes like photo morphing can threaten an individual’s identity.
  • Dark activities: Biometric information can be misused for illicit activities and markets such as drug and weapons sales. Stolen IDs and Aadhaar information also raise the risk of such data being used in cross-border terrorist activities.

How has the human rights community voiced its opinion against FRT?

  • There are growing calls in various countries for laws to curb the ill effects of FRT.
  • Some U.S. lawmakers introduced the Facial Recognition and Biometric Technology Moratorium Act in 2020, and several U.S. states have banned the use of FRT.
  • In the EU, Article 9 of the General Data Protection Regulation (GDPR) prohibits the processing of biometric data for the purpose of identifying an individual. This provides much-needed protection against FRT infringing on an individual’s privacy.

What is the scenario in India?

  • The National Crime Records Bureau (NCRB) has issued a request for proposals to create a National Automated Facial Recognition System (NAFRS), which would build a national database of photographs to identify criminals.
  • Notably, India lacks a robust legal framework to address the possible misuse of biometric technology, even as the Union Government has deployed over a dozen FRT systems across the country that collect and use biometric data.
  • India still does not have a personal data protection law; the draft Personal Data Protection Bill, 2019 remains under parliamentary scrutiny.

Instances of usage in India

  • The government used facial recognition technology to track down protestors who were present at the Red Fort on January 26, 2021.
  • The Uttar Pradesh Police uses an AI-based facial recognition system called Trinetra. The police used this software to run surveillance on anti-CAA protestors, following which more than 1,100 arrests were made.
  • The Central Board of Secondary Education (CBSE) used facial recognition to match students logging in for their board exams against the admit-card photos on record.
  • The Internet Freedom Foundation (IFF) estimates that there are currently 42 ongoing facial recognition projects in India, from the Automated Multimodal Biometric Identification System (AMBIS) in Maharashtra to FaceTagr in Tamil Nadu. Of these, at least 19 are being developed and deployed by state-level police departments and the NCRB for the specific purpose of security and surveillance.

Implications

The biggest implication is the likely impact on the right to privacy. In Justice K.S. Puttaswamy vs Union of India (2017), the Supreme Court recognised the right to privacy as a fundamental right and laid down a three-fold requirement. Accordingly, any encroachment on the right to privacy requires:

  1. The existence of a ‘law’ (to satisfy the legality of the action)
  2. The existence of a ‘need’, in terms of a ‘legitimate state interest’
  3. The measure adopted must be ‘proportionate’ (there should be a rational nexus between the means adopted and the objective pursued) and ‘least intrusive.’

Unfortunately, NAFRS fails each one of these tests.

  • NAFRS lacks ‘legitimacy’: It does not stem from any statutory enactment (such as the DNA Technology (Use and Application) Regulation Bill, 2018, proposed to identify offenders) or an executive order of the Central Government. Rather, it was merely approved by the Cabinet Committee on Economic Affairs in 2009.
  • Disproportionate measure: Even if we assume that there exists a need for NAFRS to tackle modern-day crime, the measure is grossly disproportionate. To satisfy the test of ‘proportionality’, the benefits of deploying this technology must be sufficiently great to outweigh the harm.
    • For NAFRS to achieve the objective of ‘crime prevention’ or ‘identification’, the system will have to track people on a mass scale, making everyone a subject of surveillance: a disproportionate measure.

Suggestions/Measures

  • Adequate safeguards: Both the Information Technology Act, 2000 and the Personal Data Protection Bill, 2019 give the central government unchecked powers of surveillance. Adequate safeguards, such as penalties, are needed so that police personnel cannot misuse facial recognition technology.
  • Algorithmic Impact Assessment: Agencies that want to deploy these technologies should be required to carry out a formal algorithmic impact assessment (AIA). Modelled after impact-assessment frameworks for human rights, environmental protection and data protection, AIAs help governments evaluate artificial-intelligence systems and guarantee public input (a minimal sketch of such an assessment record follows this list).
  • Rigorous review: Legislation should be enacted that requires that public agencies rigorously review any facial recognition technologies for bias, privacy and civil-rights concerns.
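A minimal sketch of what a machine-readable AIA record for an FRT deployment might capture before approval. The fields, names and example values are assumptions for illustration, not a prescribed or official format.

```python
from dataclasses import dataclass, field

@dataclass
class AlgorithmicImpactAssessment:
    """Minimal record an agency might publish before deploying an FRT system."""
    system_name: str
    deploying_agency: str
    stated_purpose: str
    legal_basis: str                  # statute or executive order authorising the use
    bias_audit_done: bool             # per-group error rates measured and published
    privacy_review_done: bool         # data retention, sharing and consent assessed
    public_consultation_open: bool    # window for public input
    risks_identified: list[str] = field(default_factory=list)

# Hypothetical example, for illustration only.
assessment = AlgorithmicImpactAssessment(
    system_name="Example-FRT",
    deploying_agency="Example Police Department",
    stated_purpose="identification of suspects from CCTV footage",
    legal_basis="none identified",
    bias_audit_done=False,
    privacy_review_done=False,
    public_consultation_open=True,
    risks_identified=["misidentification of minorities", "mass surveillance"],
)
print(assessment)
```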

Way forward

Without accountability and oversight, facial recognition technology has a strong potential for misuse and abuse. In the interest of civil liberties, and to keep democracy from turning authoritarian, it is important to impose a moratorium on the use of facial recognition technology until there are meaningful checks and balances, along with statutory authorisation of NAFRS and guidelines for its deployment.
