Regulation of Web Content

Published: 16th Mar, 2020

Context

Recently, the Ministry of Electronics and Information Technology (MeitY) prepared the revised Information Technology (Intermediary Guidelines) Rules, 2018 to replace the rules notified in 2011; the revised rules will come into force after the Law Ministry's approval.

Background

  • The Information Technology Act (IT Act), 2000 was enacted to give a fillip to electronic transactions, to provide legal recognition for e-commerce and e-transactions, to facilitate e-governance, to prevent computer-based crimes and ensure security practices and procedures. The Act came into force on 17th October 2000.
  • Section 79 of the IT Act elaborates on the exemption from liabilities of intermediaries in certain cases. Section 79(2)(c) mentions that intermediaries must observe due diligence while discharging their duties, and also observe such other guidelines as prescribed by the Central Government. Accordingly, the Information Technology (Intermediaries Guidelines) Rules, 2011 were notified in April 2011.
  • A calling attention motion on "Misuse of Social Media Platforms and Spreading of Fake News" was admitted in Parliament (Rajya Sabha) during the 2018 Monsoon session. The Hon'ble Minister for Electronics and IT, responding to the motion, made a detailed statement in which he, inter alia, conveyed to the House the Government's resolve to strengthen the legal framework and make social media platforms accountable under the law.
  • Subsequently, MeitY has prepared the draft Information Technology (Intermediary Guidelines) Rules 2018 to replace the rules notified in 2011.
  • MeitY will seek the law ministry's views on the validity of the provisions in the draft document before notifying the rules. MeitY wants to make sure that the provisions in the draft do not overshoot the due diligence required under the larger IT Act.
  • The guidelines propose additional responsibilities for social media companies. These include verifying users through mobile numbers, tracing the origin of messages when required by court order, and building automated tools to identify child pornography and terror-related content.

All these requirements come under the ambit of due diligence.

Analysis

Concerns with the Proposed Guidelines

  • The guidelines lay down overarching rules for all intermediaries operating in India, such as maintaining a proper privacy policy and informing users about it regularly.
  • Social media companies, as well as intermediaries, will have to operate with a certain degree of responsibility.
  • Each clause in the guidelines is being scrutinized closely, as these rules will set a "global precedent".
  • The government’s move to classify intermediaries based on function better reflects the modern nature of the internet and will help ensure that the open internet remains a space for innovation and knowledge sharing.
  • The traceability and 24-hour content takedown timelines continue to threaten the freedom of expression, privacy and security of users and should be urgently reformed before the rules are enacted.
  • The central criticism against the proposed amendment is that it contravenes a landmark Supreme Court judgement. In the 2015 Shreya Singhal writ case on online freedom of speech, the court clearly stated that online content could be removed from intermediary platforms only by government or court order. This protected the platforms from liability and served as a brake on frivolous or agenda-driven take-down demands.
  • Tech giants and security experts have joined the free speech lobby in opposing the liability regime on technical grounds. They argue that many of the proposals, such as the use of automated tools to proactively identify and remove unlawful content, would be impossible to implement and ought to be dropped.

Worldwide Scenario

  • India is not the only country to go in for such measures. 
  • Germany has enforced a strict law, the NetzDG, which requires the rapid removal of hate speech and other toxic content, with fines of up to €50 million for non-compliance.
  • NetzDG, described as "the most ambitious attempt by a Western state to hold social media platforms responsible" for dangerous online speech, is serving as the template for other nations: first Russia and now India have copied sections of the German law.
  • Australia's crackdown on social media companies began much earlier, in 2015, and entails huge fines for intermediaries, while the UK has put in place stringent rules to combat specified illegal content, such as child abuse and terrorism.
  • Strong democracies have appointed strong regulators to oversee the implementation of this privatised law enforcement. Australia has an eSafety Commissioner, while the UK has selected Ofcom, the telecom regulator, as the Internet watchdog and is equipping it with the necessary powers to enforce the new "duty of care" laws.

Rationale for the Revised Guidelines

  • The frequent Internet shutdowns in India, which trackers date to 2015, have had severe economic repercussions.
  • According to Internet research firm Top10VPN, Internet shutdowns cost India over $1.3 billion, the third-highest toll after Iraq and Sudan, both war-torn nations.
  • To curb hate speech.
  • To prevent fake news and communal violence.

Amendments Are Lethal to Free Speech

  • Traceability will undermine security for all users and lead to surveillance
    • Intermediaries to ensure "traceability" of messages by providing information on the originator and recipients of messages. Platforms will have to break end-to-end encryption or install a backdoor, making all users vulnerable. It is an attack on the fundamental right to privacy.
  • Automated filtering technology will result in censorship and choke free speech
    • Intermediaries to proactively monitor, delete “unlawful content” through automated tools. These will facilitate pre-censorship by suppressing free speech before it becomes public. Existing filters used by social media platforms are already notorious for taking down harmless content. It is against Supreme Court orders.
  • Takedown of content within short timelines a major challenge to free speech
    • Intermediaries to take down illegal content within 24 hours and share information with the government within 72 hours. This is not enough time to analyse requests, seek clarifications or remedies. It will create a perverse incentive to take down content and share user data without due process.
  • Data retention antithetical to privacy
    • Intermediaries must preserve content requested by law enforcement for at least 180 days. This contradicts the principle of "storage limitation" recommended by the Srikrishna Committee.

Conclusion

The overwhelming question that policymakers have to answer is this: Will filtering end targeted online attacks and disinformation campaigns against the most vulnerable communities in India? Computational propaganda campaigns are now a core strategy for political campaigns the world over. The political polarisation we see online mirrors the state of things offline.