Content moderation through co-regulation – Regulating Social Media Platforms | 9th November 2022 | UPSC Daily Editorial Analysis


What's the article about?

  • It discusses the co-regulation of social media content, i.e., regulation carried out jointly by the Government and social media platforms.


  • GS2: Government Policies and Interventions for Development in various sectors and Issues arising out of their Design and Implementation.

What's the crux of the article?

  • Social media platforms have become key venues for the exercise of freedom of speech and expression; the scale and influence they now command make their regulation necessary.
  • Earlier, this regulation was voluntary: social media platforms themselves removed or deprioritised content and suspended user accounts that violated their terms and conditions.
  • But then the government introduced the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.
  • These Rules mandate that platforms establish a grievance redressal mechanism to resolve user complaints within fixed timelines.
  • Recently, the government amended these Rules and established Grievance Appellate Committees (GACs).

What are Grievance Appellate Committees (GACs)?

  • The government will set up one or more centrally appointed GACs within three months.
  • A three-person GAC will consist of a chairperson and two whole-time members appointed by the central government. Of the three, one will be ex-officio, while the other two will be independent members.
  • The idea behind setting up the committees is to give users of social media platforms recourse – other than approaching the courts – to settle complaints.

Why is a modern intermediary law needed?

  • As these large platforms have both positive and negative impacts on democratic rights, a modern intermediary law must re-imagine the role of governments.
  • The modern intermediary law is a method of co-regulation by both the government and platforms.
  • The government has the power to direct platforms to take down harmful content, but such orders must be necessary, proportionate, and compliant with due process. The EU's Digital Services Act (DSA), for example, follows this model.
  • At the same time, the modern intermediary law must allow platforms to regulate content at the platform level under broad government guidelines.

Digital Services Act (DSA):

  • The Digital Services Act (DSA) is an EU regulation to modernise the e-Commerce Directive regarding illegal content, transparent advertising, and disinformation.
  • As defined by the EU Commission, the DSA is “a set of common rules on intermediaries’ obligations and accountability across the single market”, and ensures higher protection to all EU users, irrespective of their country.

Advantages of co-regulation:

  • Platforms retain reasonable autonomy over their terms of service.
  • Co-regulation aligns the interests of the government and the platforms.
  • Co-regulatory mechanisms allow the state to delegate content regulation to platforms, which are better equipped to tackle modern content moderation challenges at scale.

Way Forward:

  • Given the important role social media platforms play in shaping and sustaining core democratic values, a novel method of regulating them is needed. The proposed Digital India Act, expected to succeed the IT Act, presents a perfect opportunity for the government to adopt a co-regulatory model of online speech regulation.

