
Regulating Speech On Digital Platforms: Challenges and Way Forward

5th September, 2025



Context

The Supreme Court directed the Union government to frame guidelines for regulating social media, focusing on influencers whose monetized content often harms vulnerable groups.

Supreme Court’s Key Observations and Recommendations

Harmful Commercial Speech

  • Influencers monetize content that mocks groups like persons with disabilities, religious minorities, or Dalits, eroding social harmony.

Inclusivity as a Priority

  • Derogatory content undermines constitutional goals of equality and inclusivity, especially for marginalized groups like Adivasis and LGBTQIA+ communities.

Need for Clear Boundaries

  • The Court stressed the need to distinguish between free speech, commercial speech, and prohibited speech in order to protect social harmony.

Regulatory Framework

  • The Court directed the government to draft guidelines with the News Broadcasters and Digital Association (NBDA), involving diverse stakeholders such as women’s groups, caste-based organizations, and regional activists.

Ethical Accountability

  • Influencers must be educated on digital ethics, and violators should face accountability, including issuing public apologies.

Balancing Rights

  • The Court aims to protect free speech under Article 19(1)(a) while safeguarding community rights and dignity in a diverse society.


Existing Regulatory Framework

Information Technology Act, 2000 (IT Act)

  • The foundational legal framework: defines "intermediaries" (including social media platforms) and provides conditional liability protection under Section 79, provided they exercise due diligence.

Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (IT Rules 2021)

  • Classify intermediaries into "social media intermediaries" and "significant social media intermediaries" (those with 5 million or more registered users).
  • Mandate the appointment of a Grievance Redressal Officer (GRO), acknowledgement of complaints within 24 hours, and disposal within 15 days.
  • Require identification of the first originator of information under judicial order.
  • Failure to comply can result in the loss of "safe harbor" protection under Section 79 of the IT Act, making platforms liable for user-generated content.
  • Outline a code of ethics for platforms, including content classification (U, U/A 7+, U/A 13+, U/A 16+, A) and access-control mechanisms (parental locks, age verification).

Bharatiya Nyaya Sanhita (BNS)

  • Classifies misinformation and hate speech as serious offenses (with penalties of up to three years’ imprisonment), and covers cyberbullying, stalking, and phishing.

Bharatiya Nagarik Suraksha Sanhita (BNSS)

  • Modernizes procedures for social media offenses, allowing electronic trials, digital evidence collection, and mandatory forensic investigation for serious cybercrimes.

Digital Personal Data Protection (DPDP) Act, 2023

  • India’s first comprehensive data protection law; regulates how digital personal data is collected, processed, and stored.

ASCI Guidelines, 2021

  • Require influencers to disclose material connections (e.g., “Ad,” “Sponsored”) prominently. Both influencers and advertisers share responsibility.

Consumer Protection Act, 2019

  • Holds influencers liable for misleading ads, with fines up to ₹10 lakh for first violations, which can increase to ₹50 lakh for repeat violations.

Judicial Intervention

Tata Press Ltd. vs MTNL (1995): Recognized commercial speech as protected under Article 19(1)(a) if it serves the public interest.

Shreya Singhal vs Union of India (2015): Struck down Section 66A of the IT Act for vagueness, protecting speech that merely offends unless it incites harm.

Amish Devgan vs Union of India (2020): Distinguished free speech from hate speech, emphasizing influencers’ responsibility.

Debates in Indian Social Media Regulation

Arguments in Favor of Regulations

Essential for curbing the spread of harmful content linked to violence, communal tensions, and manipulation of public opinion (e.g., during elections).  

To curb misinformation: fake news, deepfakes, and hate speech threaten social harmony and national security.

  • ASCI (Advertising Standards Council of India) Annual Report, FY24–25 shows 94% of misleading ads were online.

Platforms collect user data without consent, risking privacy violations and misuse.

Unregulated platforms enable cyberbullying and derogatory content, disproportionately affecting minorities, women, and persons with disabilities.

The government has a responsibility to protect citizens' interests and maintain law and order.

  • Article 19(2) allows restrictions for public order, morality, and defamation, necessitating regulation to prevent harm.

To ensure influencer accountability: the ASCI Annual Report found that 98% of 1,015 influencer ads reviewed violated disclosure norms, with 69% lacking proper disclosure, misleading consumers.

Arguments Against Regulations

Requirements like message traceability raise concerns about compromising the constitutional right to privacy and the integrity of end-to-end encryption.

Strict regulations risk stifling free speech, encouraging self-censorship, and undermining democratic principles.

Concerns about potential government overreach and arbitrary content removal (e.g., during protests).

Compliance demands significant resources, deterring innovation, limiting new features, and raising costs for both tech giants and smaller startups.

Imposing legal liability on platforms for user-generated content could restrict the diversity of viewpoints hosted online.

Challenges in Regulating Social Media

Volume and Anonymity

  • High volume of digital content (94% of misleading ads are online, ASCI, FY24–25) and user anonymity make monitoring difficult.

Subjectivity in Defining Harm

  • Terms like “decency” or “morality” under Article 19(2) vary across cultural contexts, risking inconsistent enforcement or overregulation.

Transparency Gaps

  • Content takedowns under Section 69A lack transparency, eroding user trust. Platforms’ opaque moderation policies exacerbate this issue.

Cross-Border Issues

  • Harmful content often originates outside India, complicating enforcement due to jurisdictional limits of platforms like Meta and YouTube.

Political Bias

  • Moderation decisions face accusations of favoring certain ideologies, especially with Meta India’s alleged ties to the ruling party (India Hate Lab Report, 2024).

Regulatory Fragmentation

  • Overlapping mandates of bodies such as ASCI, the CCPA, and cybercrime authorities lead to inconsistent enforcement and gaps in oversight.

Way Forward to enhance Social Media Regulation

Holistic Guidelines

  • Draft comprehensive regulations under the proposed Digital India Act to ensure clarity on what constitutes harmful content.
  • Such guidelines should protect user rights while effectively addressing the challenges of social media.

Digital Literacy Campaigns

  • Launch nationwide programs to educate users on identifying misinformation, cyberbullying, and ethical online behavior.

Technological Solutions

  • Deploy AI-driven tools for real-time content monitoring to flag harmful posts efficiently.
  • Strengthen cyber forensic labs while safeguarding user privacy and encryption standards.

Global Cooperation

  • Cooperate with international regulators and platforms to address cross-border content issues.
  • Negotiate agreements for enforcing Indian laws on foreign-hosted content.

Stricter Influencer Accountability

  • Enforce ASCI’s 2021 disclosure norms rigorously, ensuring labels like “Ad” or “Sponsored” are prominent.
  • Strictly impose the Consumer Protection Act’s fines of up to ₹10 lakh for misleading ads and ban repeat offenders from platforms.

Harmonized Regulatory Framework

  • Streamline roles of ASCI, CCPA, and cybercrime authorities to eliminate overlaps.
  • Create a unified body to oversee digital content regulation, ensuring swift and consistent enforcement.

Strengthening Independent Mechanisms

  • Empower independent fact-checking units and oversight bodies to ensure unbiased content moderation and to challenge arbitrary rules.

Conclusion

Regulating social media requires balancing free speech under Article 19(1)(a) with the dignity of diverse groups under Article 19(2). By addressing the unique needs of religious minorities, LGBTQIA+ communities, women, children, seniors, and rural users, and drawing from global models like South Africa and Indonesia, India can cultivate an inclusive digital ecosystem. 

Source: THE HINDU

PRACTICE QUESTION

Q. Discuss the challenges in regulating digital media in India and suggest measures to balance free speech with accountability. (150 words)

Frequently Asked Questions (FAQs)

Q. What is "safe harbor" protection under the IT Act?
Section 79(1) of the IT Act gives intermediaries immunity from liability for third-party content, provided they act with due diligence.

Q. What is commercial speech?
Commercial speech refers to any speech or content, like advertisements or sponsored posts, that has a clear commercial purpose.

Q. What are the grounds for reasonable restrictions under Article 19(2)?
Sovereignty and integrity of India, security of the State, friendly relations with foreign States, public order, decency or morality, contempt of court, defamation, and incitement to an offence.
