The draft IT (Digital Code) Rules, 2026 seek to regulate online obscenity using Cable TV norms, age classification, and access controls. While the rules aim to protect users, concerns persist over subjective definitions, outdated standards, and privacy risks. A balanced approach requires wide consultation, differentiated regulation, and lessons from the EU's Digital Services Act.
Context
Following a Supreme Court directive, the Ministry of Information and Broadcasting is drafting the Information Technology (Digital Code) Rules, 2026.
What are the key provisions of the draft IT (Digital Code) Rules, 2026?
The draft rules are proposed under Section 87(1) of the Information Technology Act, 2000, and seek to balance the fundamental right to freedom of speech and expression (Article 19(1)(a)) with the "reasonable restrictions" permitted under Article 19(2) of the Constitution.
Definition of Obscenity
Adopts standards from the Cable Television Networks Rules, 1994, defining content as obscene if it is lascivious, appeals to prurient interests, or tends to "deprave and corrupt" viewers.
Strict Prohibitions: Digital content meeting these obscenity criteria is prohibited outright.
Mandatory Age-Based Classification
Requires all digital content to be classified into specific age categories (e.g., U, UA 7+, UA 13+, UA 16+, and A) based on themes like violence, nudity, sex, and horror.
Parental Controls & Verification
Platforms must implement parental locks for content rated 13+ and above, alongside reliable age verification systems for adult-only (A-rated) content.
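The classification and access-control scheme described above can be sketched as a simple rule check. The following is an illustrative sketch only, with hypothetical names; the draft rules do not prescribe any implementation.

```python
# Hypothetical sketch of the proposed age-rating and parental-lock logic.
# Rating labels follow the categories named in the draft (U, UA 7+,
# UA 13+, UA 16+, A); everything else is an assumption.

RATING_MIN_AGE = {"U": 0, "UA 7+": 7, "UA 13+": 13, "UA 16+": 16, "A": 18}

def can_view(rating: str, viewer_age: int, lock_released: bool = False) -> bool:
    """Return True if a viewer may access content carrying this rating.

    - 'A' content requires a verified adult (age 18+), per the draft's
      "reliable age verification" mandate.
    - Content rated 13+ and above sits behind a parental lock; it is
      blocked until the lock is released (e.g. by a parent's PIN).
    - After any lock check, the viewer must still meet the minimum age.
    """
    min_age = RATING_MIN_AGE[rating]
    if rating == "A":
        return viewer_age >= 18
    if min_age >= 13 and not lock_released:
        return False
    return viewer_age >= min_age
```

For example, a 15-year-old could watch UA 13+ content only after the parental lock is released, and could not access A-rated content even then.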
Expansion of the "Code of Ethics"
Introduces a "list of don’ts," prohibiting content that denigrates women or persons with disabilities, promotes communal violence, or targets children with explicit language.
Synthetic Media Regulation
Intermediaries must label AI-generated or "synthetic" content. This includes embedding permanent metadata identifiers and ensuring visible labels cover at least 10% of the surface area for images or the first 10% of audio duration.
Expanded Due Diligence
Social media platforms are tasked with verifying user declarations regarding synthetic media using automated tools and ensuring such labels cannot be easily suppressed or removed.
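The visibility thresholds for synthetic-media labels (at least 10% of an image's surface area, or the first 10% of an audio clip's duration) lend themselves to a mechanical check. A minimal sketch, assuming these two rules and using hypothetical helper names:

```python
# Illustrative compliance checks for the draft's synthetic-media label
# thresholds. The 10% figures come from the draft; the function names
# and parameters are assumptions for illustration.

def image_label_ok(img_w: int, img_h: int, label_w: int, label_h: int) -> bool:
    """Visible label must cover at least 10% of the image's pixel area."""
    return (label_w * label_h) >= 0.10 * (img_w * img_h)

def audio_label_ok(clip_seconds: float, label_start: float, label_end: float) -> bool:
    """Audible disclosure must span at least the first 10% of the clip,
    starting from the beginning."""
    return label_start == 0 and (label_end - label_start) >= 0.10 * clip_seconds
```

So a 400x300 label on a 1000x1000 image (12% of the area) would pass, while a disclosure that begins mid-clip would fail regardless of its length.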
Why is regulation of online content necessary?
Constitutional Mandate
To enforce the "reasonable restrictions" under Article 19(2), which allows limits on free speech in the interest of public order, decency, and morality.
Protecting Vulnerable Populations
To shield children and adolescents from harmful and age-inappropriate content, mirroring global efforts like the UK's Online Safety Act 2023.
Curbing Harmful Content
To prevent the spread of hate speech, misinformation, and content that incites violence or communal hatred.
Judicial Directives
The Supreme Court highlighted a legal vacuum where there is little accountability for harmful user-generated content that can damage reputations and cause social harm.
What are the major concerns raised against the draft rules?
Vagueness and Subjectivity
Terms like "good taste" and "decency" are not clearly defined, which could lead to arbitrary enforcement and suppress artistic and political speech.
This echoes the concerns that led the Supreme Court to strike down Section 66A of the IT Act in the landmark Shreya Singhal vs Union of India (2015) case.
Applying Outdated Norms
Applying the Cable Television Networks Rules, 1994 to internet content is widely criticised as inappropriate, given the significant technological and behavioural differences between traditional TV broadcasting and online streaming.
Regulatory Uncertainty
The new rules are seen as a policy reversal from the IT Rules, 2021, which acknowledged the difference between broadcasting and on-demand content. This could deter investment and innovation in the digital media sector.
Privacy Concerns
The mandate for "reliable age verification" for adult content raises privacy issues, conflicting with the fundamental right to privacy established in the Justice K.S. Puttaswamy (2017) judgment.
Failure to Differentiate Content Types
The rules do not distinguish between professionally produced content and user-generated content, which require different regulatory approaches.
| Content Type | Description | Platform Examples | Appropriate Regulation Focus |
|---|---|---|---|
| Online Curated Content (OCC) | Professionally produced content with editorial oversight, similar to a publisher. | Netflix, Amazon Prime Video, Disney+ Hotstar | Self-regulation, enhanced content descriptors, and effective parental controls. |
| User-Generated Content (UGC) | Content created and uploaded by users, with platforms acting as intermediaries. | YouTube, Instagram, Facebook, X | Robust grievance redressal mechanisms and transparent content moderation systems. |
Way Forward: Creating a Balanced and Effective Framework
Meaningful Stakeholder Consultation
Engage with content creators, platforms, civil society, and technical experts to ensure the rules are practical and protect fundamental rights.
A Differentiated Regulatory Approach
Create separate rules for Online Curated Content (OCC) focusing on self-regulation and for User-Generated Content (UGC) focusing on strong moderation and grievance redressal systems.
Precision in Legal Definitions
Replace vague terms with clear, objective, and narrowly-defined criteria to prevent misuse and ensure any restriction on speech is directly linked to the grounds in Article 19(2).
Adopt a Co-regulatory Model
Combine industry self-regulation with oversight from an independent body, rather than direct state control. Promote privacy-preserving technologies for age verification.
Promote Digital Literacy
Invest in nationwide campaigns to empower citizens, especially parents and children, to navigate the internet safely and responsibly.
Learn Lessons from Global Best Practices
Study frameworks such as the EU's Digital Services Act, which calibrates platform obligations to size and risk rather than imposing uniform content controls.
Conclusion
The proposed IT (Digital Code) Rules, 2026, need to shift from broad content policing to a strategy that respects digital rights. By adopting global best practices and using precise definitions, the government can create a safer digital space while protecting freedom of expression and innovation.
Source: INDIAN EXPRESS
PRACTICE QUESTION
Q. Discuss the challenges posed by User-Generated Content (UGC) and deepfakes in maintaining public order and morality in the digital age. (250 words)
The Draft IT (Digital Code) Rules, 2026, are a proposed legal framework by the Indian government to regulate online content, with a particular focus on obscenity. They aim to define permissible content, mandate age-based classification, and require platforms to implement access-control mechanisms.
The government's initiative is prompted by a Supreme Court directive to balance freedom of speech (Article 19(1)(a)) with reasonable restrictions (Article 19(2)). The primary goals are to protect vulnerable populations like children from harmful content, curb the spread of hate speech and misinformation, and fill a legal gap regarding accountability for harmful user-generated content.
The rules propose a mandatory classification system under which all digital content must carry an age-suitability rating. The proposed ratings are ‘U’ (Universal), ‘UA 7+’, ‘UA 13+’, ‘UA 16+’, and ‘A’ (Adult-only). Content must also carry labels indicating themes like violence, nudity, sex, and drugs.