On February 10, 2026, the Ministry of Electronics and Information Technology (MeitY) notified the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules, 2026. These amendments expand India’s intermediary regulatory framework, particularly to address risks arising from synthetically generated information (SGI), including AI-generated and manipulated multimedia content, and impose stronger compliance mandates on digital intermediaries. The amended Rules come into force on 20 February 2026.
Applicability: Who Do the 2026 Rules Affect?
The amended Rules apply to intermediaries — broadly defined to include social media platforms, messaging services, online gaming intermediaries, and significant social media intermediaries (SSMIs). These entities facilitate the hosting, publishing or distribution of third-party content and must therefore comply with the amended due diligence and content moderation obligations.
Highlights of the Amendment:
1. Regulatory Recognition of Synthetic Content
For the first time, the Rules explicitly define “synthetically generated information” in Rule 2(wa) as AI-created or AI-altered audio, visual, or audio-visual content that appears real or is indistinguishable from genuine media.
- This formal recognition brings such content within the scope of binding compliance obligations, rather than leaving it to be treated as an informal or policy matter.
- Neutral or benign edits (e.g., accessibility improvements or simple enhancements) are exempt where done in good faith.
2. Mandatory Labelling & Metadata Requirements
Intermediaries must ensure that synthetic or AI-generated content is visibly labelled and carries embedded metadata (Rule 3(ii)) clearly indicating its synthetic origin.
- SSMIs must attempt to verify user declarations about content being SGI and ensure that such content is labelled before publication.
3. Shortened Takedown & Grievance Timelines
One of the most impactful changes is the drastic reduction of timelines for platform action:
- Platforms must generally comply with takedown or disabling notices within 3 hours, down from the earlier 24–36-hour window.
- Sensitive content such as impersonation, deep fakes and non-consensual imagery may require action in as little as 2 hours.
- Grievance redressal timelines have dropped from 15 days to 7 days.
4. Expanded Due Diligence & Quarterly User Notices
Intermediaries are now required to periodically (at least every three months) inform users in clear language about:
- Consequences of breaking platform rules;
- Legal liabilities under Indian law; and
- Mandatory reporting of specific unlawful content to authorities where applicable.
5. Safe Harbour & Liability Protection
Compliance with the amended rules is tied to platforms retaining safe harbour protection under the IT Act. If intermediaries fail to meet their obligations — including labelling, verification and timely takedown — they risk losing this immunity from liability for user-generated content.
The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules, 2026 represent a decisive and forward-looking effort by the Indian government to bring emerging digital harms — particularly AI-generated deep fakes and manipulated media — within a legal compliance framework. By introducing explicit definitions, stricter labelling, and faster takedown requirements, the Rules aim to curb misinformation, privacy violations and impersonation, and to increase accountability among digital intermediaries.
Disclaimer: This is an effort by Lexcomply.com to contribute towards improving the compliance management regime. Users are advised not to construe this service as legal opinion and should seek the views of subject-matter experts.