
IT Rules 2026: AI Content & Platform Liability

The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 were issued by the Central Government under Section 87 of the Information Technology Act, 2000 and superseded the Information Technology (Intermediaries Guidelines) Rules, 2011. These rules establish a comprehensive statutory framework governing intermediaries, publishers of digital media, and online gaming platforms.

As amended up to 10 February 2026, the Rules introduce significant regulatory changes, particularly in relation to synthetically generated information, platform accountability, and accelerated content removal timelines. They apply to social media intermediaries, significant social media intermediaries, publishers of news and current affairs content, publishers of online curated content, and online gaming intermediaries operating within India or providing services accessible in India.

The framework operates in conjunction with Section 79 of the Information Technology Act, 2000, which grants conditional safe harbour protection to intermediaries. The Rules prescribe mandatory due diligence obligations; failure to comply results in the intermediary losing statutory immunity and becoming liable under applicable law, including penal provisions. The 2026 amendments further clarify that synthetically generated information and AI-generated content fall within the scope of “information” used to commit unlawful acts, thereby expanding intermediary liability for emerging technologies.

Statutory Basis and Scope of the 2026 Framework

The Rules define digital media as digitised content transmitted over computer networks and processed by intermediaries or publishers, including news portals, aggregators, and online curated platforms. The 2026 amendment introduces the definition of “synthetically generated information”, which refers to audio, visual, or audio-visual content artificially created or modified using computer resources in a manner that appears authentic or real.

This definition specifically captures AI-generated content, deepfakes, and algorithmically manipulated media. The Rules also clarify that routine editing, formatting, translation, or accessibility enhancements that do not materially alter meaning will not qualify as synthetically generated information.

The Rules further define key regulated entities including social media intermediaries, significant social media intermediaries based on user thresholds, publishers of online curated content, publishers of news and current affairs content, and online gaming intermediaries. The 2026 amendments also incorporate definitions relating to online games, permissible online real money games, online gaming self-regulatory bodies, and permissible online games verified under the regulatory framework. These definitions expand the regulatory perimeter of the IT Rules to include digital gaming platforms and AI-based content ecosystems.

Due Diligence Obligations Under Rule 3

Rule 3 imposes statutory due diligence requirements that intermediaries must observe while discharging their duties under the Information Technology Act. Intermediaries must prominently publish their rules, privacy policies, and user agreements in English or in any language specified in the Eighth Schedule to the Constitution, allowing users to access the terms in their preferred language.

Intermediaries are required to inform users not to host, upload, publish, transmit, or share content that is unlawful, misleading, or harmful to children, that infringes intellectual property rights, or that threatens public order or the sovereignty and security of India. The Rules specifically cover misinformation and content identified as fake or misleading by authorised government fact-checking units.

The 2026 amendment strengthens intermediary obligations by requiring periodic communication to users regarding consequences of non-compliance, including suspension of accounts and potential liability under applicable laws. The Rules also require intermediaries to preserve removed content and associated records for at least 180 days for investigative purposes and to provide information to authorised government agencies within seventy-two hours, or within shorter timelines for specific categories such as online gaming intermediaries.
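
The preservation and disclosure windows are concrete enough to encode directly in a compliance system. The following is a minimal sketch in Python of how a platform might track them; the record structure and field names are illustrative assumptions, not anything prescribed by the Rules.

    from dataclasses import dataclass
    from datetime import datetime, timedelta, timezone

    RETENTION_PERIOD = timedelta(days=180)   # minimum preservation window
    DISCLOSURE_WINDOW = timedelta(hours=72)  # default response time for agency requests

    @dataclass
    class RemovedContentRecord:
        """Illustrative record for content removed under the Rules."""
        content_id: str
        removed_at: datetime

        def earliest_purge_date(self) -> datetime:
            # Records must be preserved for at least 180 days; purging
            # earlier would breach the due diligence obligation.
            return self.removed_at + RETENTION_PERIOD

        def disclosure_deadline(self, request_received_at: datetime) -> datetime:
            # Information must reach the authorised agency within 72 hours
            # (shorter windows apply to categories such as online gaming).
            return request_received_at + DISCLOSURE_WINDOW

    record = RemovedContentRecord("post-123", datetime(2026, 3, 1, tzinfo=timezone.utc))
    print(record.earliest_purge_date())  # 2026-08-28 00:00:00+00:00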

Intermediaries must also adopt reasonable security practices under the Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011.

2026 Amendment: Regulation of Synthetically Generated Information

The 2026 amendment introduces a new due diligence framework specifically addressing synthetically generated information. Where intermediaries provide tools enabling creation or dissemination of such information, they must deploy reasonable and appropriate technical measures to prevent users from generating unlawful synthetic content.

The Rules prohibit synthetic content that constitutes child sexual abuse material, non-consensual intimate imagery, false documents, or deceptive impersonation, or that facilitates illegal activities, including the procurement of arms or explosives. Intermediaries must label synthetic content clearly and prominently so that users can identify it as artificially generated or altered.

The amendment also mandates embedding permanent metadata or technical provenance identifiers within synthetic content, to the extent technically feasible. This requirement is intended to ensure traceability of AI-generated content and to deter misuse. Intermediaries are prohibited from enabling the removal or suppression of such labels or metadata.
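
The Rules do not prescribe a particular metadata format, and production systems would likely rely on a tamper-resistant provenance standard such as C2PA. Purely as an illustration of the concept, the Python sketch below writes a provenance marker into a PNG's text metadata using Pillow; the key names are assumptions for the example, not a statutory schema.

    from PIL import Image
    from PIL.PngImagePlugin import PngInfo

    def embed_provenance(src_path: str, dst_path: str, generator: str) -> None:
        """Embed an AI-provenance marker in PNG text metadata.

        Illustrative only: plain text chunks can be stripped trivially,
        which is precisely why the Rules prohibit enabling removal of
        such labels and why signed manifests would be preferred.
        """
        img = Image.open(src_path)
        meta = PngInfo()
        meta.add_text("synthetic-content", "true")  # hypothetical key
        meta.add_text("generator", generator)       # hypothetical key
        img.save(dst_path, pnginfo=meta)

    embed_provenance("output.png", "output_labelled.png", "example-model-v1")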

Where unlawful synthetic content is identified, intermediaries must take expeditious action including removal, disabling access, suspension of user accounts, and disclosure of violating user identity where legally required. These provisions create a statutory regulatory structure for deepfakes and AI-generated media.

Expedited Takedown Requirement: Three-Hour Compliance

The 2026 amendment revises Rule 3(1)(d) and significantly shortens the content removal timeline. Intermediaries must remove or disable access to unlawful information within three hours of receiving actual knowledge. Such knowledge may arise through an order of a court of competent jurisdiction or through a reasoned written intimation issued by an authorised government officer not below the rank prescribed in the Rules. The direction must specify the legal basis, the statutory provision invoked, and the precise electronic location of the content to be removed.
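
Operationally, a three-hour window leaves little room for manual review, so platforms would likely validate incoming directions and compute the deadline automatically. A minimal sketch, with hypothetical field names standing in for the contents of a direction:

    from dataclasses import dataclass
    from datetime import datetime, timedelta

    TAKEDOWN_WINDOW = timedelta(hours=3)

    @dataclass
    class TakedownDirection:
        """Illustrative model of a direction under revised Rule 3(1)(d)."""
        legal_basis: str   # statutory provision invoked
        content_url: str   # precise electronic location of the content
        received_at: datetime

        def is_actionable(self) -> bool:
            # A valid direction must specify both the legal basis and
            # the exact location of the content to be removed.
            return bool(self.legal_basis and self.content_url)

        def removal_deadline(self) -> datetime:
            return self.received_at + TAKEDOWN_WINDOW

    direction = TakedownDirection(
        legal_basis="Rule 3(1)(d), IT Rules",
        content_url="https://example.com/post/456",
        received_at=datetime(2026, 3, 1, 10, 0),
    )
    print(direction.removal_deadline())  # 2026-03-01 13:00:00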

The amendment also provides for periodic review of such takedown directions by senior government officials to ensure necessity and proportionality. This provision reflects a shift toward real-time content moderation obligations and significantly increases compliance burdens on intermediaries. Failure to comply within the prescribed three-hour period may result in loss of safe harbour protection and potential liability under applicable laws.

Grievance Redressal Framework and User Protection

Rule 3(2) establishes a statutory grievance redressal mechanism. Intermediaries must publish the name and contact details of the grievance officer and provide a mechanism for complaints. The grievance officer must acknowledge complaints within twenty-four hours and resolve them within seven days. Complaints relating to removal of unlawful content must be addressed within thirty-six hours. In cases involving nudity, impersonation, or sexual content, intermediaries must remove or disable access within two hours of receiving a complaint from the affected individual.
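
Because these timelines differ by complaint category, a platform's ticketing system would typically encode them as a routing table. A sketch in Python, with category labels invented for illustration:

    from datetime import timedelta

    # Resolution windows under Rule 3(2); the keys are illustrative labels.
    GRIEVANCE_SLA = {
        "acknowledgement": timedelta(hours=24),   # acknowledge any complaint
        "general": timedelta(days=7),             # resolve ordinary complaints
        "unlawful_content": timedelta(hours=36),  # removal-related complaints
        "intimate_imagery": timedelta(hours=2),   # nudity, impersonation, sexual content
    }

    def resolution_window(category: str) -> timedelta:
        # Unrecognised categories fall back to the general seven-day window.
        return GRIEVANCE_SLA.get(category, GRIEVANCE_SLA["general"])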

The Rules also require intermediaries to implement a complaint tracking mechanism and provide reasons for action taken. The grievance framework extends to violations involving synthetically generated information, and intermediaries must take appropriate action including suspension of accounts and disclosure of identity where legally permissible. These provisions create enforceable user rights against intermediaries and establish structured accountability mechanisms.

Grievance Appellate Committee (GAC)

Rule 3A introduces Grievance Appellate Committees constituted by the Central Government. Any person aggrieved by the decision of the grievance officer or by failure to resolve a complaint within prescribed timelines may file an appeal within thirty days.

The Committee is required to resolve appeals within thirty calendar days and may conduct proceedings through an online dispute resolution mechanism. Orders of the Committee are binding on intermediaries, who must comply and upload compliance reports on their websites. This appellate structure introduces an additional layer of statutory oversight over intermediary decisions.

Additional Obligations for Significant Social Media Intermediaries

Rule 4 imposes enhanced obligations on significant social media intermediaries. These intermediaries must appoint a Chief Compliance Officer responsible for ensuring compliance with the Act and Rules. They must also appoint a nodal contact person for coordination with law enforcement agencies and a Resident Grievance Officer for user complaints. Significant intermediaries are required to publish monthly compliance reports detailing complaints received and action taken.

The Rules further require significant intermediaries providing messaging services to enable identification of the first originator of information upon lawful order issued under applicable legal provisions. Such orders may be issued for offences relating to sovereignty, security of the State, public order, rape, or child sexual abuse material. The Rules clarify that intermediaries are not required to disclose message contents while complying with traceability requirements.
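
The Rules do not specify how first-originator identification should be achieved on end-to-end encrypted services. One approach aired in the public traceability debate is to retain a fingerprint (hash) of each message together with the first sender, so a lawful order can be answered without storing readable message contents. A simplified sketch of that idea, not a description of any platform's actual design:

    import hashlib

    # Maps message fingerprint -> first known sender; illustrative only.
    first_originator: dict[str, str] = {}

    def fingerprint(message_bytes: bytes) -> str:
        # A plain SHA-256 digest; real proposals discuss salted or signed
        # variants to resist dictionary attacks on short messages.
        return hashlib.sha256(message_bytes).hexdigest()

    def record_send(sender_id: str, message_bytes: bytes) -> None:
        # Only the first sender of a given message is retained, mirroring
        # the "first originator" concept in the Rules.
        first_originator.setdefault(fingerprint(message_bytes), sender_id)

    def trace(message_bytes: bytes) -> str | None:
        # Answers a traceability order without exposing message contents.
        return first_originator.get(fingerprint(message_bytes))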

Mandatory AI Disclosure Requirement for Platforms (2026)

The 2026 amendment inserts additional obligations requiring significant social media intermediaries to ensure disclosure of synthetically generated information prior to publication. Users must declare whether content is synthetically generated.

Intermediaries must deploy technical measures to verify such declarations and prominently label the content where synthetic generation is confirmed. Where intermediaries knowingly permit unlabelled synthetic content, they are deemed to have failed in their due diligence obligations under the Rules. This provision directly regulates the dissemination of AI-generated content on social media platforms.
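
In implementation terms this amounts to a two-signal check: the user's declaration and the platform's own detection. A minimal sketch, assuming a hypothetical detector score rather than any particular classifier:

    def label_decision(user_declared_synthetic: bool,
                       detector_score: float,
                       threshold: float = 0.9) -> str:
        """Decide the labelling action for a piece of uploaded content.

        The detector and threshold are illustrative assumptions; the
        Rules require verification measures but do not prescribe them.
        """
        detected = detector_score >= threshold
        if user_declared_synthetic or detected:
            if detected and not user_declared_synthetic:
                # Undeclared synthetic content is labelled and flagged,
                # since knowingly permitting unlabelled synthetic content
                # is treated as a due diligence failure.
                return "label_as_synthetic+flag_for_review"
            return "label_as_synthetic"
        return "no_label"

    print(label_decision(user_declared_synthetic=False, detector_score=0.97))
    # -> label_as_synthetic+flag_for_review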

Online Gaming Regulation Framework

The Rules introduce a regulatory regime for online gaming intermediaries, particularly those offering online real money games. Online gaming self-regulatory bodies designated by the Ministry of Electronics and Information Technology may verify games as permissible online real money games. Such bodies must be companies registered under Section 8 of the Companies Act, 2013, with independent governance and expertise in gaming, public policy, technology, and user protection. Verified games must not involve wagering and must comply with due diligence requirements.

Online gaming intermediaries must verify user identity before accepting deposits, disclose withdrawal policies, and provide safeguards against financial loss and addiction. The Rules prohibit financing of gaming participation through credit. Verified games must display visible verification marks and self-regulatory bodies must maintain updated lists of permissible games. These provisions create a structured regulatory framework for real money gaming platforms.

Digital Media Code of Ethics

Part III of the Rules applies to publishers of news and current affairs content and publishers of online curated content. The Rules mandate observance of a Code of Ethics and establish a three-tier grievance redressal mechanism consisting of self-regulation by publishers, self-regulation by industry bodies, and oversight by the Central Government. Publishers must appoint grievance officers based in India, acknowledge complaints within twenty-four hours, and decide complaints within fifteen days. Appeals may be filed before self-regulatory bodies and thereafter before the government oversight mechanism.

The Rules apply to publishers operating in India or conducting systematic business activity making content available in India. These provisions extend regulatory supervision to digital news platforms and OTT services.

Loss of Safe Harbour Protection

Rule 7 provides that failure to observe the due diligence obligations results in loss of safe harbour under Section 79 of the Information Technology Act. In such cases, intermediaries may be liable for punishment under applicable law, including the Bharatiya Nyaya Sanhita, 2023. The Rules therefore convert compliance obligations into enforceable statutory duties.

Conclusion

The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, as updated in 2026, create an expanded regulatory framework governing intermediaries, AI-generated content, online gaming platforms, and digital media publishers. The regulation of synthetically generated information, the three-hour takedown requirement, strengthened grievance mechanisms, and enhanced due diligence obligations significantly increase platform accountability.

The Rules operate as a statutory compliance regime under the Information Technology Act, 2000 and failure to comply removes safe harbour protection, exposing intermediaries to civil and criminal liability under Indian law.

The expanded due diligence and AI liability framework operates alongside MeitY's broader statutory data protection enforcement regime, reinforcing platform accountability under the updated IT Rules, 2026.


