AI Series – Part I

AI-Generated Content, Deepfakes and Platform Liability: How Global Platforms Can Navigate India’s New Rules

Date and Time:

Wednesday, 25 March, 2026
(07:00 PM – 08:00 PM IST)

 

INTRODUCTION

The Government of India has introduced amendments to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (“IT Rules”), bringing synthetically generated content within the regulatory framework for the first time.

The amendments, effective from 20 February 2026, impose due diligence obligations on platforms that:

  • Host, or allow users to share or publish AI-generated or AI-modified content; or

  • Deploy AI-powered tools to generate or modify content.

Platforms are required to implement technical measures to detect hosted AI-generated content, obtain user declarations at the point of upload, prominently label AI-generated content, embed metadata or provenance markers, and comply with accelerated takedown timelines for unlawful content.

These developments have important implications for platforms deploying AI-powered tools, including social media platforms, AI content generation tools, marketplaces and other digital services. Open questions also remain regarding the application and scope of the due diligence obligations and their interplay with safe harbour protection under the Information Technology Act, 2000.

In this webinar, we will unpack the new regulatory framework governing AI-generated content and discuss practical strategies for platforms and AI tool providers to navigate compliance under the amended IT Rules.

Key Discussion Themes:

  • Understanding “Synthetically Generated Information”: Scope of the new definition, the types of AI-generated or AI-modified content covered and evaluating the threshold for when content is “likely to be perceived as indistinguishable from a natural person or a real-world event”.
  • Who is impacted? Applicability to intermediaries, social media platforms and platforms deploying AI content generation or modification tools, and the potential overreach to non-intermediary platforms.
  • Platform Obligations Under the Amended Rules: Key due diligence obligations including content labelling, metadata embedding, provenance mechanisms and user disclosures.
  • Operationalizing Compliance: Designing platform governance frameworks, content moderation workflows and product design changes to operationalize the labelling and detection requirements.
  • Accelerated Takedown Timelines and Enforcement Risk: Navigating the significantly reduced response timelines, including removal of certain categories of content within 2 hours of receiving a complaint, compliance with lawful takedown requests within 3 hours, and resolution of certain user grievances within 36 hours of receipt.
  • Safe Harbour and Liability Exposure: Understanding how compliance with the amended due diligence obligations interacts with intermediary safe harbour protections.

Flow of the session

07:00 PM to 07:45 PM (IST)

Focused Discussion

07:45 PM to 08:00 PM (IST)

Audience Q&A

Speakers

Rakesh Maheshwari

Former Sr. Director and Group Coordinator (Cyber Law, Cyber Security & Data Governance)

Ministry of Electronics and Information Technology (MeitY)

 

Vaibhav Parikh

Lead – Corporate Transactions and Technology Practice

Nishith Desai Associates

 

Aaron Kamath

Lead – Tech, Digital Media and Commercial Law Practice

Nishith Desai Associates

 

Prerana Reddy

Member, Technology Law Practice

Nishith Desai Associates

 

Tanishq Gupta

Member, Technology Law Practice

Nishith Desai Associates

 

RECOMMENDED READING

AI-Generated Content and Combating Deepfakes: What India’s New Rules Mean for Global Platforms