India’s Digital Personal Data Protection (DPDP) regime has moved from “law on paper” to an operational compliance roadmap. The government notified the DPDP Rules, 2025 on 14 November 2025, alongside an implementation approach that staggers obligations over an 18-month window rather than forcing an immediate, all-at-once switch.
For AI-heavy businesses, the practical implication is clear: DPDP is not an “AI law,” but it directly governs the personal-data supply chains that power recommendation engines, behavioural analytics, and model training pipelines.
1) Implementation Snapshot: What Is Live, What’s Next
Rollout explainers map implementation into three phases: Phase 1 (mid-November 2025) for institutional setup and foundational operationalisation; Phase 2 (November 2026), focused on Consent Manager registration and oversight; and Phase 3 (May 2027), when the broader obligations are expected to become fully effective.
Importantly, the government has also signalled flexibility: Electronics and IT Minister Ashwini Vaishnaw has said the Centre is open to shortening the 18-month compliance timeline, subject to stakeholder discussions.
2) What DPDP Changes for AI: Lawful Basis, Purpose, and Retention
DPDP applies to digital personal data and builds enforceable expectations around: (a) lawful processing, (b) clear purpose specification, (c) retention discipline, and (d) security safeguards. The Act also requires breach intimation to the Data Protection Board of India and to affected individuals in the prescribed manner.
For AI programmes, this elevates “data strategy” into a compliance artefact. Personal data used to train or fine-tune models must be governed through documented purpose, appropriate notice/consent (or other permitted basis where applicable), and defensible retention periods—especially for logs and behavioural datasets that tend to sprawl.
3) Model Training Debate: Innovation vs Feasibility
A key policy tension is whether AI developers can use publicly available personal data at scale without incurring an unmanageable compliance burden. The Internet and Mobile Association of India (IAMAI) has urged the government to consider temporary exemptions/relaxations when personal data is processed solely for training or fine-tuning AI models, citing ambiguity and operational impracticality.
This debate matters for India’s AI ecosystem because it influences how companies design dataset acquisition, filtering, provenance documentation, and downstream risk controls.
4) Significant Data Fiduciaries: Higher Governance Expectations
DPDP introduces a category of Significant Data Fiduciaries (SDFs), which may be subject to additional governance obligations such as appointing a Data Protection Officer, undergoing independent audits, and conducting periodic impact assessments—requirements that are particularly relevant for large AI platforms operating at scale.
5) Compliance Playbook for AI Teams: What to Build Now
AI organisations can treat DPDP readiness as a build plan:
- Map data flows (collection → storage → training → inference → logging → sharing).
- Separate personal vs non-personal datasets and apply minimisation.
- Stand up consent/notice operations with audit trails (and prepare for Consent Manager interoperability).
- Implement security safeguards and incident response consistent with breach-intimation duties.
- Prepare to service access/erasure-type requests; the Act ties erasure to consent withdrawal unless retention is required by law.
6) Cross-Border Transfers: Monitor the “Negative List” Model
DPDP adopts a government-notified restriction approach for cross-border transfers—often described as a “negative list” framework—meaning transfers may be permitted unless the government restricts specified countries/territories/entities by notification.
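The negative-list model reduces to a simple default-allow check. The sketch below uses placeholder destination codes; the actual restricted list would come from government notification, not a hard-coded set, and a production system would refresh it from an authoritative source.

```python
# Placeholder codes only: real restrictions arrive by government
# notification and must be sourced from official publications.
RESTRICTED_DESTINATIONS = {"XX", "YY"}

def transfer_permitted(destination: str) -> bool:
    """Negative-list model: a cross-border transfer is allowed unless
    the destination has been restricted by notification."""
    return destination not in RESTRICTED_DESTINATIONS

print(transfer_permitted("XX"))  # False: on the notified restriction list
```

The default-allow posture is what distinguishes this from an "adequacy"/whitelist model, where transfers are blocked unless a destination is expressly approved.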
Why This Matters: Penalties and Trust
DPDP introduces material financial exposure. The official framework highlights penalties up to ₹250 crore for failure to maintain reasonable security safeguards, with other large penalties for breach-related and child-data obligations.
For AI businesses, early compliance is not only risk mitigation—it is also a trust signal to users, enterprise customers, and regulators in a market moving quickly toward “privacy-by-design” expectations.
By Charu Mandhyan

