India’s Digital Personal Data Protection Act (DPDPA), enacted in August 2023, marks a pivotal moment for artificial intelligence (AI) adoption across organisations. By embedding privacy into regulatory DNA, it reshapes how institutions design, deploy, and scale AI systems. It also paves the way for building and enhancing the trust ecosystem for AI adoption.
Key implications for AI adoption
Consent-centric data practices
Under the DPDPA, organisations (now “data fiduciaries”) must secure explicit, purpose-specific consent before collecting personal data, a tough challenge for AI models trained on large, diverse datasets. Securing granular consent for each data point dramatically increases operational complexity and creates the need for AI-ready, consent-compliant datasets to support the development, training, and adoption of AI models.
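As a rough illustration of what consent-gated data preparation could look like in practice, the Python sketch below filters records into a training set only where purpose-specific consent is active. The ConsentRecord structure, the "model_training" purpose label, and the build_training_set helper are hypothetical examples, not constructs prescribed by the Act.

```python
# Illustrative sketch: include a record in an AI training set only when the
# data principal holds active consent for that specific purpose.
# The data model and purpose labels are hypothetical.
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class ConsentRecord:
    principal_id: str
    purpose: str                          # e.g. "model_training"
    granted_at: datetime
    withdrawn_at: Optional[datetime] = None

    def is_active(self) -> bool:
        return self.withdrawn_at is None

def build_training_set(records: list[dict], consents: list[ConsentRecord],
                       purpose: str = "model_training") -> list[dict]:
    """Keep only records whose owner has active consent for the given purpose."""
    allowed = {c.principal_id for c in consents
               if c.purpose == purpose and c.is_active()}
    return [r for r in records if r["principal_id"] in allowed]
```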
Purpose limitation and data minimisation
The law mandates that data collection be limited to what is necessary for declared purposes. AI’s traditional reliance on broad, multi-use datasets must therefore be carefully aligned with these declared, limited purposes.
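A minimal sketch of field-level data minimisation, assuming a hypothetical mapping from declared purposes to the fields permitted for each, might look like this:

```python
# Illustrative sketch of data minimisation: retain only the fields declared
# as necessary for a given processing purpose. The purpose-to-field mapping
# is an example, not something prescribed by the DPDPA.
DECLARED_FIELDS = {
    "credit_scoring_model": {"income_band", "repayment_history"},
    "churn_prediction": {"tenure_months", "plan_type", "usage_minutes"},
}

def minimise(record: dict, purpose: str) -> dict:
    """Drop any field not declared as necessary for the stated purpose."""
    allowed = DECLARED_FIELDS.get(purpose, set())
    return {k: v for k, v in record.items() if k in allowed}
```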
Rights of data principals
The Act grants data principals the rights to access, correct, and erase their personal data, along with grievance redressal and nomination. For AI systems, honouring correction and erasure requests is far from trivial once data has fed a trained model, so pipelines need traceability from training datasets back to individual consent and rights requests.
Governance for significant data fiduciaries (SDFs)
Entities handling large volumes of sensitive or high-risk personal data may be notified as SDFs and must conduct Data Protection Impact Assessments (DPIAs) and independent audits, and establish robust technical safeguards. These obligations introduce disciplined governance frameworks that AI developers must integrate.
Data localisation and cross-border rules
The DPDPA may require certain datasets to remain within Indian borders, restricting the global data flows that multinational AI models rely on. Organisations must build data localisation strategies into their architectures or risk non-compliance.
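One simple enforcement pattern, sketched below under the assumption of an approved-region allowlist (the region codes and guard function are illustrative, and real enforcement would normally sit in infrastructure policy), is to reject writes of personal data destined for storage outside India:

```python
# Illustrative sketch: block writes of personal data to storage regions
# outside an approved (India-based) list.
APPROVED_REGIONS = {"ap-south-1", "ap-south-2"}   # example region codes

class LocalisationError(Exception):
    pass

def assert_localised(storage_region: str) -> None:
    """Raise if personal data is about to be written outside approved regions."""
    if storage_region not in APPROVED_REGIONS:
        raise LocalisationError(
            f"Personal data may not be written to region '{storage_region}'"
        )

# Usage: call before persisting any personal data
assert_localised("ap-south-1")   # passes; an out-of-list region would raise
```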
Privacy considerations for AI models
Anonymisation and pseudonymisation
To comply with minimisation and purpose-limitation requirements, AI systems must employ strong anonymisation techniques. Weak or incomplete anonymisation, however, leaves data open to re-identification, calling for state-of-the-art de-identification methods and rigorous assessment of residual risk.
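For pseudonymisation, one widely used approach is keyed hashing of direct identifiers. The sketch below uses HMAC-SHA-256 with a secret key held outside the dataset; the field names and key-management choice are illustrative assumptions.

```python
# Illustrative sketch: pseudonymise direct identifiers with a keyed hash
# (HMAC-SHA-256) so the mapping cannot be reversed without the secret key.
# Note: pseudonymised data is still personal data, and re-identification risk
# from quasi-identifiers (age, pincode, etc.) must be assessed separately.
import hashlib
import hmac
import os

# Fallback value is for illustration only; in production, load the key
# from a key management service rather than code or environment defaults.
SECRET_KEY = os.environ.get("PSEUDONYMISATION_KEY", "example-key-for-illustration").encode()

def pseudonymise(identifier: str) -> str:
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

record = {"email": "user@example.com", "age": 34, "pincode": "560001"}
record["email"] = pseudonymise(record["email"])
```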
Consent management systems
Integrating standardised consent management frameworks, as defined under the DPDPA rules, is essential, especially for tracking permissions across their lifecycle and ensuring that data principals’ rights requests are executed efficiently.
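A bare-bones illustration of such tracking, assuming a simplified in-memory store (the ConsentLedger class and event names are hypothetical), is an append-only audit trail from which the current consent state and evidence of rights execution can be derived:

```python
# Illustrative sketch of consent lifecycle tracking with an audit trail,
# so that grants, withdrawals, and rights requests can be evidenced.
from datetime import datetime, timezone

class ConsentLedger:
    def __init__(self):
        self._events = []   # append-only audit trail

    def record(self, principal_id: str, purpose: str, action: str) -> None:
        self._events.append({
            "principal_id": principal_id,
            "purpose": purpose,
            "action": action,          # "granted" or "withdrawn"
            "timestamp": datetime.now(timezone.utc).isoformat(),
        })

    def has_consent(self, principal_id: str, purpose: str) -> bool:
        latest = None
        for e in self._events:
            if e["principal_id"] == principal_id and e["purpose"] == purpose:
                latest = e["action"]
        return latest == "granted"

ledger = ConsentLedger()
ledger.record("dp-001", "model_training", "granted")
ledger.record("dp-001", "model_training", "withdrawn")
assert not ledger.has_consent("dp-001", "model_training")
```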
Privacy-enhancing technologies (PETs)
Techniques such as differential privacy, federated learning, and homomorphic encryption let models learn from data while limiting exposure of individual records, helping reconcile AI development with the DPDPA’s minimisation and purpose-limitation principles.
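As one example of a PET, the sketch below applies the Laplace mechanism from differential privacy to a simple aggregate; the epsilon value and the query shown are illustrative, not recommendations.

```python
# Illustrative sketch: the Laplace mechanism from differential privacy.
# Calibrated noise is added to an aggregate statistic so that the presence
# or absence of any single record has limited effect on the output.
import numpy as np

def dp_count(values: list[float], epsilon: float = 1.0) -> float:
    """Differentially private count; the sensitivity of a count query is 1."""
    sensitivity = 1.0
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return len(values) + noise

ages = [34, 41, 29, 52, 38]
print(dp_count(ages, epsilon=0.5))   # noisy count, masking any individual's presence
```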
Building trust through compliance
Approached this way, compliance becomes more than a legal obligation: by operationalising consent, minimisation, governance, and PETs, organisations can demonstrate responsible data handling and strengthen the trust ecosystem on which sustainable AI adoption depends.
(Nitin Shah, Partner – Digital Trust and Head – Cyber Security, Resilience and Privacy GRC, KPMG in India)
(Shikha Kamboj, Partner – Digital Trust and National Leader – Data Privacy and Ethics, KPMG in India)
(Disclaimer: These are the personal opinions of the writer. They do not reflect the views of www.business-standard.com or the Business Standard newspaper)
