
The Ministry of Electronics and Information Technology (MeitY) released the draft Digital Personal Data Protection (DPDP) Rules, 2025 in January, inviting public feedback by 5 March. As part of this consultation process, MeitY hosted a consultative session on 18 February in New Delhi, which saw participation from key stakeholders, including the Observer Research Foundation (ORF). ORF actively engaged in the discussions and later submitted detailed recommendations.
The DPDP Rules aim to operationalise the Digital Personal Data Protection Act, 2023, which balances individuals' right to data privacy with the need for lawful data processing. The draft rules focus on clarifying the procedures, accountability mechanisms, and safeguards essential for implementing India's digital data governance framework. However, ORF raised the following concerns about the draft Rules:
1. Registration and Obligations of Consent Managers
The Digital Personal Data Protection (DPDP) Rules, 2025 outline the framework for the registration and duties of Consent Managers under Rule 4 and the First Schedule. Consent Managers are required to facilitate seamless consent collection, review, and withdrawal for Data Principals, while ensuring robust data security and accurate record-keeping.
However, the criteria for registration—such as “sufficient capacity,” “general character of management,” and “reputation of key personnel”—remain vague, granting wide discretion to the Data Protection Board without clear guidelines. The Rules also mandate independent certification for technical compliance and interoperability but fail to specify who conducts it or what standards apply. Furthermore, terms like “reasonable security safeguards” lack definition, raising concerns about clarity and implementation.
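The Rules prescribe no technical format for these duties. Purely as an illustrative sketch of the record-keeping, review, and withdrawal obligations described above, a Consent Manager's internal ledger might resemble the following; all class names and fields here are hypothetical and are not drawn from the Rules:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List

# Illustrative only: the Rules prescribe no technical schema for consent records.

@dataclass
class ConsentEvent:
    action: str     # "granted", "reviewed" or "withdrawn"
    purpose: str    # the processing purpose the consent covers
    timestamp: str  # ISO 8601 time of the event

@dataclass
class ConsentRecord:
    data_principal_id: str   # pseudonymous identifier
    data_fiduciary: str      # entity seeking the consent
    events: List[ConsentEvent] = field(default_factory=list)

    def _log(self, action: str, purpose: str) -> None:
        self.events.append(
            ConsentEvent(action, purpose, datetime.now(timezone.utc).isoformat()))

    def grant(self, purpose: str) -> None:
        self._log("granted", purpose)

    def withdraw(self, purpose: str) -> None:
        # Withdrawal is meant to be as easy as the original grant
        self._log("withdrawn", purpose)

    def review(self) -> List[ConsentEvent]:
        # Gives the Data Principal a full view of consents given and withdrawn
        return list(self.events)
```

Even a record this simple runs straight into the questions the Rules leave open: what counts as "reasonable security safeguards" for storing it, and who certifies that such a system is technically compliant and interoperable.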
2. Verifiable Consent for Processing of Personal Data of a Child or of Person with Disability Who Has Lawful Guardian
Section 9 of the DPDP Act mandates that Data Fiduciaries obtain verifiable consent from parents or lawful guardians before processing the personal data of children or persons with disabilities, and Rule 10 elaborates on this process. While intended to be protective, these measures raise practical concerns.
For children, especially in rural households with shared devices, age verification and consent become difficult. Children may self-declare their age inaccurately, creating loopholes and potentially widening the digital divide. Requiring government-issued identity for consent introduces privacy risks for both children and parents.
For persons with disabilities, the rules fail to distinguish between substituted and supported decision-making models. While the RPwD Act supports autonomy, the rules imply a one-size-fits-all approach, potentially undermining individuals’ capacity for independent consent. To ensure genuine inclusion, the Rules must better reflect the diversity of abilities, clarify consent mechanisms, and safeguard individual autonomy and privacy.
3. Localisation of Certain Kinds of Data
Section 10 of the DPDP Act empowers the Central Government to restrict cross-border transfer of personal data by notifying specific countries. While the Act supports free data flow unless a country is blacklisted, the Draft Rules exceed this mandate. Rule 12 imposes additional data localisation requirements on Significant Data Fiduciaries (SDFs), mandating that certain personal and traffic data be processed only within India. This oversteps the Act’s scope, violating principles of delegated legislation, which cannot override or extend the parent law. The rule lacks clarity on the composition and authority of the recommending committee, raising concerns of arbitrary implementation. Moreover, forced data localisation could increase compliance costs, burden MSMEs, deter foreign investment, and hinder innovation in data-driven sectors like AI and cloud computing. The absence of clear guidelines and categories of restricted data may also create business uncertainty and weaken trust in India’s data protection regime.
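The practical effect of the undefined categories can be seen in the kind of transfer gate an SDF would have to build. The snippet below is a minimal sketch under assumed category labels and region codes; neither the restricted categories nor any such mechanism is actually specified in the Draft Rules:

```python
# Hypothetical localisation gate for a Significant Data Fiduciary.
# The Draft Rules do not yet notify which categories are restricted,
# so the labels below are assumptions made for illustration.

RESTRICTED_CATEGORIES = {"notified_personal_data", "traffic_data"}

def can_transfer(category: str, destination_region: str) -> bool:
    """Return True if data of this category may be processed in the given region."""
    if category in RESTRICTED_CATEGORIES:
        return destination_region == "IN"  # restricted data must stay within India
    return True                            # otherwise, free flow as the Act intends

# A cloud region outside India is blocked for restricted data only.
assert can_transfer("traffic_data", "eu-west-1") is False
assert can_transfer("other_personal_data", "eu-west-1") is True
```

Until the restricted categories are notified, every such gate rests on guesswork, which is the source of the compliance-cost and business-uncertainty concerns noted above.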
4. Due Diligence for Algorithmic Software
Rule 12 mandates that Significant Data Fiduciaries (SDFs) ensure algorithmic software used for processing personal data does not risk Data Principals’ rights. However, the rule lacks clarity on what constitutes “due diligence,” leaving SDFs uncertain about compliance standards. Additionally, the term “algorithmic software” is overly broad, potentially covering all digital tools, including AI systems. This ambiguity calls for clearer definitions and guidance to avoid overreach and ensure effective implementation.
5. Functioning of Board as Digital Office
Section 28 of the Act states that the Data Protection Board will operate as a digital office, using prescribed techno-legal measures. While Section 2(m) defines a digital office as one where all proceedings—from complaints to appeals—occur online, the Act expects the Rules to detail the required measures. However, the Rules merely state that proceedings may occur without physical presence, offering no clarity on the specific techno-legal tools or safeguards the Board must implement.