DPDPA Section 9: Children's Data

Abhi Anand
1 January 2026
7 min read

Section 9 - Protecting Children in the Digital Age

Section 9 of the DPDPA establishes a dedicated and heightened framework for the processing of personal data belonging to children. In an era where children are active digital participants - using social media, online gaming platforms, educational technology, and streaming services - the risks of unregulated data collection are profound. Children cannot fully comprehend the implications of sharing their personal data, making them particularly vulnerable to exploitation, manipulation, and privacy violations. The DPDPA recognises this vulnerability by imposing stricter obligations on Data Fiduciaries who process children's data. These obligations go beyond the general consent framework of Section 6, requiring verifiable parental consent, prohibiting certain categories of processing outright, and mandating age verification mechanisms. The penalties for non-compliance are severe, reflecting the legislature's intent to create a strong deterrent against the misuse of children's data. Every organisation that offers digital services accessible to children - whether or not those services are specifically designed for children - must understand and implement Section 9's requirements from the date the Act's provisions come into force.

Definition of a Child Under DPDPA

The DPDPA defines a 'child' as any individual who has not completed the age of eighteen years. This threshold is consistent with India's existing legal framework, including the Indian Majority Act, 1875, which sets the age of majority at eighteen. However, it is notably higher than the thresholds in several other jurisdictions - the EU's GDPR allows member states to set the age anywhere between thirteen and sixteen, while the US COPPA applies only to children under thirteen. The eighteen-year threshold means that a significantly larger population falls within Section 9's protective scope in India. Practically, this means that platforms used by teenagers - social media networks, gaming platforms, dating apps, e-commerce sites - must treat all users under eighteen as children for data protection purposes. This has substantial compliance implications, particularly for platforms that previously applied parental consent requirements only for users under thirteen or fourteen. Organisations must review their age gates, registration flows, and consent mechanisms to ensure they capture and protect the full range of users who qualify as children under Indian law.

  • A child is defined as any individual below eighteen years of age
  • This threshold is higher than GDPR (13-16) and US COPPA (under 13)
  • All digital services accessible to minors must comply with Section 9
  • Organisations must update age gates and consent flows to reflect the eighteen-year threshold

Verifiable Parental Consent - The Core Requirement

Section 9(1) mandates that before processing any personal data of a child, a Data Fiduciary must obtain verifiable consent from the child's parent or lawful guardian. This is not ordinary consent - it must be 'verifiable', meaning the Data Fiduciary must take reasonable steps to confirm that the person providing consent is indeed the child's parent or lawful guardian, and that they have actually provided informed consent. The Act does not prescribe specific verification methods, leaving this to be determined by the rules and industry best practices. However, potential mechanisms include requiring the parent to authenticate via government-issued ID (such as Aadhaar or PAN verification), using credit card verification as a proxy for adult status, sending a confirmation to the parent's registered email or mobile number, requiring the parent to sign a consent form digitally, or implementing video verification. The standard is 'reasonable' verification - not absolute certainty. What is reasonable will depend on the context, including the sensitivity of the data being processed, the risks involved, and the age of the child. Processing children's data without valid verifiable parental consent is a violation that can attract penalties of up to two hundred crore rupees under Section 33.

  • Consent must come from a verified parent or lawful guardian - not the child
  • The consent must be verifiable through reasonable mechanisms
  • Potential methods include Aadhaar/PAN verification, credit card checks, digital signatures, or OTP confirmation
  • Processing without valid parental consent attracts penalties up to two hundred crore rupees
  • The verification standard scales with the sensitivity of data and risk to the child

Prohibition on Tracking, Behavioural Monitoring, and Targeted Advertising

Section 9(3) imposes an absolute prohibition on certain categories of processing when directed at children. No Data Fiduciary shall undertake tracking or behavioural monitoring of children, or direct targeted advertising at children. This is one of the most significant provisions in Section 9 because it is a blanket prohibition - not a consent-based restriction. Even with verifiable parental consent, a Data Fiduciary cannot track a child's online behaviour for profiling purposes or direct targeted advertisements at them. This prohibition reflects growing global concern about the impact of behavioural targeting on children's mental health, autonomy, and development. Targeted advertising relies on creating detailed profiles of users' interests, behaviours, and vulnerabilities - practices that are particularly harmful when applied to children who lack the cognitive maturity to recognise or resist manipulative content. For organisations that rely on advertising revenue, this prohibition requires fundamental changes to how they serve ads to users identified as children. Contextual advertising - based on the content being viewed rather than the user's profile - remains permissible. However, any form of personalised or behavioural advertising directed at children is prohibited outright.

  • Tracking and behavioural monitoring of children is absolutely prohibited
  • Targeted advertising directed at children is prohibited regardless of consent
  • Contextual advertising (based on content, not user profile) remains permissible
  • This applies to all platforms accessible to children, not just child-specific services
  • Violations attract severe penalties under Section 33 of the Act

Prohibition on Processing Likely to Cause Detrimental Effect

Section 9(2) provides that no Data Fiduciary shall undertake any processing of personal data that is likely to cause any detrimental effect on the well-being of a child. This is a broader, catch-all provision that extends beyond tracking and advertising. 'Detrimental effect on well-being' is a deliberately wide standard that encompasses physical, psychological, emotional, and developmental harm. This could include processing that exposes children to inappropriate content, processing that enables contact from strangers, processing that facilitates addiction to digital platforms through engagement-maximising algorithms, processing that causes social comparison or body image issues, or processing that subjects children to automated decision-making without human oversight. The assessment of whether processing is 'likely to cause detrimental effect' will require Data Fiduciaries to conduct child-specific impact assessments. These assessments should consider the nature and purpose of the processing, the categories of data involved, the age range of children affected, the potential for harm, and the safeguards in place to mitigate risks. Organisations should document these assessments as evidence of compliance, particularly for processing activities that operate in grey areas where the potential for harm is debatable.

  • Catch-all prohibition on any processing likely to harm a child's well-being
  • Well-being encompasses physical, psychological, emotional, and developmental dimensions
  • Data Fiduciaries should conduct child-specific impact assessments
  • Document assessments as evidence of compliance and due diligence
  • Algorithmic engagement-maximising features directed at children fall within scrutiny
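A child-specific impact assessment can be captured as a structured record with a simple likelihood-times-severity score. The fields, the 1-5 scales, and the escalation threshold below are illustrative assumptions, not anything prescribed by the Act or the rules:

```python
from dataclasses import dataclass, field

@dataclass
class ChildImpactAssessment:
    activity: str
    data_categories: list[str]
    age_range: tuple[int, int]          # youngest and oldest child affected
    harm_likelihood: int                # 1 (remote) .. 5 (probable)
    harm_severity: int                  # 1 (minimal) .. 5 (severe)
    safeguards: list[str] = field(default_factory=list)

    def risk_score(self) -> int:
        return self.harm_likelihood * self.harm_severity

    def requires_escalation(self) -> bool:
        # Hypothetical internal policy: scores of 12+ go to legal/DPO review
        return self.risk_score() >= 12
```

For instance, an engagement-maximising recommendation feed shown to 13-17-year-olds rated likelihood 4 and severity 4 scores 16 and would be escalated, while a plain email newsletter sign-up rated 2 and 2 would not.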

Exemptions for Certain Classes of Data Fiduciaries

Section 9(4) provides that the requirements of Section 9(1) (verifiable parental consent) and Section 9(3) (prohibition on tracking, behavioural monitoring, and targeted advertising) shall not apply to such classes of Data Fiduciaries, or for such purposes, and subject to such conditions, as the Central Government may prescribe. Notably, the Section 9(2) prohibition on processing likely to cause a detrimental effect cannot be exempted. This exemption power is designed to accommodate specific use cases where strict application of Section 9's requirements would be impractical or counterproductive. The most commonly discussed example is educational technology (edtech) platforms. Requiring verifiable parental consent for every interaction on a school-mandated learning platform could impede access to education. Similarly, healthcare platforms serving children may need to process data without parental consent in emergency situations. However, these exemptions are not automatic - they require affirmative prescription by the Central Government, and they will be subject to conditions designed to ensure that children's interests are still protected. Until such exemptions are notified, all Data Fiduciaries must comply with the full requirements of Section 9. Organisations in sectors that anticipate exemptions - edtech, healthcare, child welfare - should monitor government notifications and prepare to comply with any conditions attached to exemptions.

  • The Central Government may exempt certain Data Fiduciary classes from Section 9 requirements
  • Exemptions require an affirmative government notification with prescribed conditions
  • Edtech and healthcare platforms are likely candidates for conditional exemptions
  • Until notifications are issued, full compliance with Section 9 is mandatory
  • Exempted organisations must still comply with any conditions attached to the notification

Age Verification Mechanisms and Implementation Challenges

Implementing Section 9 requires robust age verification mechanisms - but this is one of the most technically and practically challenging aspects of children's data protection globally. The fundamental problem is that there is no universally reliable method to determine a user's age online without collecting additional personal data, which itself raises privacy concerns. India's digital identity infrastructure - particularly Aadhaar - provides a potential pathway that many other jurisdictions lack. Data Fiduciaries could leverage Aadhaar-based age verification through the UIDAI's authentication APIs, providing a high-confidence age signal without collecting additional sensitive information beyond what is necessary. Other approaches include self-declaration (low confidence but low friction), AI-based age estimation using facial analysis (moderate confidence but raises its own privacy concerns), credit card or bank account verification as a proxy for adult status, and school or institutional verification for edtech platforms. The rules under the DPDPA are expected to provide further guidance on acceptable age verification methods. Organisations should implement layered approaches that combine multiple signals to achieve reasonable confidence while minimising friction and data collection. The age verification system itself must be designed with privacy by design principles - collecting only the minimum data needed, not retaining verification data beyond what is necessary, and securing all verification-related data with appropriate safeguards.

  • Age verification is technically challenging with no single perfect solution
  • Aadhaar-based verification offers a high-confidence pathway unique to India
  • Layered approaches combining multiple signals are recommended
  • The age verification system itself must follow privacy by design principles
  • Expected government rules will provide further guidance on acceptable methods
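One way to reason about a layered approach is to assign each signal an assumed confidence and combine them. The signal names and weights below are illustrative placeholders, and treating signals as independent is a simplification, not a claim about how real verification systems behave:

```python
# Illustrative confidence that each signal, if present, correctly classifies age
SIGNAL_CONFIDENCE = {
    "self_declaration": 0.20,
    "facial_age_estimate": 0.50,
    "payment_card_check": 0.70,
    "aadhaar_age_band": 0.95,
}

def combined_confidence(signals: list[str]) -> float:
    """Probability that at least one signal correctly classifies the user's age,
    treating the signals as independent (a simplifying assumption)."""
    p_all_wrong = 1.0
    for s in signals:
        p_all_wrong *= 1.0 - SIGNAL_CONFIDENCE[s]
    return 1.0 - p_all_wrong
```

Under these assumed weights, self-declaration plus a payment-card check yields 1 - (0.80 × 0.30) = 0.76, versus 0.70 for the card check alone - which is why layering low-friction signals can raise confidence without forcing every user through the highest-friction step.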

Penalties for Non-Compliance with Section 9

The penalties for violating Section 9 are among the highest in the DPDPA, reflecting the legislature's prioritisation of children's data protection. Under Section 33 read with the Schedule, failure to fulfil additional obligations in respect of children - which includes violations of any provision under Section 9 - can attract a penalty of up to two hundred crore rupees (approximately twenty-four million US dollars). This penalty applies per contravention, meaning an organisation that systematically fails to obtain verifiable parental consent, or that routinely targets children with behavioural advertising, could face multiple penalty assessments. The Data Protection Board will consider factors including the nature and gravity of the breach, whether it was deliberate or negligent, the number of children affected, the duration of the violation, the organisation's efforts to mitigate harm, and any previous contraventions. For organisations processing large volumes of children's data - social media platforms, gaming companies, edtech providers - the cumulative penalty exposure is substantial. Beyond financial penalties, organisations found to have violated children's data protection obligations face significant reputational damage and potential loss of user trust. Parents and guardians are increasingly aware of data privacy issues, and publicised enforcement actions can directly impact user acquisition and retention.

  • Penalties up to two hundred crore rupees per contravention for Section 9 violations
  • The Board considers gravity, intent, scale, duration, and mitigation efforts
  • Systematic violations can result in multiple penalty assessments
  • Reputational damage from children's data violations is particularly severe
  • Organisations should treat Section 9 compliance as a board-level priority

Comparison with Global Children's Data Protection Frameworks

Section 9 of the DPDPA shares common principles with global children's data protection frameworks while incorporating India-specific elements. The US Children's Online Privacy Protection Act (COPPA) applies to children under thirteen and requires verifiable parental consent for data collection by operators of websites and online services directed at children. The EU's GDPR, under Article 8, requires parental consent for information society services offered to children, with member states setting the age threshold between thirteen and sixteen. The UK's Age Appropriate Design Code (Children's Code) goes further by establishing fifteen standards that online services must meet, including defaulting to the highest privacy settings for child users. India's DPDPA is notably stricter in several respects: the eighteen-year threshold is the highest among major jurisdictions, the prohibition on tracking and targeted advertising is absolute rather than consent-based, and the detrimental effect test provides a broad protective standard. However, the DPDPA is less prescriptive than the UK Children's Code in terms of design standards and does not currently mandate privacy by default for child users. As the rules under the DPDPA are developed, alignment with international best practices - particularly around design standards and transparency requirements - would strengthen the framework.

How Kraver.ai Helps

Kraver.ai provides a comprehensive suite of tools designed to help organisations comply with Section 9's children's data protection requirements. Our age verification module integrates with India's Aadhaar-based authentication system and supports multiple verification methods, enabling you to implement layered age gates that balance confidence with user experience. The parental consent management system automates the collection, verification, storage, and withdrawal of verifiable parental consent, maintaining complete audit trails for regulatory defence. Our platform automatically flags processing activities that involve children's data and applies Section 9's restrictions - blocking tracking pixels, behavioural profiling scripts, and targeted advertising delivery for users identified as minors. The child impact assessment module provides structured templates for evaluating whether processing activities are likely to cause detrimental effects on children's well-being, with automated documentation and risk scoring. For organisations seeking exemptions under Section 9(4), Kraver.ai monitors government notifications and helps you implement the conditions attached to any applicable exemptions. Our compliance dashboard provides real-time visibility into your children's data protection posture, alerting you to gaps before they become enforcement risks. Start protecting children's data and achieving Section 9 compliance with Kraver.ai today.

Need help with DPDPA compliance?

Kraver.ai automates your compliance journey from start to finish.

Get a Free Assessment