Introduction
India's ed-tech market has surpassed $10 billion in valuation, making it one of the fastest-growing education technology ecosystems in the world. Platforms such as BYJU'S, Unacademy, Vedantu, PhysicsWallah, and Toppr serve tens of millions of students — the vast majority of whom are under 18. According to Statista, India's online education market is projected to reach $14.3 billion by 2028, driven by 350 million school-age children and rapidly increasing smartphone penetration. Yet this explosive growth has created one of the most significant data protection challenges in the country: ed-tech platforms routinely collect, process, and share children's personal data at a scale that would have been unimaginable a decade ago. The Digital Personal Data Protection Act (DPDPA), 2023 fundamentally changes the rules. Section 9 imposes strict obligations on any Data Fiduciary processing children's data, including verifiable parental consent, prohibitions on behavioural tracking, and penalties of up to ₹200 crore for violations. For an industry built on learning analytics, personalised recommendations, and engagement-driven metrics, compliance is not optional — it is existential.
The Scale of Children's Data in Indian Ed-Tech
To understand the compliance challenge, consider the sheer volume of data that ed-tech platforms process daily. A single student session on platforms like BYJU'S or Unacademy generates dozens of data points: login timestamps, lesson completion rates, quiz scores, time spent on each question, areas of difficulty, video watch patterns, click behaviour, device information, location data, and in-app purchase history. Multiply this by millions of daily active users — the majority being minors — and you have one of the largest repositories of children's personal data in the country. According to a Forbes India investigation, several major ed-tech platforms have been found sharing student data with third-party analytics providers, advertising networks, and even financial services companies — often without explicit parental awareness, let alone consent. The DPDPA framework treats all individuals under 18 as children whose data requires enhanced protection. This is a broader definition than many ed-tech companies anticipated, as it captures not just primary school students but also Class 11 and 12 students, competitive exam aspirants (many of whom are 16-17), and even college freshers who enrolled while still minors.
- Learning Management Systems (LMS) — capture granular academic performance data, learning pace, areas of strength and weakness, and engagement metrics
- Video streaming engines — record watch duration, replay patterns, skip behaviour, and attention metrics that constitute detailed behavioural profiles
- Assessment platforms — collect test scores, answer patterns, time-per-question data, and comparative performance analytics
- In-app communication tools — process chat messages, doubt-clearing interactions, and peer discussion content
- Payment systems — handle parents' financial data including credit card details, UPI IDs, and subscription history linked to the child's account
- Device telemetry — collect device identifiers, operating system data, network information, and location data from children's phones and tablets
Section 9: Verifiable Parental Consent Requirements
The cornerstone of DPDPA compliance for ed-tech platforms is Section 9, which mandates verifiable parental consent before any processing of children's personal data. This is not a standard click-through consent — 'verifiable' means the platform must take reasonable steps to confirm that the person providing consent is actually the child's parent or legal guardian. The Act explicitly prohibits processing that is detrimental to the well-being of the child, and it bans behavioural monitoring and targeted advertising directed at children. For ed-tech platforms, this creates an immediate operational challenge. Most current onboarding flows allow students to sign up with minimal verification — often just an email address or phone number. Under the DPDPA, platforms must redesign these flows entirely to ensure that a verified parent or guardian provides consent before the child's data is processed for any purpose. According to IAPP's operational analysis, verifiable parental consent mechanisms could include OTP verification to a parent's registered mobile number, Aadhaar-based verification of the parent-child relationship, video-based consent verification, or DigiLocker integration for document-based verification.
- Consent must be specific — parents must consent to each distinct purpose of data processing, not a blanket consent covering all uses
- Consent must be informed — the platform must explain in plain language what data is collected, how it will be used, and who it will be shared with
- Consent must be withdrawable — parents must be able to withdraw consent as easily as they gave it, triggering data deletion obligations
- Age verification is mandatory — platforms must implement reliable age-gating mechanisms to identify users under 18
- No detrimental processing — any processing that could harm the child's well-being is prohibited regardless of parental consent
Age-Gating Mechanisms: Implementation Challenges
Effective age-gating is the first line of compliance for ed-tech platforms, yet it remains one of the most technically challenging requirements. The DPDPA requires platforms to make 'reasonable efforts' to determine whether a user is a child, but it does not prescribe specific technical mechanisms — leaving the interpretation to the Data Protection Board of India (DPBI). Current industry practices range from simple self-declaration (selecting a date of birth during registration) to more sophisticated approaches such as AI-based age estimation from selfie images, document verification through DigiLocker or Aadhaar, and cross-referencing with school enrolment databases. According to Yoti's age assurance research, self-declaration alone fails to prevent approximately 30% of underage users from bypassing age gates. Ed-tech platforms face a unique challenge: unlike social media or gaming platforms where users might lie about their age to gain access, ed-tech students often have a legitimate reason to declare their correct age (their course content is age-specific). However, platforms must still verify that users who declare themselves as adults are not actually minors, and that users who declare themselves as minors trigger the appropriate parental consent workflow.
- Self-declaration with verification — collect date of birth during registration and verify against government ID databases
- Parent-initiated onboarding — require parents to create the account first, then add the child as a sub-user
- School-integrated verification — partner with schools to verify student ages through institutional records
- Tiered access — provide limited functionality until age and parental consent are verified, then unlock full features
Learning Analytics as Personal Data Under the DPDPA
One of the most consequential implications of the DPDPA for ed-tech is the classification of learning analytics as personal data. Under the Act, any data that relates to an identified or identifiable individual is personal data — and learning analytics are inherently tied to specific students. A student's performance trajectory, learning pace, areas of difficulty, engagement patterns, and predicted academic outcomes constitute a detailed profile that is both personally identifiable and potentially sensitive. The data classification requirements under the DPDPA mean that ed-tech platforms must inventory every type of learning data they collect and process, map the data flows from collection through processing to storage and sharing, identify the legal basis for each processing activity, and implement appropriate security safeguards based on the sensitivity of the data. Platforms that use machine learning algorithms to generate personalised recommendations, adaptive learning paths, or predictive analytics must recognise that the outputs of these algorithms — the predictions, classifications, and recommendations — are themselves personal data when they relate to a specific student. This means that not just the raw data, but the derived insights, must be protected under the DPDPA framework. According to EY India's DPDP readiness survey, only 23% of ed-tech companies have classified their learning analytics data under any data protection framework.
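A lightweight way to start that inventory is a per-category register recording the source system, whether the data is derived (an ML output rather than a raw input), who it is shared with, and the documented legal basis. The entries and field names below are illustrative, not a prescribed schema; the point is that a missing legal basis is immediately queryable as a compliance gap.

```python
# Hypothetical inventory entries: every data category, raw or derived,
# needs a documented flow and legal basis under the DPDPA.
INVENTORY = [
    {"category": "quiz_scores", "source": "assessment_engine", "derived": False,
     "shared_with": [], "legal_basis": "parental_consent:assessment"},
    {"category": "predicted_outcomes", "source": "ml_pipeline", "derived": True,
     "shared_with": [], "legal_basis": "parental_consent:personalisation"},
    {"category": "watch_patterns", "source": "video_engine", "derived": False,
     "shared_with": ["analytics_vendor"], "legal_basis": None},  # gap
]


def compliance_gaps(inventory: list[dict]) -> list[str]:
    """Categories with no documented legal basis; remediate these first."""
    return [e["category"] for e in inventory if not e["legal_basis"]]
```

Note that `predicted_outcomes` appears in the register even though it is model output, reflecting the point above that derived insights about an identifiable student are themselves personal data.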
Behavioural Tracking Restrictions and the Engagement Model
The DPDPA's prohibition on behavioural monitoring of children strikes at the heart of how most ed-tech platforms operate. Modern ed-tech business models are built on engagement metrics: time spent on the platform, lesson completion streaks, gamification elements (points, badges, leaderboards), push notification engagement, and re-engagement campaigns. These features rely on detailed behavioural tracking — recording exactly how a child interacts with the platform and using that data to optimise for increased engagement. Under Section 9, this model faces significant restrictions. Behavioural tracking for the purpose of targeted advertising is explicitly prohibited. But the boundary between 'personalising the learning experience' and 'behavioural monitoring' is blurred. A platform that tracks a student's learning behaviour to recommend the next lesson may be providing a legitimate educational service. The same platform tracking the same behaviour to optimise notification timing for maximum re-engagement is arguably conducting behavioural monitoring. As noted by CookieYes' analysis of the DPDP Rules, the DPBI is expected to issue guidance distinguishing between permissible educational personalisation and prohibited behavioural monitoring. Until that guidance is published, ed-tech platforms should adopt a conservative interpretation: use behavioural data only for direct educational purposes, not for engagement optimisation, advertising, or commercial purposes. Platforms must document the specific educational purpose for each behavioural data processing activity and be prepared to demonstrate that the processing serves the child's educational interests, not the platform's commercial interests.
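That conservative interpretation can be enforced mechanically with a purpose allow-list checked before any behavioural event for a child is recorded: documented educational purposes pass, engagement and advertising purposes are refused. The purpose names below are illustrative assumptions, not a regulatory taxonomy.

```python
# Purposes documented as directly educational (allowed for children's data).
EDUCATIONAL_PURPOSES = {
    "next_lesson_recommendation",
    "difficulty_adaptation",
    "progress_report",
}

# Purposes that amount to engagement optimisation or advertising (refused).
PROHIBITED_PURPOSES = {
    "notification_timing",
    "re_engagement",
    "targeted_advertising",
}


def record_event(child_id: str, event: str, purpose: str, sink: list) -> None:
    """Append a behavioural event only if its purpose is a documented
    educational one; anything else is rejected before it is ever stored."""
    if purpose in PROHIBITED_PURPOSES:
        raise PermissionError(f"prohibited purpose for children's data: {purpose}")
    if purpose not in EDUCATIONAL_PURPOSES:
        raise PermissionError(f"purpose not documented as educational: {purpose}")
    sink.append({"child_id": child_id, "event": event, "purpose": purpose})
```

Failing closed on undocumented purposes, rather than maintaining only a block-list, matches the article's advice: until DPBI guidance arrives, a purpose that has not been explicitly documented as educational should not be processed at all.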
Third-Party Data Sharing: The Hidden Compliance Risk
The most significant compliance risk for ed-tech platforms often lies not in their own data processing but in their third-party data sharing practices. A typical ed-tech platform shares student data with cloud infrastructure providers, analytics services (Google Analytics, Mixpanel, CleverTap), advertising networks, payment processors, CRM platforms, and increasingly, AI model training pipelines. Each of these third-party relationships constitutes a data processing arrangement under the DPDPA, requiring the ed-tech platform (as the Data Fiduciary) to ensure that every Data Processor has contractual obligations to protect the data appropriately. For children's data, this obligation is even more critical — the platform cannot outsource its compliance responsibilities by sharing data with third parties. According to Entrackr's investigation, several major Indian ed-tech platforms were found sharing student data with third-party analytics and advertising companies without disclosing these relationships in their privacy policies. Under the DPDPA, this practice would trigger the ₹200 crore penalty for children's data violations plus additional penalties for inadequate consent mechanisms.
- Audit all third-party SDKs — identify every third-party SDK embedded in your app and assess what student data each SDK accesses
- Renegotiate vendor contracts — ensure all data processing agreements include DPDPA-specific children's data protections
- Implement data minimisation — share only the minimum data necessary with each third party and anonymise where possible
- Monitor third-party compliance — conduct periodic audits of third-party data handling practices, not just contractual reviews
- Disable advertising SDKs for children — ensure no advertising-related data collection occurs for users identified as children
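The first and last items above can be combined into a simple start-up gate: a registry of embedded SDKs annotated with whether each is advertising-related and whether it is essential to delivering the course, consulted before anything is initialised for a child's account. The registry entries below are hypothetical.

```python
# Hypothetical SDK registry: each embedded SDK is annotated with whether it
# is advertising-related and whether it is essential to the service itself.
SDK_REGISTRY = {
    "crash_reporter": {"advertising": False, "essential": True},
    "ad_network":     {"advertising": True,  "essential": False},
    "engagement_crm": {"advertising": False, "essential": False},
}


def sdks_to_enable(is_child: bool) -> list[str]:
    """For children: never advertising SDKs, and only essential ones
    (data minimisation). Adults get the full set."""
    enabled = []
    for name, meta in SDK_REGISTRY.items():
        if is_child and (meta["advertising"] or not meta["essential"]):
            continue
        enabled.append(name)
    return enabled
```

Gating initialisation, rather than filtering data after collection, keeps non-essential SDKs from ever observing a child's session, which is the cleanest way to evidence minimisation in an audit.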
The ₹200 Crore Penalty: Real Financial Risk for Ed-Tech
The DPDPA's penalty framework imposes a maximum fine of ₹200 crore specifically for violations related to children's data — the second-highest penalty category in the Act. For ed-tech startups, many of which are not yet profitable, a penalty of this magnitude would be existential. Even for well-funded companies, the combination of financial penalties, reputational damage, and potential operational restrictions could fundamentally alter the business trajectory. The penalty schedule is clear: failure to obtain verifiable parental consent, processing children's data in a manner detrimental to the child, conducting behavioural monitoring of children, or sharing children's data without appropriate safeguards all fall under the ₹200 crore maximum. Critically, as highlighted by DPO India, the penalty can be imposed even without an actual data breach — the mere failure to have compliant consent mechanisms or appropriate safeguards is sufficient to trigger enforcement. For publicly traded ed-tech companies, the risk extends beyond the direct penalty to securities law implications, investor confidence, and market capitalisation impact. For privately held companies, a penalty or enforcement action could affect fundraising ability and valuation in subsequent funding rounds.
Practical Compliance Roadmap for Ed-Tech Platforms
Compliance with the DPDPA for ed-tech platforms requires a structured, phased approach that addresses the unique challenges of processing children's data at scale. The DPDPA compliance checklist provides a general framework, but ed-tech platforms must adapt it to their specific data processing realities. The compliance timeline makes it clear that platforms must act now — the second phase of the DPDP Rules' rollout takes effect in November 2026, and full compliance is required by May 2027. (The six roadmap phases below are internal project stages, not to be confused with the Rules' own rollout phases.)
- Phase 1: Data audit and mapping (Weeks 1-4) — conduct a comprehensive inventory of all student data collected, processed, stored, and shared, using automated data discovery tools to identify personal data across all systems
- Phase 2: Consent redesign (Weeks 5-8) — implement verifiable parental consent mechanisms including age-gating, parent verification, and purpose-specific consent collection
- Phase 3: Third-party audit (Weeks 9-12) — audit all third-party data sharing, renegotiate vendor contracts, disable non-essential SDKs for children's accounts, and implement data minimisation
- Phase 4: Behavioural tracking review (Weeks 13-16) — separate educational personalisation from engagement optimisation, document the educational purpose for each tracking activity, and disable prohibited behavioural monitoring
- Phase 5: Rights infrastructure (Weeks 17-20) — build Data Principal rights workflows for parents to access, correct, and delete their children's data
- Phase 6: Breach preparedness (Weeks 21-24) — implement automated breach detection and notification systems, conduct tabletop exercises, and establish incident response protocols
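Much of the Phase 5 rights infrastructure reduces to fan-out workflows: a verified parental deletion request must reach every store that holds the child's data, with per-store results retained for audit and retry. A minimal sketch follows; the store names and the `delete_fn` callback are illustrative assumptions.

```python
# Hypothetical list of systems holding a child's data; in practice this
# comes from the Phase 1 data map, not a hard-coded constant.
DATA_STORES = ["lms_db", "analytics_warehouse", "video_events", "crm"]


def handle_erasure_request(child_id: str, delete_fn,
                           stores: list[str] = DATA_STORES) -> dict[str, str]:
    """Fan a deletion request out to every store and return per-store
    results, so partial failures can be retried and evidenced in an audit."""
    results = {}
    for store in stores:
        try:
            delete_fn(store, child_id)
            results[store] = "deleted"
        except Exception as exc:               # noqa: BLE001 - capture for retry
            results[store] = f"failed: {exc}"
    return results
```

Recording per-store outcomes, rather than returning a single success flag, matters because the DPDPA obligation is to delete everywhere: a request that silently skipped one warehouse would leave the platform non-compliant while appearing complete.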
How Kraver.ai Helps Ed-Tech Platforms Achieve Compliance
Kraver.ai's AI-native compliance platform is purpose-built for industries that process children's data at scale. For ed-tech platforms, our solution addresses every critical compliance requirement: automated data discovery identifies student personal data across learning management systems, video platforms, assessment engines, and third-party integrations. Our data classification engine automatically categorises learning analytics, behavioural data, and derived insights under the DPDPA framework. Our consent management module implements verifiable parental consent workflows with age-gating, parent verification, and purpose-specific consent collection — all compliant with Section 9 requirements. Our continuous compliance monitoring tracks data processing activities in real time, flags behavioural tracking that may violate children's data protections, and alerts compliance teams to third-party data sharing anomalies. And our penalty risk assessment quantifies your exposure under the ₹200 crore children's data penalty category, enabling data-driven prioritisation of remediation efforts.
Conclusion
India's ed-tech industry stands at a compliance crossroads. The platforms that built billion-dollar businesses on engagement-driven models fuelled by student data must now reckon with a regulatory framework that places the child's well-being above commercial interests. The DPDPA's requirements — verifiable parental consent, prohibitions on behavioural monitoring, restrictions on third-party data sharing, and penalties of up to ₹200 crore — are not incremental adjustments; they demand a fundamental rethinking of how ed-tech platforms collect, process, and monetise student data. But compliance is not just a regulatory burden — it is a competitive advantage. Parents are increasingly aware of data privacy concerns, and platforms that demonstrate robust data protection practices will earn greater trust, higher retention, and stronger brand loyalty. The ed-tech companies that move early on DPDPA compliance will not only avoid penalties but will differentiate themselves in a market where trust is becoming the ultimate competitive moat. The compliance deadline is approaching rapidly. Kraver.ai is here to help ed-tech platforms navigate this transition — from data discovery to consent management to continuous compliance monitoring. Start your compliance journey today.