The children's-data obligations of India's Digital Personal Data Protection Act, 2023 are the most commonly misunderstood part of the Act for EdTech platforms, and most Indian learning apps are not yet prepared to meet them. If your learning app, school management system, or test-prep platform serves users under 18, you cannot rely on standard consent flows. You need verifiable parental consent, a ban on tracking and behavioural monitoring, and a data-deletion protocol that survives platform updates. This guide is the sector-specific compliance reference we use with EdTech clients in Bangalore, Hyderabad, and Delhi-NCR.
The article moves top-down: what the Act says about children’s data, who qualifies as a child under Indian law, the five technical and procedural controls you need, and common implementation failures we see in the field.
What the DPDP Act says about children’s data
Section 2(d) defines a child as a Data Principal who has not completed 18 years of age. Section 9 imposes specific obligations on Data Fiduciaries processing children’s personal data:
- Verifiable parental consent is required before processing
- No tracking, behavioural monitoring, or targeted advertising directed at children
- No processing that is likely to cause detriment to the well-being of the child
- The Central Government may exempt certain educational or counselling institutions by notification
The Act does not distinguish between K-12 and higher education. A 17-year-old NEET aspirant and a 12-year-old primary student receive the same protection.
Who is affected: EdTech personas
Online learning platforms (K-12 and test prep)
Apps that collect name, age, school, phone number, and payment data from parents. The child is the user; the parent is the consent-giver.
School management systems (ERP / LMS)
Platforms that store attendance, grades, health records, and biometric data (Aadhaar-linked or otherwise). The school is the Data Fiduciary; the platform is often the processor.
Adaptive learning and AI tutoring
Systems that track time-on-task, error patterns, and learning velocity. These are precisely the “behavioural monitoring” activities the Act prohibits for children.
Social features in learning apps
Leaderboards, friend connections, and user-generated content. Each feature may process personal data and requires consent scrutiny.
The five controls for EdTech DPDP compliance
Control 1: Age-gating with identity verification
You must determine whether a user is a child before collecting data. Common approaches:
- Self-declared date of birth with email verification
- Parental email confirmation loop
- Third-party age-verification services (emerging in India)
The method must produce verifiable evidence of age: a checkbox saying “I am over 18” is not sufficient.
Control 2: Parental consent workflow
The consent must be:
- Separate from terms of service
- In plain language (English + regional language)
- Granular per feature (data collection, notifications, social features)
- Revocable as easily as it was given
- Recorded with timestamp and IP address
Control 3: No tracking or behavioural monitoring
Disable or re-engineer:
- Behavioural advertising (even contextual ads must be reviewed)
- Third-party analytics that profile children
- AI-driven personalisation that builds a behavioural profile
- Retargeting pixels on any page accessed by children
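One robust way to enforce the list above is a server-side feature gate that strips prohibited capabilities from any child session, so no client-side toggle or misconfigured SDK initialisation can re-enable them. A sketch with illustrative flag names:

```python
# Capabilities that must be hard-off for child accounts, regardless of any
# other configuration. The flag names are illustrative, not a standard.
PROHIBITED_FOR_CHILDREN = {
    "behavioural_ads",
    "third_party_analytics",
    "behavioural_personalisation",
    "retargeting_pixels",
}


def effective_features(requested: set[str], is_child: bool) -> set[str]:
    """Return the feature set a session may actually use.

    Enforced server-side so the prohibition cannot be bypassed by the client.
    """
    return requested - PROHIBITED_FOR_CHILDREN if is_child else requested
```

The design choice worth noting: the prohibition lives in one place and is applied by subtraction, so adding a newly prohibited capability is a one-line change rather than an audit of every call site.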
Control 4: Data minimisation and deletion
Collect only what is necessary for the educational purpose. Implement:
- Automatic account deletion when the child reaches 18
- Parent-initiated deletion workflow
- Annual data-retention review
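To make the first and third items concrete, a small sketch computing the automatic-deletion date and the next retention-review date (function names and the 365-day cadence are our assumptions):

```python
from datetime import date, timedelta

RETENTION_REVIEW_DAYS = 365  # annual retention review, per the control above


def scheduled_deletion_date(dob: date) -> date:
    """Date on which the child account becomes due for automatic deletion:
    the day the Data Principal completes 18 years."""
    try:
        return dob.replace(year=dob.year + 18)
    except ValueError:
        # Born 29 Feb and the target year is not a leap year: use 1 March.
        return dob.replace(year=dob.year + 18, month=3, day=1)


def next_retention_review(last_review: date) -> date:
    """Due date of the next annual data-retention review."""
    return last_review + timedelta(days=RETENTION_REVIEW_DAYS)
```

In practice the deletion date would seed a scheduled job; the point of computing it at account creation is that the obligation survives later platform updates.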
Control 5: Processor governance
If you use cloud hosting, payment gateways, or AI APIs, each processor must:
- Sign a DPA with children’s data clauses
- Certify that they do not further process the data for advertising or profiling
- Allow audit rights specific to children’s data handling
Common EdTech DPDP mistakes
- Assuming parental payment equals parental consent. Payment authorisation is not the same as data-processing consent.
- Using standard adult consent flows for children. The Act requires a higher standard. Redesign the flow.
- Ignoring adaptive learning as behavioural monitoring. If the system profiles the child, it is likely prohibited.
- Retaining data indefinitely. The Act requires data deletion when the purpose is complete. “We might need it later” is not a valid reason.
- Delegating compliance to the school. If you are the platform provider, you are likely a joint Data Fiduciary or processor with independent obligations.
Cross-framework mapping
- ISO 27001:2022 — Annex A 5.34 (privacy) and 5.35 (independent review) support EdTech control frameworks. See our ISO 27001 service page.
- COPPA (US) — If the platform serves US students, COPPA overlaps with DPDP but is not identical. Dual compliance may be required.
- CERT-In Directions — Security safeguards for children’s data are scrutinised more heavily in incident response. See our CERT-In runbook.
Implementation patterns by EdTech sub-segment
The five-control framework applies uniformly, but the implementation differs sharply by EdTech sub-segment. Below is the calibration we use with clients in the major sub-segments we serve from Bengaluru.
K-12 learning apps and test prep
Highest regulatory exposure under DPDP children’s-data provisions. Every active user is presumed to be under 18 unless verified otherwise. Implementation priorities: parent-account-as-primary-account architecture (parent signs up, adds child as a sub-account); strict prohibition on third-party advertising SDKs in the child experience; behavioural-monitoring features (adaptive learning, personalisation engines) require careful re-engineering because the Act’s prohibition is broad. Bangalore K-12 platforms we have engaged with typically need 12–18 weeks to retrofit the consent architecture, longer if the platform is built on adaptive-learning ML pipelines.
Higher-education and college test prep (NEET, JEE, CAT)
Mixed exposure because users span the 17–18 boundary. A 17-year-old NEET aspirant requires verifiable parental consent; an 18-year-old does not. The architecture must accommodate the transition — when a child user reaches 18 during the platform engagement, the consent regime shifts. Implementation priorities: age-verification at signup with re-verification cadence; consent-state-machine that tracks transition; clear differentiation in the user experience between minor and adult flows.
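The consent-state-machine mentioned above can be sketched as a small enum plus a derivation function. The states and names are our illustration, not a statutory scheme; the key property is that parental consent lapses at 18 and the user must consent directly:

```python
from enum import Enum, auto


class ConsentState(Enum):
    MINOR_PENDING_PARENTAL = auto()  # under 18, no verifiable parental consent yet
    MINOR_CONSENTED = auto()         # under 18, parental consent on file
    ADULT_PENDING_RENEWAL = auto()   # completed 18 years, must consent directly
    ADULT_CONSENTED = auto()         # adult Data Principal, own consent on file


def consent_state(turned_18: bool,
                  parental_consent: bool,
                  adult_consent: bool) -> ConsentState:
    """Derive the current consent state from three recorded facts."""
    if turned_18:
        # Parental consent does not carry over past the 18th birthday.
        return (ConsentState.ADULT_CONSENTED if adult_consent
                else ConsentState.ADULT_PENDING_RENEWAL)
    return (ConsentState.MINOR_CONSENTED if parental_consent
            else ConsentState.MINOR_PENDING_PARENTAL)
```

Deriving the state from recorded facts, rather than mutating a stored state field, avoids the common bug where an account "forgets" to transition on the user's 18th birthday.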
Professional learning and upskilling
Lower direct exposure because the user base is predominantly adult. The residual risk arises when the platform serves corporate customers whose employee-development programmes include minors (uncommon but not unheard of). Implementation priority: age-affirmation at signup is sufficient for the majority of users; flagged accounts route through the enhanced consent workflow.
School ERP and management systems
Elevated exposure because these platforms typically process pan-school data, including children’s biometric data (Aadhaar-linked attendance), health records, and academic performance. The school is typically the Data Fiduciary; the ERP vendor is the Data Processor. The DPA between school and vendor must allocate DPDP children’s-data obligations explicitly, including parental consent collection (typically school’s responsibility), data retention (vendor’s responsibility), and data deletion on student exit (joint responsibility).
Adaptive learning and AI tutoring
Highest implementation complexity because the core product feature — behavioural profiling to deliver personalised learning paths — is precisely what the Act prohibits for children’s data. Two implementation approaches are emerging: (a) opt-in personalisation with explicit parental consent for the specific use of behavioural data, with the personalisation feature disabled by default; (b) federated-learning architectures that derive personalisation insights without retaining behavioural profiles centrally. Both approaches add 8–16 weeks of engineering work versus a non-compliant architecture.
Social learning features (forums, peer groups, mentorship)
Elevated exposure because user-to-user interactions create new data flows. Implementation priorities: moderation and safety frameworks specific to children; restriction or removal of public-facing user-generated content; peer-matching algorithms that do not rely on prohibited behavioural profiling.
Verifiable parental consent — implementation models
The “verifiable” requirement is the operationally hardest part of the Act for EdTech platforms. Several implementation models are emerging in the Indian market.
Model A — Aadhaar-based parental verification via DigiLocker
The most secure of the four models. Parent verifies identity via Aadhaar through DigiLocker; the verification token is recorded in the consent record. Strong audit trail and compatible with future Consent Manager frameworks, but operationally complex: it requires Aadhaar Data Vault hosting and a DigiLocker partnership.
Model B — Credit-card pre-authorisation
Parent authorises a small (₹1) charge on a credit card; the assumption is that the cardholder is an adult parent. Operationally simple but raises payment-friction issues; not all parents have credit cards.
Model C — Document upload with manual verification
Parent uploads a government-issued ID; the platform verifies (manually or via automated KYC service). High-friction but high-confidence. Common interim solution while Aadhaar-based models mature.
Model D — Email + phone verification with declared parent status
Parent provides email and phone; the platform verifies both via OTP, and the parent self-declares their status. Lowest-friction but weakest evidence; may not satisfy the Board’s “verifiable” expectation. Recommended only as a stepping stone.
The emerging consensus among Bangalore EdTech compliance teams is that Model A (Aadhaar-DigiLocker) will become the dominant approach over the next 18–24 months, with Model C as the interim implementation for most platforms.
Common EdTech DPDP failures we have seen
Beyond the five general mistakes listed earlier, three failures specifically recur in Bangalore EdTech engagements.
Treating school staff as Data Principals rather than authorised users. Teachers and school administrators access children’s data on behalf of the school; they are not the Data Principal whose data is being processed. Confusing the two creates incorrect consent flows and access-control architectures.
Building DPDP children’s-data controls only into the consumer app, not the school ERP. Many EdTech platforms have multiple products; controls applied to one product but not others create compliance gaps where children’s data flows between products.
Using third-party analytics SDKs that profile children. Google Analytics for children’s apps, Facebook SDK for parental-marketing flows, and similar third-party SDKs typically violate the Act’s prohibition on tracking and behavioural monitoring of children. The DPA with the SDK provider rarely includes children’s-data clauses, leaving the platform exposed.
Practical next steps
If you are building or re-engineering a consent flow, start with the parental consent workflow checklist above. If you need a full DPDP implementation roadmap, see our DPDP Act Complete Guide. If you want a downloadable checklist, see our DPDP Compliance Checklist.
For organisations that want a thirty-minute scoping conversation with a partner, the contact form in the site footer books the call directly. We commit to written scope, fixed price in INR, and direct partner-level accountability through the engagement.
EdTech DPDP FAQ
Does the children’s-data restriction apply to all EdTech? It applies whenever the platform processes data of users under 18. Higher-education and professional-learning platforms with predominantly adult users have lighter obligations than K-12 platforms.
Can I rely on parental email confirmation as verifiable consent? Email confirmation alone is generally not sufficient as “verifiable” consent. The Board has indicated that stronger evidence — Aadhaar-based verification, government-ID upload, or credit-card pre-authorisation — is closer to the regulatory expectation.
What if a user lies about their age during signup? Reasonable due diligence is the standard. If the platform has implemented age-verification mechanisms appropriate to the risk, occasional false declarations do not breach the obligation. Pattern-based misrepresentation does.
Can adaptive learning be made compliant for children? Yes, with care. Two approaches: (a) opt-in personalisation with explicit parental consent for the specific use of behavioural data; (b) federated-learning architectures that derive insights without retaining behavioural profiles centrally.
Does the Act prohibit all advertising in children’s apps? It prohibits targeted advertising and behavioural advertising directed at children. Contextual advertising (based on the content being viewed, not the user’s behaviour) is generally permissible but is itself increasingly questioned.
What about gamification features like leaderboards? Leaderboards involve processing children’s performance data and displaying it to other children. Most implementations require parental consent and careful design to avoid behavioural-monitoring concerns.
Can a parent revoke consent on behalf of a child? Yes, and the platform must honour the revocation. Revocation triggers data deletion or anonymisation depending on the data type.
What happens when the child reaches 18? The child becomes an adult Data Principal with direct rights. Implementation patterns: automatic transition with notification to the user; consent-renewal at 18 prompting the user to review processing activities; account-state migration from “child account” to “adult account” with appropriate flags.
Does the platform need to record parental identity? Yes, to support audit. The consent record should include parent’s name, contact, verification method used, timestamp, and IP address.
Can schools provide consent on behalf of parents? Generally no. Schools acting in loco parentis for educational purposes can provide certain consents within their authority but cannot consent to broader data processing without parental authorisation. The DPA between school and EdTech platform should clarify this.
What about EdTech platforms serving multiple jurisdictions? Multi-jurisdiction children’s-data programmes typically apply the strictest applicable standard. DPDP, COPPA (US), and the UK Children’s Code create different obligations; platforms operating in all three apply the highest common denominator.
How is the Act enforced for children’s-data violations? The Board has indicated children’s-data enforcement is an early priority. Industry watchers expect first material enforcement actions in this category over the next 12–18 months.
Building a children’s-data programme in 12 weeks
For Bangalore EdTech platforms starting from a non-compliant baseline, a structured 12-week programme produces operational compliance.
Weeks 1–2 — current-state assessment. Audit existing data flows. Identify all touchpoints where children’s data is processed: registration, learning content, social features, advertising integrations, analytics, parental dashboard. Document data classification (which data is “children’s data” requiring heightened protection vs general data).
Weeks 3–4 — consent architecture design. Design parent-as-primary-account model. Specify verifiable parental consent mechanism (typically Aadhaar-DigiLocker for Indian platforms). Design consent revocation and child-account-deletion workflows.
Weeks 5–8 — implementation. Engineering team builds the consent architecture. Removes prohibited tracking SDKs and behavioural-monitoring features. Implements age-verification at signup. Updates privacy notices in English plus regional languages.
Weeks 9–10 — testing and validation. Test consent flows with sample parent-child accounts. Validate prohibited features are inactive. Audit DPAs with all third-party processors for children’s-data clauses.
Weeks 11–12 — documentation and rollout. Update compliance documentation. Train customer-support team on Data Principal rights workflow. Brief board on programme completion.
This 12-week programme is the floor; platforms with substantial adaptive-learning or behavioural-personalisation features may require an additional 8–12 weeks for re-engineering those components.
Common technical patterns in EdTech compliance
Pattern 1 — parent-account-as-primary-account. Parent registers; child added as sub-account. Parental consent captured at parent-account creation; flows through to child-account activities. Most operationally clean architecture.
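Pattern 1 can be sketched as a small data model in which a child sub-account cannot be created under an unverified parent, so the consent prerequisite is enforced structurally (class and field names are illustrative):

```python
from dataclasses import dataclass, field


@dataclass
class ChildSubAccount:
    account_id: str
    display_name: str
    consented_features: set[str] = field(default_factory=set)


@dataclass
class ParentAccount:
    account_id: str
    verified: bool = False  # verifiable identity check completed (e.g. DigiLocker)
    children: list[ChildSubAccount] = field(default_factory=list)

    def add_child(self, child: ChildSubAccount) -> None:
        """Child accounts exist only under a verified parent account."""
        if not self.verified:
            raise PermissionError(
                "parental identity must be verified before adding a child account"
            )
        self.children.append(child)
```

Because the child account is reachable only through the parent object, consent capture and later revocation both happen at a single point in the model.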
Pattern 2 — child-account with parental dashboard. Child registers (with school authorisation if school-mediated); parent accesses dashboard to review and consent to specific features. Higher operational complexity but supports cases where the school is the primary relationship.
Pattern 3 — school-as-Data-Fiduciary, platform-as-Processor. School is the Data Fiduciary; platform is the Data Processor. The school collects parental consent; the platform implements technical controls. Common for B2B EdTech sold through schools rather than direct-to-consumer.
Pattern 4 — opt-in personalisation. Default behaviour is non-personalised; parents explicitly opt in to personalisation features with detailed disclosure of what behavioural data is processed and why. Compliant but produces lower-personalisation default experience.
Pattern 5 — federated learning. Personalisation insights are derived without retaining behavioural profiles centrally. The most privacy-protective architecture but operationally complex; appropriate for platforms with substantial engineering investment.
Cross-jurisdictional considerations for Indian EdTech
Bangalore EdTech platforms operating across multiple jurisdictions face additional considerations beyond DPDP children’s-data obligations.
COPPA (US). Platforms with US student users face Children’s Online Privacy Protection Act obligations that overlap with DPDP children’s-data provisions. COPPA’s age threshold is 13 (vs DPDP’s 18); platforms operating in both apply DPDP’s stricter threshold universally.
UK Age Appropriate Design Code. UK-resident users add the Children’s Code expectations covering age-appropriate design, default privacy settings, and content controls. Operational alignment with DPDP requires modest additional configuration.
EU GDPR Article 8. EU member states can lower digital-consent age to 13 (or higher per state). Multi-EU-jurisdiction platforms apply the strictest applicable threshold.
Australia Privacy Principles. Australian users add APP obligations, including specific provisions for children’s data. APP-DPDP overlap is substantial.
For Bangalore EdTech with global ambitions, designing for the strictest applicable standard (typically DPDP’s 18-threshold) reduces multi-regime architectural complexity.
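The highest-common-denominator rule above reduces to taking the maximum applicable consent age. A sketch with illustrative thresholds (jurisdictional ages vary in practice, and GDPR Article 8 lets member states set anywhere from 13 to 16, so confirm the table with counsel):

```python
# Illustrative digital-consent age thresholds; verify per jurisdiction.
CONSENT_AGE = {
    "IN": 18,  # DPDP Act, s.2(d)
    "US": 13,  # COPPA
    "UK": 13,  # UK GDPR
    "DE": 16,  # GDPR Art. 8 as implemented in Germany
}


def applicable_threshold(jurisdictions: set[str]) -> int:
    """Strictest (highest) consent age across the jurisdictions served."""
    return max(CONSENT_AGE[j] for j in jurisdictions)
```

A platform serving India alongside any other market therefore applies 18 universally, which is why designing to the DPDP threshold from the start simplifies later expansion.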
Future regulatory trajectory for children’s data in India
The Data Protection Board has indicated children’s-data enforcement is an early priority. Specific trajectory worth monitoring:
Verifiable consent mechanism standardisation. Aadhaar-DigiLocker is emerging as the dominant Indian implementation; the Board may issue specific guidance favouring this approach.
SDF designation for major EdTech platforms. Large EdTech platforms with substantial children’s-user populations are likely candidates for SDF designation, triggering additional DPO, DPIA, and audit obligations.
Sectoral coordination with MoE. The Ministry of Education’s interest in EdTech regulation may produce sector-specific guidance complementing DPDP.
International coordination. India’s data-protection cooperation with EU, UK, and US may produce specific bilateral arrangements affecting cross-border children’s-data flows.
Algorithmic transparency. Adaptive-learning and AI-tutoring features may face specific transparency obligations beyond current DPDP requirements.
Bangalore EdTech platforms should treat current DPDP children’s-data implementation as the floor and build toward stricter expectations.
EdTech-specific compliance team and budget
For Bangalore EdTech founders planning DPDP compliance investment, sector-specific considerations affect team and budget design.
Compliance lead. Even pre-Series-A EdTech platforms with children’s data benefit from a designated compliance lead — often a part-time senior engineer or external advisor at this stage. The role formalises ownership of the children’s-data programme.
Privacy engineering capability. A children’s-data programme requires engineering investment; a designated privacy-engineering capability (one or two senior engineers focused on privacy features) is operationally valuable.
Legal counsel with EdTech expertise. EdTech-specific legal counsel familiar with DPDP children’s-data provisions, COPPA (if US students), school-vendor contract structures. Generic SaaS legal counsel rarely has this depth.
Customer-support privacy training. Customer-support teams handle parental consent issues, data deletion requests, and grievance escalations. Training the support team on DPDP procedures is operationally important.
Audit and compliance budget. Annual budget allocation for DPDP audit, framework certifications (ISO 27001 increasingly expected by enterprise EdTech buyers), penetration testing of consumer apps, and third-party assurance work.
For mid-stage Bangalore EdTech (Series A-B), total privacy-and-security investment typically ranges ₹40–80 lakh annually. Lower investment produces compliance gaps; higher investment is rare at this stage.
Children’s-data compliance as a competitive differentiator
For Indian EdTech platforms, children’s-data compliance is increasingly a competitive differentiator.
Parent trust signals. Parents evaluating EdTech platforms increasingly consider data-protection commitments. Platforms that present privacy commitments prominently send stronger trust signals than competitors that bury their disclosures.
School procurement consideration. Schools selecting EdTech platforms for institutional adoption increasingly conduct privacy due diligence. Strong compliance posture wins procurement.
Regulatory landscape advantage. Platforms with strong children’s-data compliance are positioned favourably as regulatory scrutiny intensifies. Reactive compliance produces worse outcomes than proactive compliance.
International expansion enabler. Strong DPDP children’s-data compliance translates well to COPPA, UK Children’s Code, and other international regimes. Platforms with strong base posture expand internationally faster.
The strategic posture for Bangalore EdTech is to treat children’s-data compliance as a competitive moat rather than a regulatory cost.