Introduction
It is rightly said that the future belongs to our children, and the digital arena is their new playground. The much-awaited draft Digital Personal Data Protection Rules, 2025 (“DPDP Rules”), notified by the Ministry of Electronics and Information Technology, are a welcome step, as they emphasise the protection of minors from heinous crimes and privacy violations in cyberspace. The draft introduces the requirement of verifiable parental consent before obtaining information from minors and persons with disabilities, to shield them from the adverse consequences of cyberspace. It is a rule that has seen parents across the nation nodding in approval.
Even though the rules are still at an initial stage and were open for feedback and suggestions until 18th February 2025, before the process of finalising them begins, it is imperative to analyse the impact of this provision.
This blog will analyse the provisions on parental consent in the draft rules released by the Ministry of Electronics and Information Technology (“MeitY”), and examine existing legal provisions and case studies on parental consent across global jurisdictions. It will further discuss the potential implementation challenges for tech companies and the government, and suggest a way forward.
Understanding the Draft DPDP Rules
Rule 10 of the draft Digital Personal Data Protection Rules, 2025 mandates obtaining verifiable consent from a lawful guardian before processing the personal data of a child or of a person with disability. The data fiduciary (any person who, alone or in conjunction with other persons, determines the purpose and means of processing personal data) must adopt appropriate technical and organisational measures to obtain verifiable parental consent before processing any personal data of a child.
Additionally, to ensure that the person identifying himself or herself as the guardian is an adult, the fiduciary must conduct due diligence. The data fiduciary may rely on reliable details of the adult already available with it, on details voluntarily provided by the guardian, or on a virtual token mapped to the identity and age of the individual. The identity proof must be issued by an entity entrusted by law or by the government. The draft rules also provide illustrations and explanations to foster a better understanding among professionals and the public.
The Digital Personal Data Protection Act, 2023 (“DPDP Act”) and the rules set out certain exemptions from seeking parental consent where fiduciaries process data for the purposes mentioned in Parts A and B of the Fourth Schedule of the rules. The classes of fiduciaries exempt from obtaining parental consent are as follows:
1. A clinical establishment, mental health establishment or healthcare professional
2. An allied healthcare professional
3. An educational institution
4. An individual in whose care infants and children in a crèche or child day care centre are entrusted
5. A person engaged by an educational institution, crèche or child care centre for the transport of children enrolled with such institution, crèche or centre
It is imperative to note that these exemptions are not absolute; each class must satisfy requisite conditions to exercise the exemption. Processing must be restricted to the extent necessary for the protection of the child’s health, for educational activities, or in the interests of the safety of children enrolled with such institutions.
Global Perspectives
Australia has been in the news for its plan to implement a minimum age for social media usage, but it has recently gone a step further by banning social media for children under 16 years of age, one of the world’s strictest crackdowns on children’s internet usage. The country will levy fines of up to nearly 32 million US dollars on companies that fail to verify users’ ages and allow minors to log in.
This step by Australia has sparked debate across the globe, with countries such as Sweden and the UK considering similar measures. In Sweden, gangs routinely use social media to recruit teenagers to commit murders and bombings; to curb this menace, Sweden is considering a ban of its own.
However, a blanket ban comes with several drawbacks as well. Most importantly, it curtails the right to digital access and expression. According to General Comment No. 25 of the UN Committee on the Rights of the Child, children are rights holders in the digital arena. A ban also poses great risks: more than 175,000 children go online for the first time every day, and providing them a safe digital space will be a strenuous task for all the stakeholders involved.
It is the need of the hour to strike the right balance between allowing positive exposure to digital media and ensuring that minors are not subjected to the manifold forms of cybercrime. The practice of parental consent, for its part, had already been introduced in a few other countries before India joined them on stage.
The United States of America passed the Children’s Online Privacy Protection Act, 1998 (“COPPA”), which strictly curtails unfair or deceptive use of children’s personal information on the internet. The law mandates that an operator make reasonable efforts to send a direct notice to the parent (whose content must be in accordance with §312.4 of the COPPA Rule) and obtain verifiable parental consent.
Where a child’s information is collected but will not be shared with any third party, parental consent can be obtained via the parent’s email address, through which the parent can, at their discretion, opt their child out of the platform. As per the revisions of COPPA, these methods have struck a workable balance between privacy and practicality.
More robust consent mechanisms are reserved for instances where the data is shared with third parties. Those methods include providing a credit card in connection with a transaction, executing consent forms (mailed, faxed, or scanned and returned to the website operator), or calling a staffed toll-free number. The Federal Trade Commission (“FTC”) considers these methods useful and privacy-friendly.
In August 2024, US authorities sued TikTok for neither sending notice to parents nor obtaining parental consent while collecting and processing information from children below 13 years of age. TikTok faces penalties of up to $51,744 per violation per day. It is also mandatory under §312.4 to state clearly on the website what information is collected from children and the operator’s usage and disclosure practices. The United States has been a forerunner in the sphere of protecting children’s online privacy.
By contrast, Italy has faced implementation challenges in ensuring that children under 14 seek parental consent before signing up for social media accounts. Despite the presence of the law, reports suggest that 40% of 13-year-old girls show problematic social media use.
The EU’s General Data Protection Regulation (“GDPR”) has served as a global benchmark for safeguarding privacy rights. It mandates a Data Protection Impact Assessment (DPIA) where an operator intends to use children’s personal data, to establish whether the processing would pose a high risk to the child’s rights and freedoms. Article 8 of the GDPR requires parental consent for the lawful processing of children’s personal data where an Information Society Service (ISS) targets children.
Challenges in Implementation
According to a survey carried out by LocalCircles in India, 95% of teenagers in India are on social media, and 6 out of 10 children aged 9–17 spend three hours online daily. With such a vast number of children already holding social media accounts, tracking the users who require consent would be a strenuous task, which could force platforms to verify everyone’s age. The rules only cover situations in which either the child declares that they are a minor or a parent monitoring the usage comes forward to consent.
There are no safeguards for situations where a child enters false information and uses the platforms. A likely consequence is that children migrate to the unregulated dark web, where no restrictions apply, to do away with parental consent, which can lead to further mishaps. Many households have shared devices on which children have access to their parents’ accounts and are exposed to the dangers of the web through those accounts.
Furthermore, personal information about children is already available online for the public to access, sourced from websites, schools, online blogs or posts, and such publicly available information is not covered under the ambit of the DPDP Act. Instances of cyberbullying and child pornography can occur using data sourced from this publicly available information.
In Tier 2 and Tier 3 cities, 80% of children reportedly help their parents navigate online platforms, which highlights the extent of digital illiteracy among parents. This calls into question the very foundation of parental consent: parents who are not digitally literate enough to understand what is happening on their own devices can hardly be expected to understand and monitor their child’s usage.
Impact on Tech Companies
The compliance burden on tech companies has multiplied since giants such as YouTube were fined $170 million in the US over children’s privacy violations. Meta, likewise, has been fined €405 million (£349 million) for breaching GDPR provisions on consent and letting teenagers set up business accounts.
The debate between tech companies and governments over whether parents or the platforms should be accountable for children’s actions on the web is ongoing. Judging by legal trends, the responsibility has been placed on the companies, which face exorbitant fines for non-compliance. Additionally, they are expected to maintain separate consent mechanisms for adults, minors and persons with disabilities.
In this scenario, smaller players such as edtech startups and emerging social media platforms are at a disadvantage compared with larger platforms like Meta and Snap, which have established user databases and verification mechanisms. Smaller stakeholders have been raising these issues since the draft rules were released.
Adapting to the compliance mechanisms will be a challenging path for both big and small tech companies, in different capacities. However, the DPDP framework still strikes a decent balance between compliance requirements and keeping governance business-friendly enough to attract companies.
Recommendations and Conclusion
The ministry must encourage dialogue between tech companies, policymakers, child rights groups, parents and other stakeholders to ensure that a holistic approach informs the final rules. The government must invest in verification and flagging tools that are accessible and economical, to ensure a safe cyberspace for children. AI-based solutions for verifying user age are gaining traction; for example, platforms such as Instagram use “Yoti”, an AI-based facial age-estimation service, to detect whether a user is under 13 and too young to use the app.
India has proposed a token-based age verification model. Without revealing the user’s personal details, a token is generated via government-approved platforms such as Aadhaar and DigiLocker, and the generated token is then used to verify age on digital platforms. This ensures that a proxy cannot be created and that personal details are safeguarded. However, digital exclusion can pose a challenge, especially for those in rural areas lacking government-ID integration on such platforms.
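In outline, such a token flow can be sketched as follows. This is a minimal illustrative sketch, not the official MeitY, Aadhaar or DigiLocker design: the function names, token format and shared-secret HMAC signing are all assumptions made for demonstration (a real deployment would use asymmetric signatures so that platforms never hold the issuer's signing key).

```python
# Illustrative sketch of token-based age verification (hypothetical design,
# NOT the official MeitY/DigiLocker scheme): a government-approved issuer
# verifies the user's identity once, then hands out a signed token carrying
# only an "is_adult" flag, so platforms never see the underlying ID details.
import base64
import hashlib
import hmac
import json

ISSUER_SECRET = b"demo-signing-key"  # placeholder; real systems use key pairs

def issue_age_token(date_of_birth_year: int, current_year: int = 2025) -> str:
    """Issuer side: reduce the verified DOB to a bare age flag and sign it."""
    claim = {"is_adult": current_year - date_of_birth_year >= 18}
    payload = base64.urlsafe_b64encode(json.dumps(claim).encode()).decode()
    sig = hmac.new(ISSUER_SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return payload + "." + sig

def platform_checks_token(token: str) -> bool:
    """Platform side: verify the signature, then read only the age flag."""
    payload, sig = token.split(".")
    expected = hmac.new(ISSUER_SECRET, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        raise ValueError("token not issued by a trusted authority")
    return json.loads(base64.urlsafe_b64decode(payload))["is_adult"]

adult_token = issue_age_token(1990)
minor_token = issue_age_token(2015)
print(platform_checks_token(adult_token))  # True  -> no consent flow needed
print(platform_checks_token(minor_token))  # False -> trigger parental consent
```

The privacy property the draft rules aim at is visible here: the platform learns only a signed yes/no age flag, never the date of birth or identity document that the issuer verified.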
Digital literacy programmes must be conducted to teach essential skills such as navigating the internet safely, protecting one’s privacy by understanding the end use of one’s data, and recognising potential risks.
The DPDP Act has remained toothless since 2023 for want of implementing rules, and the draft rules are a welcome move. They are the first step towards creating a robust framework, not the last. To protect young and impressionable minds, the rules must be framed holistically, and strict implementation at all levels must be ensured.