The Myth Of Digital Privacy And Children's Online Lives
Children’s growing dependence on digital platforms is raising urgent concerns over privacy, data exploitation, and psychological harm.
Here’s the uncomfortable truth: most of us do not really understand privacy, yet we are convinced we have it. For many of us, that sense of privacy is tied to our phones, which we see as something that belongs entirely to us.
Password protection on devices may create the feeling that the information on a device is accessible only to its owner. That comfort is misplaced. A significant part of personal information is routinely relinquished to apps without the user's active intention.
Most people live under this digital privacy myth: because others cannot see their photographs, messages and searches, they assume those things are private. They are not. They are constantly being collected, analysed, stored and monetised by the companies whose services they use.
For this reason, the current state of affairs is worrying for adults. For children, the situation is even worse.
Parents often complain that their children spend too much time on their phones. But the problem is no longer just screen time. Teenagers' online lives now extend far beyond social media platforms such as Instagram and Snapchat. They rely on ed-tech platforms for schoolwork, use AI tools for assignments, and increasingly turn to chatbots when they feel lonely, anxious or vulnerable.
What this really means is that children are not merely spending time online. They are leaving behind a detailed psychological and behavioural footprint of themselves.
A teenager may tell an AI tool that they are struggling with bullying, loneliness, body image or anxiety. They may search for answers to questions they are too embarrassed to ask anyone else.
They may spend hours on platforms that learn what captures their attention, what makes them insecure and what keeps them coming back.
The result is a gigantic reservoir of personal data. If technology companies choose to use it, these data points can help them recognise, predict, nudge and shape behaviour, for profit.
A Global Shift in How Courts Think About Children
Across jurisdictions, courts have come to understand that the issue extends beyond privacy concerns to the question of harm itself.
An example of this evolution is the recent ruling by the Los Angeles County Superior Court in K.G.M. v. Meta et al. One of the ruling's central arguments was that social media addiction is, from a psychological standpoint, comparable to the harms caused by the tobacco industry. The court awarded the plaintiff $6 million in damages against Meta et al., pointing to lasting and serious body dysmorphia, depression, and suicidal ideation caused by social media. With the ruling in mind, The Guardian published a caricature by Nicola Jennings: an unsettling image of Mark Zuckerberg with dollar signs in his eyes, selling a ziplock bag of Instagram like a drug peddler, illustrating the grim reality of social media addiction.
In New Mexico v. Meta Platforms, Inc., a jury ordered Meta to pay $375 million in damages for misleading users about the safety of children on its platforms. The court found that Meta failed to adequately protect minors from harmful content and online predators despite publicly claiming strong safety measures, and held it liable for deceptive practices in how its algorithms exposed children to inappropriate material. These cases signal an important shift. Courts are no longer viewing technology platforms as passive middlemen. They are beginning to treat them as actors with a duty of care towards children.
The Scale of the Problem in India
A research study published in BMJ Paediatrics Open in 2025, covering 3,624 families across five north Indian states, found that more than 60 per cent of children aged 2-5 years spend two to four hours on screens daily. This is the period in which almost all brain development occurs, making children in this age group especially vulnerable to digital exposure.
The trend in older children is even more concerning. A 2026 LocalCircles survey covering all of India found that 49 per cent of urban parents with children aged 9-17 years report their children spending at least three hours daily on social media and OTT platforms.
The negative impact has only recently begun to be formally acknowledged. India's Economic Survey 2025-26 links smartphone usage trends to sleep disorders, anxiety, reduced attention spans, and stress related to academic performance.
Governments are responding. Australia and Indonesia have introduced restrictions on social media access for minors under 16. The United Kingdom is actively debating similar proposals. A House of Commons research briefing noted that 95 per cent of children aged 13 to 15 use social media regularly, and 96 per cent have their own social media profiles. Social media is no longer a marginal part of teenage life. It is the environment in which much of adolescence now unfolds.
An outright ban on social media for minors is often floated as a solution, but it is not a complete one. Children are adept at navigating restrictions through fake accounts, borrowed identities, or VPNs. Regulation cannot rely on prohibition alone. It must focus on design accountability, age-appropriate architecture, and real enforcement of consent standards.
What Indian Law Already Provides
India had no standalone data protection law until 2023. The Information Technology Act, 2000 (IT Act) and rules notified thereunder formed the basis around which the data protection framework revolved. This included the Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011 (Privacy Rules).
A decisive shift came in 2017, when a nine-judge constitutional bench of the Supreme Court in Justice K. S. Puttaswamy (Retd.) v. Union of India affirmed that privacy is a fundamental right under Article 21 of the Constitution. That judgment laid the foundation for what eventually became a comprehensive data protection law.
The Digital Personal Data Protection Act, 2023 builds on that foundation and places special emphasis on children's data. Unlike laws such as the United States' COPPA or the European Union's GDPR, which generally use 13 as the age threshold, the DPDP Act defines a child as anyone below 18.
That distinction matters. Teenagers between 13 and 18 are often treated by technology companies as mature enough to look after themselves online. In reality, they remain deeply vulnerable to manipulation, targeted advertising, addictive design and invasive data collection.
The DPDP Act recognises this. It prohibits tracking, behavioural monitoring and targeted advertising directed at children, and requires parental consent before children's data can be processed.
In principle, India has created one of the strongest legal frameworks for children's digital privacy anywhere in the world. The problem is not the law but its implementation, which remains incomplete as the compliance deadline of 13 May 2027 approaches.
The Gap Between Law and Reality
A recent report titled DPDP Compliance in Respect to Children's Data by ASIA (ASIA Report) found that 12 out of 14 online service providers, drawn from popular social media websites, ed-tech companies, AI companies and state-run educational platforms, either had no age bar for creating an account or set the minimum age at 13 without verifiable parental consent. This falls short of the standard of protection under the DPDP Act and allows the processing, monitoring and dissemination of children's private information in ways that could harm their psychological and physical well-being.
The gap matters because the platforms most frequently used by children are often the ones collecting the most sensitive information about them. A child using an educational app, a chatbot and multiple social media accounts may already have a larger digital footprint than many adults.
India does not lack a legal framework. It lacks urgency. The May 2027 compliance deadline does not mean platforms should wait until then. Early and meaningful action is essential, not just to meet legal requirements, but to build trust and ensure a safer digital environment for children. Safeguards cannot be treated as a last-minute obligation. Child protection must be embedded into both policy and product design, starting now.