Rethinking Social Media for Children: A Global Reckoning with the Digital Dependency Crisis
The tragic news from Ghaziabad sent a shudder through the nation. Three minor sisters, aged between 13 and 15, ended their lives by jumping from the balcony of their ninth-floor flat. The precipitating event, according to preliminary police reports, was their parents’ decision to restrict their mobile phone usage. The trigger was not a moment of teenage rebellion over a lost privilege, but a deep-seated, pathological addiction to online gaming that made the withdrawal unbearable. Experts called it a “fatal outcome of a digital dependency,” a stark and horrifying illustration of a crisis that is silently engulfing a generation.
This tragedy is not an isolated incident. It is the most extreme point on a spectrum of harm that is becoming increasingly visible to parents, educators, and policymakers around the world. The question of who is responsible for protecting children in the age of smartphones is no longer a matter of abstract debate. It is an urgent, pressing crisis. As governments from Australia to Europe enact bans and restrictions, and as courts in the United States prepare to hold tech companies accountable for their “defective” platform designs, the world is finally beginning to confront the difficult truth: we have conducted a decades-long, uncontrolled experiment on our children, and the results are devastating.
The data is overwhelming and deeply troubling. Global estimates suggest that about 26% of adolescents suffer from internet addiction, and roughly 40% show patterns of addictive social media use. A 2022 WHO study across 44 countries in Europe, Central Asia, and Canada found that 11% of adolescents exhibit signs of problematic social media use, with girls (13%) significantly more affected than boys (9%). The same study found that 12% of adolescents are at risk of problematic gaming, a problem that skews heavily towards boys (16%) compared to girls (7%). A 2025 Pew Research Center study revealed the sheer scale of immersion: 36% of US teens report using at least one of the major platforms (YouTube, TikTok, Instagram, Snapchat, or Facebook) “almost constantly,” while around 64% use AI chatbots, 30% of them daily. These are not occasional users; they are digital natives whose waking hours are saturated with algorithmically curated content designed to maximize engagement, often at the expense of their well-being.
The consequences are manifesting in a cascade of mental and physical health crises. Jason Nagata, an associate professor of paediatrics at the University of California, San Francisco, has warned of the detrimental impact of social media exposure on sleep patterns, future depression, and weight gain in adolescents. One recent study found that even low levels of social media use, around an hour per day, were associated with poorer cognitive outcomes in children under 13. Globally, about one in seven (roughly 14%) of 10 to 19-year-olds experience a mental health disorder, accounting for 13-15% of the total global burden of disease in that age group, with anxiety and depression the most common, according to WHO and UNICEF data. In India, nearly one in four adolescents reports symptoms consistent with anxiety or depression, while the National Crime Records Bureau (NCRB) has documented a steady and alarming rise in student suicides over the past decade. Teenagers themselves are aware of the toll. A 2024 Pew poll found that many have become uneasy about the time they spend online, with girls in particular reporting that the apps affect their self-confidence, sleep patterns, and overall mental health.
Tech companies, for their part, point to the safeguards they have implemented. They cite automated content filtering, advanced parental supervision tools, and age-verification mechanisms. But experts from Common Sense Media and Stanford Medicine have repeatedly found these safeguards insufficient for minors. The European Commission recently concluded that TikTok’s “addictive design” features effectively bypass the company’s own safety efforts. The very architecture of these platforms—the infinite scroll, the push notifications, the algorithmically curated feeds—is engineered to maximize time spent on site. As many regulators and advocacy groups argue, the only real solution is “Safety by Design”—building platforms that are safe for young users from the ground up, rather than attempting to bolt on ineffective safeguards after the fact.
The push for accountability is now moving into the legal arena in a way that could prove as transformative as the litigation against Big Tobacco in the 1990s. A “bellwether” trial, K.G.M. v. Meta et al., is underway in Los Angeles Superior Court: the first representative “test case” against social media companies over their “defective platform design products.” The plaintiff is a 20-year-old Californian woman who started using YouTube at age 6 and Instagram at age 9, and now suffers from anxiety, depression, and body dysmorphia. Her legal team argues that the platforms should be held liable under product liability standards, comparing their designs to “dopamine-seeking ‘slot machines’” and “digital casinos.” For the first time, Meta CEO Mark Zuckerberg took the stand to defend his company’s safety efforts, while acknowledging that Meta’s goal is to maximize user time. He deflected responsibility for age verification onto parents, calling for better parental controls at the device level. The case has drawn widespread attention because, if successful, it could fundamentally shift the legal landscape, holding tech companies accountable for the harms their products cause, much as tobacco companies were eventually held accountable for the harms of smoking.
Parents, meanwhile, are caught in the middle, struggling to navigate a new frontier for which there is no historical playbook. A 2025 Pew Research Center survey of US parents revealed that a majority feel that parenting today means making tough choices about technology. Eight in ten said that the harms of social media outweigh the benefits. Two-thirds of parents (67%) said tech companies should do more to set rules around what kids can do or see online, and 55% said that lawmakers should do more. The American Academy of Pediatrics’ updated screen time guidelines, issued in 2016, place a significant burden on parents and caregivers to establish rules that fit their lives and families. The WHO’s 2019 guidelines were even more stark, recommending no screen time at all for children under one year old, and limiting it to one hour or less daily for those under five. Early childhood is a period of rapid brain development, and family lifestyle patterns established then have lifelong consequences.
In response to this crisis, governments are beginning to act with unprecedented boldness. Australia has become the first country in the world to impose a social media ban for children under 16, targeting major platforms with massive, non-negotiable fines for non-compliance. Denmark, France, and Greece are set to follow suit with similar legislation. In the United States, the federal government remains at a crossroads, with many state-level laws facing First Amendment challenges. In India, the Economic Survey of India (2025-26) has called for age-based limits on social media usage, and two states, Andhra Pradesh and Goa, are considering such legislation.
Critics, however, point out that bans are technically difficult to implement. They often lead to a mass migration to Virtual Private Networks (VPNs) and the darker, less regulated corners of the web, where children may be exposed to even greater dangers. In India, the Digital Personal Data Protection Act, 2023, has been criticized for its “consent gating” provisions, which may result in either false declarations by minors trying to bypass restrictions, or the effective exclusion of children from valuable online resources.
There are no easy answers. The challenge of raising children in a world saturated with addictive technology is one that no previous generation has faced. But the Ghaziabad tragedy is a stark reminder that doing nothing is not an option. A multi-pronged approach is needed: stronger regulation of platform design to make safety a core feature, not an afterthought; better enforcement of age-verification and data protection laws; and, most importantly, a societal conversation that leads to a “digital wellness plan” for every family. This plan must be built on encouragement, engagement, and empowerment, helping children develop a healthy and balanced relationship with digital technologies that supports their overall well-being. The alternative—a generation lost to anxiety, depression, and, in the worst cases, fatal outcomes of digital dependency—is unthinkable.
Questions and Answers
Q1: What tragic incident prompted the article, and what does it illustrate about the dangers of digital dependency?
A1: The article begins with the suicide of three minor sisters in Ghaziabad after their parents restricted their mobile phone usage. The incident was linked to severe online gaming addiction. It illustrates a “fatal outcome of a digital dependency” and highlights the extreme, life-threatening consequences of unchecked screen time and addiction among children.
Q2: What are the key global statistics on adolescent mental health and social media use?
A2: The statistics are alarming. A WHO study found 11% of adolescents show signs of problematic social media use. Globally, about one in seven (roughly 14%) of 10 to 19-year-olds experience a mental health disorder. A 2025 Pew study found 36% of US teens use platforms “almost constantly.” In India, nearly one in four adolescents reports symptoms of anxiety or depression, and student suicides are rising.
Q3: What is the significance of the ongoing “bellwether” trial K.G.M. v. Meta et al. in Los Angeles?
A3: This is the first representative “test case” against social media companies over their “defective platform design products.” The plaintiff argues that platforms like YouTube and Instagram are “digital casinos” engineered for addiction. Mark Zuckerberg testified in his company’s defence. The case could hold tech companies liable for the harm caused by their products, similar to the lawsuits against Big Tobacco in the 1990s.
Q4: What legislative actions are being taken globally and in India to address this crisis?
A4: Australia has imposed a social media ban for children under 16, with heavy fines for non-compliance. Denmark, France, and Greece are set to follow. In the US, federal action is pending. In India, the Economic Survey 2025-26 has called for age-based limits, and states like Andhra Pradesh and Goa are considering legislation. However, critics warn that bans are technically difficult to enforce and may push children to unregulated parts of the web via VPNs.
Q5: What is a “digital wellness plan,” and why is it needed?
A5: A “digital wellness plan” is a family-level strategy built on encouragement, engagement, and empowerment to help children develop a healthy and balanced relationship with technology. It is needed because “cyber parenting” is a new reality, and the WHO recommends strict screen time limits (none for under-1s, less than an hour for under-5s). Such a plan would support children’s overall well-being and help prevent the tragic outcomes of digital dependency.
