
Denmark To Ban Teens Under 15 From Using Social Media

New Legislation Aims to Protect Young People from Harmful Digital Content

Denmark is moving forward with groundbreaking legislation that would ban social media access for children under 15 years old, making it one of the most aggressive European nations in regulating children’s online activity. The proposal, announced in November 2025, represents a significant escalation in efforts to protect young people from what Danish Prime Minister Mette Frederiksen described as technology platforms that are “stealing our children’s childhood.”

The Danish government secured a broad political agreement among three governing coalition parties and two opposition groups in parliament, signaling strong cross-party support for the measure. While the proposal requires formal legislation and parliamentary votes before becoming binding law, officials expect it could take effect by mid-2026. The proposed measure would establish 15 as the minimum age for social media access, though it includes provisions allowing parents to grant permission for children as young as 13 to use these platforms after completing a specific assessment process.

Speaking at the opening of the Danish parliament in October 2025, Prime Minister Frederiksen articulated the government’s concerns with stark clarity. “We have said yes to mobile phones in our children’s lives in the best sense, so they can call home and communicate with their friends,” she explained. “But the reality is that we have let a monster loose. Never before have so many children and young people suffered from anxiety and depression.”

The Scope and Motivation Behind the Ban

The Danish initiative comes amid mounting evidence that existing age restrictions on social media platforms have proven largely ineffective. Despite current rules that prohibit children under 13 from creating accounts, Danish authorities report that 94 percent of Danish children under age 13 have profiles on at least one social media platform, and nearly half of those under 10 years old maintain active accounts. These statistics reveal a significant gap between stated platform policies and actual usage patterns.

Caroline Stage, Denmark’s Minister for Digital Affairs, has been leading the legislative push. In interviews following the announcement, she emphasized that the government’s frustration stems from years of inadequate self-regulation by technology companies. “In far too many years, we have given the social media platforms free play in the playing rooms of our children. There’s been no limits,” Stage told reporters. She noted that 94 percent of Danish children under 13 have social media profiles, a situation she characterized as unacceptable given the documented harms associated with early social media exposure.

The Ministry of Digitalization’s official statement framed the issue in terms of protecting vulnerable young people from forces beyond their control. “Children and young people have their sleep disrupted, lose their peace and concentration, and experience increasing pressure from digital relationships where adults are not always present,” the ministry explained. “This is a development that no parent, teacher or educator can stop alone.” The statement emphasized that commercial interests and harmful content have become too dominant in shaping children’s everyday lives and childhoods.

Prime Minister Frederiksen cited additional concerning statistics during her parliamentary address, noting that 60 percent of boys aged 11 to 19 do not meet a single friend in person in their free time over the course of a week. This social isolation, coupled with declining reading skills and concentration abilities among young people, has driven the government’s determination to act decisively.

Enforcement Mechanisms and Technical Solutions

One of the most challenging aspects of any age-based social media restriction involves enforcement. Critics of similar proposals in other countries have questioned how governments can effectively prevent children from accessing platforms when millions of children have easy access to screens and when many are technologically savvy enough to circumvent basic restrictions.

Denmark is addressing this challenge through a multipronged approach. The government has announced plans to launch a new “digital evidence” app in spring 2026 that will display age certificates to ensure users comply with social media age limits. The app will likely leverage Denmark’s existing national electronic identification system, which nearly all Danish citizens over age 13 already possess. Several other European Union countries are currently testing similar age-verification applications.

Minister Stage acknowledged that Denmark cannot force technology giants to use the Danish app specifically, but emphasized that the government can mandate proper age verification systems. “What we can do is force the tech giants to make proper age verification, and if they don’t, we will be able to enforce through the EU commission and make sure that they will be fined up to 6 percent of their global income,” she explained. This enforcement mechanism leverages the broader regulatory framework established by European Union law to ensure compliance.

Stage offered a straightforward analogy to explain the government’s approach to enforcement. “When we go into the city at night, there are bouncers who are checking the age of young people to make sure that no one underage gets into a party that they’re not supposed to be in,” she said. The government expects social media platforms to implement similarly robust age-checking systems, moving beyond the easily circumvented self-reporting methods currently in place.

The minister also emphasized that the government would proceed carefully to ensure the regulation is effective and free of loopholes. “I can assure you that Denmark will hurry, but we won’t do it too quickly because we need to make sure that the regulation is right and that there is no loopholes for the tech giants to go through,” Stage stated. This measured approach reflects awareness of the complexity involved in regulating global technology platforms.

Context Within European and Global Movements

Denmark’s proposal does not exist in isolation but rather represents part of a growing international movement to limit children’s exposure to social media. The most prominent precedent came in December 2024 when Australia enacted the world’s first comprehensive social media ban for children, setting the minimum age at 16. That law, which took effect in December 2025, subjects platforms including TikTok, Facebook, Snapchat, Reddit, X (formerly Twitter), Instagram, and YouTube to fines of up to 50 million Australian dollars (approximately 33 million US dollars) for systematic failures to prevent children younger than 16 from holding accounts.

Within Europe, several other nations are exploring similar restrictions. Norway’s government launched a public consultation in June 2025 for proposed legislation that would ban social media for users under 15 years of age. France has gone even further, with a parliamentary commission recommending not only a ban for those under 15 but also proposing a “digital curfew” for teenagers aged 15 to 18 that would restrict social media access between 10 p.m. and 6 a.m. The French commission was launched in March 2025 after families filed lawsuits against TikTok alleging that the platform’s algorithms exposed their children to content linked to suicide attempts.

The Netherlands issued national guidelines in 2025 advising that children should avoid social media before age 15, though these recommendations lack the binding force of formal legislation. Beyond Europe, Malaysia announced plans to ban social media accounts for people under 16 starting in early 2026. This global trend suggests that Denmark’s initiative, while bold, reflects a broader reevaluation of how societies should balance technological access with child protection.

Denmark’s proposal would make it the most restrictive European Union member state regarding children’s social media access, exceeding the baseline protections established under EU-wide legislation. The European Union’s Digital Services Act, which became fully effective in 2024, requires platforms to safeguard minors, and the terms of service of social media platforms like TikTok and Instagram, video-sharing platforms like YouTube and Twitch, and sites like Reddit and Discord already bar children younger than 13 from holding accounts. However, as Danish statistics demonstrate, these existing rules have proven inadequate in practice, leading Denmark to pursue stricter national standards.

Platform Responses and Industry Pushback

Social media companies have responded to Denmark’s proposal with a mixture of acknowledgment and skepticism. TikTok, in a statement following the announcement, said it recognizes the importance of Denmark’s initiative. The company pointed to its existing safety measures, noting that it has created “more than 50 preset safety features for teen accounts, as well as age appropriate experiences and tools for guardians such as Family Pairing,” which allows parents and teens to customize safety settings together.

Several platforms engaged in lobbying efforts against the ban prior to its announcement. TikTok ran advertisements touting its potential as an educational tool, while other companies argued they already take substantial steps to protect children from harmful content. These platforms contend that age verification technology is improving and that their internal moderation systems have become more sophisticated.

Platform representatives have also questioned whether blanket age bans represent the most effective approach to protecting children online. Meta Platforms, the parent company of Facebook and Instagram, emphasizes that it uses video selfie analysis and artificial intelligence to help determine user ages. TikTok similarly employs selfie-based age verification that analyzes facial features to estimate whether users meet minimum age requirements. Both companies argue that these technological solutions, combined with improved content moderation and parental controls, could address concerns without imposing strict age limits.

However, Minister Stage dismissed these arguments as insufficient. “One thing is what they’re saying and another thing is what they’re doing or not doing,” she stated in reference to social media platforms. “And that’s why we have to do something politically.” Her ministry’s statement was even more direct, asserting that pressure from tech giants’ business models has become “too massive” and that companies have had “free rein” in children’s digital spaces for too long.

Academic and Expert Perspectives

While the Danish proposal enjoys broad political support, some academics and child development experts have raised concerns about potential unintended consequences. Anne Mette Thorhauge, an associate professor at the University of Copenhagen, has questioned whether age-based bans adequately consider children’s rights. “To me, the greatest challenge is actually the democratic rights of these children. I think it’s sad that it’s not taken more into consideration,” Thorhauge told reporters.

Critics of age-based bans argue that such restrictions may infringe on children’s and teenagers’ rights to access information, connect with peers, and participate in digital civic life. They also express skepticism about enforcement effectiveness, suggesting that determined young people will find ways to circumvent age verification systems, potentially driving them toward less regulated corners of the internet where risks may be even greater.

Supporters of the ban counter that these concerns, while valid, do not outweigh the documented harms of unrestricted social media access for young children. They point to growing evidence linking early social media use to increased rates of anxiety, depression, sleep disruption, attention problems, and exposure to cyberbullying and inappropriate content. From this perspective, restricting access represents a necessary public health intervention comparable to age limits on alcohol, tobacco, or other potentially harmful substances.

Minister Stage attempted to address concerns about overly restricting children’s digital access by emphasizing that the legislation targets specific platforms rather than digital technology broadly. “This is not about excluding children from everything digital,” she explained, “but keeping them away from harmful content.” The government’s position is that children can benefit from digital technology for education, communication, and entertainment without necessarily having unrestricted access to algorithmically driven social media platforms designed to maximize user engagement.

Previous Danish Measures and Growing Momentum

The social media age restriction proposal builds on previous actions Denmark has taken to limit children’s potentially harmful technology exposure. In September 2025, Danish lawmakers voted to ban cellphones from primary schools and after-school programs, a move recommended by a wellbeing commission established by Prime Minister Frederiksen in 2023. This ban reflects growing concerns that constant smartphone access disrupts learning, reduces face-to-face social interaction, and contributes to attention problems among students.

Even before these recent initiatives, Denmark demonstrated caution regarding certain technology platforms in sensitive contexts. In 2023, the Danish Ministry of Defence banned TikTok on work devices, citing security concerns. While that ban was limited in scope and motivated primarily by data security issues rather than child protection considerations, it established a precedent for the government taking assertive action to regulate technology use when deemed necessary.

Timeline and Next Steps

The path from political agreement to enforceable law will involve several stages. Following the broad political agreement announced in November 2025, the proposal must go through a consultation process that allows various stakeholders—including child advocacy groups, education professionals, parents, technology experts, and civil liberties organizations—to provide input on the specific design of the legislation.

After the consultation period concludes, the Danish parliament will conduct multiple readings of the formal legislation before holding final votes. Minister Stage indicated that if this process proceeds smoothly, the law could take effect by mid-to-late 2026. The exact timeline will depend on how quickly the consultation process can be completed and whether significant modifications to the proposal emerge during parliamentary debate.

The Ministry of Digitalization has not yet specified which social media platforms would fall under the ban, though it indicated the age minimum of 15 would apply to “certain” social media. This ambiguity likely reflects the need to define social media in legal terms that can adapt to the rapidly changing technology landscape. Any definition must be specific enough to provide clear guidance to platforms and users while remaining flexible enough to encompass new applications and services that emerge after the law takes effect.

Broader Implications for Technology Regulation

Denmark’s initiative represents more than a single policy change; it signals a fundamental shift in how democratic societies approach technology regulation. For years, the dominant approach has emphasized industry self-regulation, with governments setting broad standards and expecting companies to implement appropriate safeguards. This model has increasingly come under criticism as evidence accumulates that platforms’ business incentives—which reward maximizing user engagement and advertising revenue—may conflict with user wellbeing, particularly for vulnerable populations like children.

If Denmark’s law proves enforceable and demonstrates measurable benefits for children’s mental health and social development, it could catalyze similar legislation across Europe and beyond. The European Union’s regulatory framework allows member states to adopt national measures that exceed minimum EU-wide standards when justified by legitimate public policy objectives. This flexibility means that Denmark’s experiment, if successful, could be replicated by other countries within the EU’s regulatory architecture.

Conversely, if enforcement proves difficult or if the law produces significant negative unintended consequences, it may discourage other nations from pursuing similar approaches. The next several years will therefore serve as a crucial test case for whether age-based social media restrictions can be effectively implemented in liberal democracies that value both child protection and individual rights.

The Global Conversation About Children and Technology

Denmark’s proposal contributes to an accelerating global conversation about how societies should navigate the relationship between children and digital technology. This conversation extends beyond social media to encompass smartphones in schools, video game time limits, screen time recommendations, and the design of age-appropriate digital experiences.

China, for instance, has implemented strict limits on online gaming time for minors and has required that smartphone manufacturers include mandatory “youth mode” settings that restrict usage. These measures reflect a fundamentally different approach to technology governance than prevails in Western democracies, but they demonstrate that various societies are grappling with similar underlying concerns about technology’s impact on child development.

The global nature of technology platforms means that regulation in one jurisdiction can have ripple effects elsewhere. If Denmark succeeds in pressuring major social media companies to implement robust age verification systems, those same technologies and processes might be deployed in other markets, even absent formal regulatory requirements. Similarly, if Denmark’s law proves enforceable through the threat of substantial fines, it could establish a precedent that emboldens regulators in other countries to pursue similar measures.

Conclusion: A Pioneering Experiment

As Denmark prepares to potentially implement one of the world’s strictest social media age restrictions, it embarks on a pioneering social experiment with high stakes and uncertain outcomes. The government’s political consensus reflects genuine concern about children’s wellbeing in an increasingly digital world, supported by troubling statistics about mental health, social isolation, and sleep disruption among young people.

Whether the proposed ban represents an effective solution to these problems remains to be seen. Implementation challenges, enforcement difficulties, and potential unintended consequences all loom as serious considerations. Technology companies’ responses, the development of age verification systems, and children’s own adaptations to the restrictions will all influence whether the policy achieves its stated goals.

What is clear is that Denmark, like Australia before it and potentially many other countries after it, has decided that the status quo of ineffective self-regulation by technology platforms is no longer acceptable. As Minister Stage declared, the platforms have had “free rein” for too long, and the time for political intervention has arrived. Whether this intervention succeeds in protecting children while respecting their rights will be among the most closely watched technology policy experiments of the coming years.
