[Image: a teenage girl under the covers, lit by the glow of her smartphone screen]

Teens, screens and the ban: Has anything really changed?

When Australia moved to restrict social media access for under-16s, it was framed as a decisive step to protect young people from the darker sides of digital life — anxiety, bullying, addictive design and endless comparison.

But speak to teenagers, and a more complicated picture emerges. Because while the rules have shifted, teen behaviour hasn’t stopped. In many cases, it has adapted.

“Everyone’s still online”

For teenagers, social media isn’t just entertainment; it’s social infrastructure. It’s where friendships continue after school. Where group chats replace phone calls. Where trends, humour and identity take shape. So when restrictions are introduced, it doesn’t feel like protection — it feels like exclusion. And practically speaking, many aren’t excluded for long.

Teens are adept at navigating systems, and age verification is no exception. Some use older siblings’ details. Others access shared family accounts. Where verification requires parental confirmation, some parents approve it. What looks like a legal barrier often becomes a household negotiation.

Has screen time dropped?

The central question is whether teen screen time has meaningfully decreased.

So far, there’s little sign of dramatic change. If certain social platforms become harder to access, time often shifts elsewhere — gaming platforms, messaging apps, YouTube or streaming services. The hours may remain similar, even if the platform mix changes. This suggests the debate may be less about total screen time and more about platform design.

Teens themselves often argue that not all screen use is equal. Messaging friends or collaborating in games feels very different from endlessly scrolling algorithm-driven feeds. Yet public discussion frequently treats it as one undifferentiated issue.

The ban hasn’t reduced teens’ desire to connect online. It has changed how directly they do it.

The facial scanning experiment

Beyond Australia, enforcement is becoming more technologically sophisticated. In the UK, platforms including Roblox and PlayStation have introduced facial age-estimation tools. Users record a short selfie video, and AI estimates their age before granting access to certain features.

On paper, the system appears robust. In practice, however, familiar loopholes emerge: some teenagers ask their parents to complete the scan, while others rely on older siblings. Although the technology can often estimate age with reasonable accuracy, it cannot determine who ultimately ends up using the account. In other words, even the most sophisticated biometric systems are still grappling with human behaviour. As companies invest in age-assurance systems to demonstrate compliance, real-world enforcement frequently rests with families.

Parents: supportive but unsure

Many parents support restrictions in principle, and their concerns about mental health, cyberbullying and compulsive platform design are genuine. There is a degree of reassurance in knowing that governments are at least attempting to address the issue, yet implementation is rarely straightforward.

In reality, parents are constantly balancing competing pressures: protecting their child without isolating them socially, weighing privacy concerns against safety, and modelling healthy screen habits in their own behaviour. When “everyone else” at school still appears to be online, enforcing strict rules can feel like singling out your own child. And when a teenager is determined, monitoring and enforcement can quickly become exhausting.

While the legislation is national, the reality of enforcing it happens at home — in everyday conversations around kitchen tables and in living rooms (or, in the case of many teenagers, through a closed bedroom door!).

What it means for tech companies

For technology companies, these regulations signal more than new onboarding processes. They signal a shift in accountability. Platforms are now expected to actively prevent underage access, not simply state age limits in their terms of service. That requires investment in AI age detection, redesigned user flows and expanded parental controls. But it also creates risk.

If systems are too strict, they frustrate legitimate users. If too weak, regulators intervene. If reliant on facial estimation, privacy advocates push back. And if teens easily bypass them, the credibility of the framework is questioned.

The bigger question: accountability

The core lesson so far is this: regulation can change access points, but it doesn’t eliminate social instinct. Teenagers want connection, validation and belonging. Screens are simply where those needs are expressed. Restrictions may delay access or reshape behaviour — but they do not remove the underlying drivers.

Jonathan Haidt, author of The Anxious Generation, argues that smartphones and social media have reshaped childhood itself, contributing to rising anxiety and depression. His proposal is not only restriction, but delay — later smartphone access, later social media access, and stronger default protections supported by collective parental norms.

Parents worry, governments legislate, and schools introduce policies, while families negotiate rules at home. Yet the platforms themselves — designed to maximise engagement, engineered for compulsion and optimised for growth — remain largely unchanged. The deeper question is this: when do the companies designing these systems take meaningful responsibility for how they shape young minds?

Age checks and facial scans may satisfy regulators, but accountability extends well beyond access. It reaches into algorithmic amplification, notification loops, recommendation systems and the business models built on attention itself. While teenagers adapt, parents negotiate and governments continue to experiment with regulation, the real test is whether big tech companies are willing to fundamentally rethink how these platforms are designed.

Because the screens are still on, and responsibility cannot rest with teenagers alone.

Beyond the family home

For businesses, this debate extends beyond teenagers. Smartphones are now central to both personal and professional life, and organisations issuing mobile devices face similar questions around safeguards, controls and digital responsibility. From content filtering to device management and usage policies, the way companies deploy mobile technology increasingly reflects their wider values around safety and accountability.

If you’d like to talk through your current setup or get a quote for business mobiles, VoIP or broadband, give Simpatico a call or an email, and we can find the best tariffs and deals for you, with 5* customer service as standard.