Why Banning Teens from Social Media Won't Fix Body Image Issues
Australia's age-gate experiment misses what young people actually need
The case for banning teenagers from social media feels compelling. We've all seen the damage: filters that reshape faces until the mirror becomes the enemy, comparison culture that makes every beach photo a source of anxiety, algorithms that serve up impossible beauty standards and commentary abuse alongside your morning cereal.
Writer Emily Clarkson—who recently interviewed Helen about naturism on her podcast—captured it perfectly in her Substack article: "Girls are barely able to recognise themselves any more. The mirror is competing with a screen in a battle it can't possibly win."
She's not wrong. But the question is whether banning teenagers from these platforms will actually help.
As naturists—as people who've experienced firsthand how seeing real, diverse bodies transforms shame into acceptance—we have a perspective this debate desperately needs. Because the solution isn't to rinse-and-repeat every previous generation's mistake: trying to protect our young people by hiding information from them.
The research suggests prohibition isn't just ineffective; it's exactly backwards.
What the Evidence Actually Shows
This week, British Naturism published research from national surveys about young people's attitudes toward nudity. The headline finding, which builds on a 2022 Ipsos poll showing 14% of UK adults already describe themselves as naturists or nudists, challenges everything we assume about social media's impact: young adults (aged 16-24) are more open-minded about nudity than their parents or grandparents.
Not less. More.
They're more willing to engage in social nudity. More positive toward naturism. More comfortable with diverse bodies. This is the generation that grew up with Instagram and Snapchat. The ones we're told are drowning in body dysmorphia and comparison culture. And yet they're more body-positive than older generations.
How?
The Numbers That Change Everything
The research reveals something crucial about what's actually happening.
Yes, 39% of 16-24 year-olds compare their bodies with influencers and models. (That number drops sharply after age 45.) So the harm is real and measurable.
But here's what's missing from the ban-it-all argument: 65% of 18-29 year-olds say that seeing more naked people of all shapes and sizes would improve their body confidence (2021 Naturist Education Foundation poll).
Stop and think about what that means.
The same generation experiencing harm from filtered perfection is telling us the solution. Not fewer images. More diverse ones. Not less social media, but better content.
This aligns with Professor Keon West's research on "Good Nudes and Bad Nudes," which found that naturist exposure and casual stripping reduced anxiety, while sexualised images increased it. The type of content matters enormously.
Young people don't want us to take away their phones; they want us to fix what's on them.
When Solutions Don't Survive Contact with Reality
Australia's world-first ban took effect this week. After the grace period to implement changes, platforms face fines up to AU$49.5m (£25m, €28m, US$32m) for letting underage users through.
The BBC interviewed a 13-year-old who bypassed Snapchat's age verification in under five minutes. She simply used a photo of her mother. "It said thanks for verifying your age." That's the technology that's supposed to protect an entire generation.
Amnesty Tech called the law "an ineffective fix that ignored the rights and realities of younger generations." Experts predict widespread circumvention through VPNs, borrowed devices, and parents' or siblings' accounts.
This is what happens when one generation tries to solve another generation's problems without understanding how they think or how the technology actually works.
Three years ago the authors wrote about this same dynamic in naturist photography policies. Traditional venues ban cameras entirely: a top-down rule that feels protective to older members but alienates younger visitors who can't share experiences in the "pictures or it didn't happen" culture they've grown up in. The venues that thrived? The ones that adopted "respect, not rules" approaches: photography allowed, but with explicit consent.
When the ban fails—and it will—what happens to our teenagers then?
The Worst of Both Worlds
Once we accept that teenagers will still access social media under a ban (and they will), the question becomes: what will that access look like?
With a ban, they'll access it:
- Without parental guidance or adult supervision
- Through workarounds in spaces where legitimate educators can't reach them
- Where legitimate body-positive content creators are legally blocked from making youth-accessible content
The harmful content—the filtered influencers, the comparison culture, the impossible standards—doesn't disappear. Worse, it becomes the only content, because the 65% solution (diverse, real bodies) is blocked while the 39% problem (filtered perfection) stays accessible through workarounds. Helen's Women in Naturism Facebook group, for instance, legally couldn't reach 'underage' users.
Meanwhile, the filtered influencers teens access through their older sibling's phone? Still there.
The Photography Paradox
The British Naturism research found something fascinating about young people and visibility. They're understandably cautious about public nudity in places where they might be photographed. As one respondent noted, "no generation in history has lived under the constant visibility created by social media."
But at the North East Skinny Dip, an event with 1,300 attendees, something unexpected happened. The photography-allowed area attracted more younger people. A couple in their early twenties, described as completely ordinary, actively asked someone to take their photos on their own phone.
What changed? Control.
When young people control the camera—when it's their phone, their choice, their consent—they actively seek photo opportunities. The problem isn't photography or social media or visibility itself. It's lack of agency.
This is the "respect, not rules" principle: young people don't need protection from technology. They need education about how to use it safely, and tools to maintain control.
What Would Actually Address the Harm
The concern is genuine, and it should be: the harm is real. So we need solutions that will actually work. The answer isn't banning access; it's regulating the content.
Platforms should be required to:
- Prominently label digitally altered body images. Like nutrition labels on food, viewers deserve to know when encountering filtered imagery. If influencers reshape their waist or smooth their skin, disclosure must be clear.
- Mandate body diversity in algorithmic recommendations. Platforms can't serve endless streams of one body type. Algorithms must actively surface diverse shapes, sizes, ages, abilities.
- Remove access to cosmetic body-altering filters from under-18s. The puppy ears can stay. Filters that slim waists, reshape noses, and smooth skin must go.
- Transparently report algorithmic amplification. People deserve to know what content gets pushed to them and why. Social media companies have prioritised conflict to promote engagement.
- Impose heavy fines for promoting harmful content. Not just removing eating disorder content when reported, but penalties for algorithmic amplification of harm to any age group.
These aren't unprecedented demands. The EU's Digital Services Act already requires algorithmic transparency. The UK's Online Safety Act establishes content moderation obligations. GDPR proved that global platforms will comply when regulations have teeth. Content regulation is politically feasible—it just requires will, not technological breakthrough. Platforms like Bluesky already demonstrate alternatives: content labelling, self-classification with moderated oversight, temporal feeds instead of algorithmic manipulation, distinguishing artistic nudity from sexual content.
These solutions address the actual problem: toxic content algorithmically amplified for profit.
What Young People Are Telling Us
Saoirse, 21, as quoted in the British Naturism research, described it clearly: "I think the biggest challenge for our generation is probably social media. In our parents' generation they didn't have the same access to images of idealised bodies."
She understands the problem. She can articulate the harm. This matters, because it means young people aren't naive victims needing parental protection. They're capable of analysis, critical thinking, and proposing solutions.
What they need isn't prohibition—it's education and access to body-positive content that could actually make a difference.
According to the 2025 Membership Marketing Benchmarking Report, the organisations most successful with young people share one thing in common: strong digital strategies. Those are the organisations that thrive.
British Naturism has seen significant membership growth post-COVID, driven largely by social media normalisation. Body-positive campaigns reach thousands through digital platforms. Content creators with thousands of followers are actively combating harmful stereotypes.
Ban teenagers from these platforms, and we lose the most effective tool we have for reaching them with positive messages during the crucial years when body image attitudes crystallise.
Professor Keon West's research has consistently demonstrated the benefits of naturist exposure. His study in Children & Society found that childhood exposure to naturism produces higher body appreciation and better psychological adjustment, while his work on communal naked activity showed it increases body appreciation by reducing social physique anxiety. Miss that window during the teenage years, and the damage becomes much harder to undo.
The Balance We Actually Need
The British Naturism research concludes: "Understanding this balance—enthusiasm for nudity combined with awareness of visibility—is crucial for supporting healthier attitudes toward bodies and self-acceptance in an increasingly connected world."
Balance. Not prohibition. Not unregulated chaos. Balance.
The 39% of young people comparing themselves to influencers are suffering real damage. But the 65% telling us more diverse bodies would help? They're also right.
We can address both. Regulate out the filtered lies. Mandate the diverse truth. Educate teenagers—both online and through schooling—about consent, control, permanence, and critical consumption. Give parents better education on the technology their children instinctively know how to control. Give them better tools to filter out content, not ban the platforms. Hold corporations accountable for algorithmic toxicity.
That's what balance looks like.
Before the UK Makes the Same Mistake
Australia's law took effect this week. UK politicians will be watching. "Ban teenagers from social media" fits neatly into a headline, and simple solutions are politically attractive, even when they don't work.
But we know better.
We know that seeing real bodies—diverse, normal, unfiltered bodies—and supportive comments, not harassing ones, is how shame becomes acceptance.
We know that "respect, not rules" succeeds where top-down control doesn't. History teaches us that prohibition without education always fails.
And we know what the research says: young people are already more open to body diversity than older generations, despite growing up with Instagram and TikTok. They're more willing to engage with naturism, more comfortable with nudity, more accepting of diverse bodies.
They achieved this while navigating filtered influencers, comparison culture and sexualised clothing. Imagine what they could achieve if we actually regulated harmful and sexualised content instead of banning their access to body-positive alternatives.
The concerns about social media's impact are valid. The harm is real and documented. But the solution is to trust young people with better information—the real, diverse, unfiltered truth about what human bodies actually look like.
That's what naturism has always understood. It's what the evidence supports. And it's what young people themselves are asking for.
Perhaps it's time policymakers listened.
If you found this analysis useful, please share it. Evidence-based advocacy depends on people willing to examine claims critically rather than accepting inflammatory headlines at face value.