There is a growing and substantive body of evidence indicating that unregulated access to social media platforms is causing demonstrable harm to children and young people.

This includes, but is not limited to, adverse impacts on mental health and wellbeing, diminished capacity for real-world social engagement, and increased exposure to malicious content and misinformation.

NASUWT members have identified social media use as a primary contributory factor in the deterioration of pupil behaviour in schools, including increased incidences of hostility between pupils, a decline in empathy and interpersonal respect, and the normalisation of abusive and harmful conduct.

Evidence from members further indicates that children’s sustained use of social media is associated with a reduction in attention span, concentration and capacity for sustained learning, due in part to overexposure to highly stimulating and short-form digital content.

A recent snapshot poll of 300 NASUWT members found that 88.7% support a statutory ban on social media access for children under the age of 16, with 6% opposed and 5.3% undecided.

In the NASUWT Behaviour in Schools 2025 survey, a consistent theme emerging from respondent commentary was the removal of access to social media for under-16s as a necessary intervention to support improved behaviour, safeguarding and wellbeing in schools.

NASUWT position 

NASUWT believes that greater restrictions on social media access for under-16s are required to address the growing health crisis among children and young people.

We also affirm the need to limit access to these services in order to reduce the barriers to learning that our evidence shows result from access to social media at a young age.

We hold the view that the risks of access to these platforms are greater than the benefits that they might offer to some children and young people.

We are therefore calling for the UK Government to introduce primary legislation prohibiting access to social media platforms by children and young people under the age of 16.

We will actively advocate for this legislation through formal representations, policy submissions and public communications.

Such advocacy shall include the demand for legally enforceable and robust age verification and compliance mechanisms, placing clear statutory duties on platform providers to prevent access by under-16s and to mitigate circumvention.

Frequently asked questions

What does NASUWT mean by social media?

The NASUWT position seeks greater restrictions on online platforms and services that let individuals create, share or interact with content and communicate in a networked way. This is in line with the Australian Government position, which is currently leading the way in this policy area.

This would therefore apply to platforms including Instagram, X, Snapchat, Facebook, YouTube and TikTok.

Exemptions could be sought for those platforms or services that are primarily designed for messaging, gaming or education. Examples would be WhatsApp, YouTube Kids and Google Classroom.

As with any legislation that restricts access to a group of products or services, NASUWT would expect this list to be dynamic, with a Government commitment to monitor developments and add or remove platforms over time.

What about the benefits of social media for connecting young people and offering support networks?

Research shows that for many children and young people under 16, the harms of heavy social media use outweigh the benefits.

Studies such as the US Surgeon General’s 2023 advisory have linked extensive time online with higher levels of anxiety, depression, body image problems and disrupted sleep, especially among younger teenagers.

More than this, our children should not need big tech’s commercial apps to build friendships. Rather than focus on what they might lose by not having access to social media, we need to push for a societal shift where children have space and time to connect with others in person, not online.

Is a social media ban enforceable? What if it drives children underground?

There is growing evidence that enforcing age limits is becoming more practicable. New tools, including digital IDs, parental verification systems and facial age estimation, make it increasingly realistic to check a user’s age without invading their privacy. For example, the Age Check Certification Scheme in the UK reports that its facial age technology is more than 98% accurate.

We are already seeing this reflected in policy: several US states have introduced laws that restrict social media access based on age. In the UK, the Information Commissioner’s Office has issued guidance on how to use ‘privacy-preserving’ age assurance methods, which shows how technology can verify age without collecting personal data.

Should we not teach digital literacy instead of restricting access to social media?

The reality is that no matter how much digital literacy we teach, it cannot override the fact that teenagers’ brains are still developing. They are still building impulse control and emotional regulation, which makes them more vulnerable to persuasive algorithms and constant social comparison.

The American Psychological Association’s 2023 health advisory highlights this, as does the work of the Center for Humane Technology.

Does a social media ban infringe children’s rights and freedoms?

Children also have rights to safety and healthy development. Article 3 of the UN Convention on the Rights of the Child prioritises the child’s best interests. Regulators argue that safety considerations can take precedence over rights of access in early adolescence.

As a case in point, we already restrict under-16s in other high-risk areas, such as alcohol, driving and gambling. There is a clear argument to be made that social media’s addictive design justifies similar limits.

Should social media platforms fix design issues instead of banning users?

Yes, platforms should fix the design issues. But they have been given multiple opportunities to do so and have repeatedly shown an unwillingness to act.

Without firm age limits, the platforms still tend to prioritise engagement over safety.

Even when they do add features such as parental controls or time limits, the evidence on their effectiveness is mixed. These tools do not fully solve the problem, because the underlying issues, such as social comparison and algorithmically driven feeds, remain in place.

Is there a risk that social media bans might harm vulnerable or isolated youth?

The Royal College of Psychiatrists has found that vulnerability increases online risk. Groups regarded as vulnerable or isolated, such as LGBTI or neurodivergent teenagers, often face disproportionate harassment and exposure to self-harm or eating disorder content online.

We need to be calling for safer, moderated spaces and better support services, not mainstream social media, to meet their needs for connection.
