Online Safety Act

Introduction
Welcome to your Tes course Online Safety Act.

The course will give an overview of what the Online Safety Act covers and how children and young people will be
protected from potential harms on the internet and social media platforms.

The Online Safety Act aims to protect both children and adults from online harm; however, this course will
focus on how it protects children and young people.

On completion of this course, you will:


• understand what the Online Safety Act aims to do
• consider the impact social media currently has on children
• understand how the Online Safety Act protects children
• consider how the Act influences a whole school approach to online safety.

What is the Online Safety Act?


The Act makes social media companies legally responsible for keeping their users safe on their platforms. It
enforces a range of practices to ensure this responsibility is upheld.

Originally proposed as an Online Harms Bill, the Act has roots in the Internet Safety Strategy green paper (2017) and the Online
Harms white paper (2019). The consultation and subsequent white paper covered many topics on internet safety
which the Online Safety Act now addresses. The purpose of both papers was to investigate how Britain could
become ‘the safest place in the world to be online.’

The Online Safety Act aims to fulfil this.

The Act was first published in draft in May 2021 and was introduced to the House of Lords on 18th January 2023. After
several readings and amendments, it received Royal Assent on 26th October 2023.

Broadly, the Act covers how:


• social media companies must protect children, including:
o preventing children from accessing illegal or harmful content
o preventing underage children from using social media platforms.
• social media companies must protect adults
• social media companies will prevent repeat offenders from using their platforms
• Ofcom, as the regulator in charge, will be responsible for checking social media companies are abiding by the
law
• Ofcom can take action against international social media companies that are accessible to UK users if the law
is not followed.

Whilst Tes Global Ltd have made every effort to ensure that the courses and their content have been devised and written by leading experts who have ensured that they reflect best practice in all
aspects, Tes Global Ltd exclude their liability for the consequences of any errors, omissions or incorrect statements to the fullest extent permitted by law, and Tes Global Ltd make no warranty or
representation as to the accuracy, completeness or fitness for purpose of any statements or other content in the course.

No part of this material may be reproduced or utilised in any form or by any means, electronic or mechanical, including photocopying, recording or by any information storage and retrieval system
without permission in writing by Tes Global Ltd.
Ofcom will be working to ensure that the social media providers have the systems in place to protect their users and
that these systems are working. They will be concerned with any case of systemic failure on the part of the provider,
rather than any individual case.

Prevalence of potential harm online


According to the Online Safety Data Initiative, 80% of children aged 12-15 have had potentially harmful experiences
online.

The NSPCC suggests more than 3,500 online sex crimes against children take place each month.

In 2020, the National Center for Missing &amp; Exploited Children (NCMEC) received almost 22 million reports of child sexual exploitation, which included 65 million images,
videos, or files.

And according to the parental control tool Bark, which analysed more than 3.4 billion online messages in 2021:
• 85% of teens experienced bullying as a bully, victim, or witness
• 56.4% of teens had conversations about depression
• 74.61% of teens were involved in a self-harm or suicide situation
• 7.66% of teens encountered content about disordered eating.

Taking responsibility
As social media companies will be legally responsible for children’s safety, they will be required to:
• remove illegal content quickly or have measures in place to prevent it in the first place
• prevent children from accessing age-inappropriate content or content that is legal but still has the potential
to cause trauma
• enforce age limits on their social media platforms
• publish risk assessments about the dangers posed to children on large social media platforms
• create accessible ways for children and their caregivers to report issues, problems, or concerns.

Illegal content

To protect children and young people from harm, the Act sets out a range of illegal content that they must be
prevented from viewing online.

Social media companies will be required to remove content that shows:


• child sexual abuse
• controlling or coercive behaviour
• extreme sexual violence
• fraud
• hate crime
• illegal immigration and people smuggling
• revenge porn
• sexual exploitation
• terrorism.

Companies must also remove content that:


• promotes self-harm

• incites violence
• promotes or facilitates suicide
• sells illegal drugs or weapons.

For the first time, content that promotes self-harm is classified as illegal content.

Harmful content

The Act also sets out a range of content which is not illegal but could still cause significant harm to children and young
people if viewed. The Act requires social media companies to protect children and young people from
this content.

This includes:
• pornographic content
• content that does not meet a criminal threshold but which promotes, encourages or provides instructions
for suicide, self-harm or eating disorders
• content that depicts or encourages serious violence
• bullying content.

Tackling underage users


A further impact of the Act on children and young people is the tightening of controls around age limits on social
media websites and platforms.

Currently, the minimum age for many social media platforms is set at 13 years old, meaning anyone under the age of
13 should not be using the platform. However, an Ofcom report revealed that “33% of parents of 5-7s said their child had
a [social media] profile, and 60% of 8-11s said they had one.”

The Act aims to prevent this by requiring all social media platforms to check the age of their users. This is possible
through age assurance technologies.

Social media companies must report what technology they are using to do this and evidence their policy and
procedures for checking the age of users.

Whole school approach


Question
How do you think the Online Safety Act will affect your teaching practice and your school more widely?

Schools still have a responsibility to safeguard all children and young people from harm. Whilst the Act will make
social media companies responsible for what content their users see, this does not remove any accountability or
duty for school staff to protect children.

The most effective way to prevent online harm is to adopt a whole school approach to online safety. Although the
Online Safety Act will help, education remains a key mechanism for helping children to manage risk online.

All students should receive a consistent message throughout their time at school which encourages a safe
environment to talk openly about and act upon potential harms.

Setting a culture and ethos that supports online safety will reassure students that you will take their concerns
seriously and will support them.

Statutory guidance and the Act

The statutory guidance Keeping Children Safe in Education increasingly reflects the importance of schools keeping
children safe online, most recently by ensuring all schools have appropriate filtering and monitoring systems in
place.

The Act will not replace these measures but should instead strengthen the school’s approach. You should remember
it is part of the defence against online harms, not a complete solution.

Remember, taking a whole school approach means online safety is the responsibility of all staff.

You can find more detailed information and practical advice on how to integrate a whole school approach to online
safety in our course Online Safety.

Summary
You have now completed your course Online Safety Act, in which you have learned what the Online Safety Act is and
how its implementation will affect children and young people’s experience of being online.

You have also considered what implications the Online Safety Act will have for embedding a whole school
approach to online safety.

There are links to all the legislation and reference materials in the Resources section of this course.

You are now ready to complete the questionnaire. Click Questionnaire to begin.
