Australia’s Online Safety Act 2021 Overview

Uploaded by Michael

AUSTRALIA

The Online Safety Act 2021 expands Australia’s protections against
online harm, keeping pace with abusive behaviour and toxic content.
It makes Australia’s existing online safety laws broader and much
stronger, reflecting the need for the law to keep up with advances in
technology and the threats people face online. These times of rapid
change and social upheaval call for robust new laws.
The new powers in the Online Safety Act cement eSafety’s role as a world
leader in online safety. They place Australia at the international forefront
in the fight against online harm.
The Act has significant implications for online service providers because it
makes them more accountable for the online safety of the people who use
their service.
The Act gives eSafety substantial new powers to protect all Australians –
adults as well as children – across most online platforms and forums
where people can experience harm.
For the first time, there is a clear set of expectations for online service
providers that holds them accountable for the safety of the people who use
their services.
The Act also requires industry to develop new codes to regulate illegal and
restricted content. This covers the most seriously harmful material, such
as videos showing sexual abuse of children or acts of terrorism, through to
content that is inappropriate for children, such as high-impact violence
and nudity.
In summary, the Online Safety Act:

 creates a world-first Adult Cyber Abuse Scheme for Australians 18 years and older.
 broadens the Cyberbullying Scheme for children to capture harms that occur on
services other than social media.
 updates the Image-Based Abuse Scheme, which allows eSafety to seek the removal of
intimate images or videos shared online without the consent of the person shown.
 gives eSafety new powers to require internet service providers to block access to
material showing abhorrent violent conduct such as terrorist acts.
 gives the existing Online Content Scheme new powers to regulate illegal and
restricted content no matter where it is hosted.
 brings app distribution services and search engines into the remit of the new Online
Content Scheme.
 introduces Basic Online Safety Expectations for online service providers.
 halves the time online service providers have to respond to an eSafety removal
notice, though eSafety can extend the new 24-hour period.
The Act sets out what the Australian Government now expects from online service
providers. It has raised the bar by establishing a wide range of Basic Online Safety
Expectations.

These expectations are designed to help make sure online services are safer for all
Australians to use. They also encourage the tech industry to be more transparent about
its safety features, policies and practices.
The Basic Online Safety Expectations are a broad set of requirements that apply to an
array of services and all online safety issues. They establish a new benchmark for
online service providers to be proactive in how they protect people from abusive
conduct and harmful content online.
eSafety now expects online service providers to take reasonable steps to keep their
users safe. We expect them to minimise bullying, abuse and other harmful activity
and content, and to have clear, easy-to-follow ways for people to lodge complaints
about unacceptable use.
The Minister for Communications, Urban Infrastructure, Cities and the Arts can
determine the expectations for certain online services. eSafety then has the power to
require online service providers to report on how they are meeting any or all of the
Basic Online Safety Expectations.
The Basic Online Safety Expectations are backed by new civil penalties for online
service providers that do not meet their reporting obligations.
eSafety will also name online service providers that do not meet the Basic Online
Safety Expectations, as well as publish statements of compliance for those that meet
or exceed expectations.

THE UNITED STATES OF AMERICA

In the United States, social media platforms have largely been left to
make and enforce their own policies, though Washington is weighing new
laws and regulations.

The “big five” tech companies – Alphabet (Google’s parent company),
Amazon, Apple, Facebook, and Microsoft – play a critical role in today’s
marketplace and in society at large. As commentators have pointed out,
the technology industry has largely developed free from any significant
federal regulation, particularly when viewed in contrast to potentially
analogous industries such as radio, television, or telephone services.
Section 230 of the Communications Decency Act (CDA) is a crucial
piece of internet legislation in the United States of America. It provides
certain legal protections to online platforms and service providers that
host or publish content generated by users. These protections shield
platforms from being held liable for the content that users post on their
platforms.

Children's Online Privacy Protection Act (COPPA)


The Children's Online Privacy Protection Act (COPPA) is a U.S. federal law that regulates
the online collection of personal information from children under the age of 13. It was
enacted to provide parents with greater control over the collection and use of their children's
personal information online. COPPA sets rules for websites and online services that are
directed towards children or knowingly collect personal information from children.
COPPA requires operators of websites and online services that are directed toward children
under 13, or that knowingly collect personal information from children under 13, to obtain
verifiable parental consent before collecting, using, or disclosing personal information from
those children. The law also mandates providing clear privacy policies and notices to parents,
and it places limitations on what can be collected from children without parental consent.
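The core COPPA rule described above – no collection of personal information from a user under 13 without verifiable parental consent – can be sketched as a simple decision function. This is a toy illustration, not legal guidance; the function and parameter names are hypothetical, and a real service would also need the notice, privacy-policy, and data-limitation duties the law imposes.

```python
from datetime import date

COPPA_AGE_THRESHOLD = 13  # COPPA applies to children under 13


def age_in_years(birth_date: date, today: date) -> int:
    """Whole years elapsed between birth_date and today."""
    years = today.year - birth_date.year
    # Subtract one year if this year's birthday has not yet occurred.
    if (today.month, today.day) < (birth_date.month, birth_date.day):
        years -= 1
    return years


def may_collect_personal_info(birth_date: date,
                              has_verifiable_parental_consent: bool,
                              today: date) -> bool:
    """Return True if personal information may be collected under a
    COPPA-style rule: users aged 13 or over, or younger users whose
    parent has given verifiable consent."""
    if age_in_years(birth_date, today) >= COPPA_AGE_THRESHOLD:
        return True
    return has_verifiable_parental_consent
```

For example, a 9-year-old user without parental consent would be refused, while the same user with verifiable consent on file would be allowed.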

The Digital Millennium Copyright Act (DMCA) is a U.S. federal law that addresses
copyright issues related to digital media and the internet. It was enacted to implement two
World Intellectual Property Organization (WIPO) treaties and to provide a legal framework
for copyright holders and online service providers to address copyright infringement in the
digital age.

A summarized overview of the DMCA:

"The DMCA consists of several key provisions, including safe harbour provisions for online
service providers, anti-circumvention rules, and provisions for addressing online copyright
infringement. The safe harbour provisions protect online service providers from liability for
copyright infringement by their users, as long as the providers meet certain conditions, such
as promptly removing infringing content when notified by copyright holders. The anti-
circumvention rules prohibit the circumvention of digital rights management (DRM)
technology that controls access to copyrighted works."
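The safe-harbour condition quoted above – promptly disabling allegedly infringing content when a copyright holder gives notice – can be sketched as a toy notice-and-takedown model. All class and method names here are hypothetical and do not correspond to any real platform's API; a real implementation would also handle counter-notices and repeat-infringer policies, which the sketch omits.

```python
from dataclasses import dataclass, field


@dataclass
class HostedItem:
    """A piece of user-uploaded content hosted by the service."""
    item_id: str
    visible: bool = True


@dataclass
class SafeHarbourHost:
    """Toy model of a host following a DMCA-style notice-and-takedown
    flow: on receiving a takedown notice, the named item is promptly
    disabled and the action is logged."""
    items: dict = field(default_factory=dict)
    takedown_log: list = field(default_factory=list)

    def add_item(self, item_id: str) -> None:
        self.items[item_id] = HostedItem(item_id)

    def receive_takedown_notice(self, item_id: str) -> bool:
        """Disable the named item if it exists and is still visible.
        Returns True if content was actually taken down."""
        item = self.items.get(item_id)
        if item is None or not item.visible:
            return False
        item.visible = False
        self.takedown_log.append(item_id)
        return True
```

The point of the sketch is the ordering of obligations: liability protection depends on the host acting on the notice, so the takedown happens immediately and unconditionally once a valid notice names hosted content.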
