A pair of bills has advanced in Australia’s Parliament that would give the government additional powers to combat objectionable and harmful material online, even though a number of stakeholders have decried the legislation. The House of Representatives passed both measures, the “Online Safety Bill 2021” and the “Online Safety (Transitional Provisions and Consequential Amendments) Bill 2021,” as amended and sent them to the Senate, which may now consider the bills.
The two bills would broadly expand the eSafety Commissioner’s powers to police abusive material online, update existing laws, shorten the time online platforms and others would have to remove objectionable material on direction from the eSafety Commissioner from 48 to 24 hours, pull more entities into regulation, including search engines like Google, and increase criminal penalties for people who violate these laws.
However, Australia is not the only nation seeking to refresh and update its laws to counter what many claim is the growing tide of abusive and extremist material online. In the United States (U.S.), there have been calls from both political parties and many stakeholders to revise 47 U.S.C. 230 (aka Section 230), the liability shield many technology companies have to protect them from litigation arising from content they allow others to post. In the United Kingdom, the Secretary of State for Digital, Culture, Media and Sport (DCMS) and the Secretary of State for the Home Department (Home Department) presented the Johnson government’s response to the “Online Harms White Paper” that “set out the extensive evidence of illegal and harmful content and activity taking place online…[and] highlighted the prevalence of the most serious illegal harms which threaten our national security and the physical safety of children.” This response sketched the contours of the goals and parameters of a forthcoming Online Safety Bill.
In the supplemental explanatory memorandum on the “Online Safety Bill 2021,” the Parliament explained:
- The purpose of the Online Safety Bill (the Bill) is to create a new framework for online safety for Australians.
- The Bill, together with the Online Safety (Transitional Provisions and Consequential Amendments) Bill 2021, will create a modern, fit for purpose regulatory framework that builds on the strengths of the existing legislative scheme for online safety. In particular, the Bill will:
- retain and replicate provisions in the Enhancing Online Safety Act 2015 (EOSA) that are working well to protect Australians from online harms, such as the non-consensual sharing of intimate images scheme;
- articulate a core set of basic online safety expectations to improve and promote online safety for Australians;
- reflect a modernised online content scheme to replace the schemes in Schedules 5 and 7 of the Broadcasting Services Act 1992 (BSA) to address harmful online content;
- create a new complaints-based, removal notice scheme for cyber-abuse being perpetrated against an Australian adult;
- broaden the cyber-bullying scheme to capture harms occurring on services other than social media;
- reduce the timeframe for service providers to respond to a removal notice from the eSafety Commissioner from 48 to 24 hours;
- bring providers of app distribution services and internet search engine services clearly into the remit of the new online content scheme;
- establish a specific and targeted power for the eSafety Commissioner to request or require internet service providers (ISPs) to disable access to material depicting, promoting, inciting or instructing in abhorrent violent conduct, for time-limited periods in crisis situations, reflecting industry’s call for Government leadership on this issue.
- This supplementary explanatory memorandum responds to concerns raised by the Senate Environment and Communications Legislation Committee in its report of its Inquiry into the Online Safety Bill 2021 and the Online Safety (Transitional Provisions and Consequential Amendments) Bill 2021 of 12 March 2021.
- The Government amendments include:
- additional transparency requirements for the Commissioner’s annual report under clause 183 to include reports on investigations and the use of powers under the Bill;
- new provisions requiring the Commissioner to establish an internal review scheme for decisions made by the Commissioner under the Bill and publish the scheme on the Commissioner’s website; and
- amendments to the Explanatory Memorandum to address matters raised in consideration of the Bill by the Senate Environment and Communications Legislation Committee.
In the explanatory memorandum, the Parliament explained that the Online Safety (Transitional Provisions and Consequential Amendments) Bill 2021
is part of a legislative package, including the Online Safety Bill 2021, which will create a stronger online safety framework for Australians. The Bill deals with:
Consequential matters arising from the enactment of the Online Safety Bill;
Increases to the maximum penalty for using a carriage service to menace, harass or cause offence under section 474.17 of the Criminal Code Act 1995 (Criminal Code) from three years’ imprisonment to five years’ imprisonment; and
Amendments to the aggravated offences and special aggravated offences in sections 474.17A and 474.17B of the Criminal Code to address issues of alignment and proportionality with the amendments to section 474.17 of the Criminal Code.
Schedule 1 to the Bill repeals the Enhancing Online Safety Act 2015 because, when passed, the Online Safety Bill will provide for much of the framework currently under the Enhancing Online Safety Act 2015, as well as provide additional powers to the Commissioner to keep Australians safe online.
Schedule 2 contains consequential amendments to other Acts arising from the enactment of the Online Safety Bill.
Schedule 3 contains transitional provisions for matters relating to the Commissioner, including the Commissioner’s appointment, powers, obligations, investigations, liabilities, protections and applicability of notices at the time of enactment of the Online Safety Bill. Schedule 3 also contains application provisions relating to amendments in the Criminal Code.
Before this legislation was considered in the House, the Senate’s Environment and Communications Legislation Committee reported on both measures after receiving submissions from interested parties, many of whom pointed to flaws or shortcomings in the bills. The committee recommended that the bills be passed and further recommended:
The committee recommends that the Australian Government consider amending the Explanatory Memorandum to the Online Safety Bill 2021 to clarify that the requirement for an industry code to be registered within six months is for best endeavours and that the Commissioner has the discretion to work with industry over whatever timeframe is deemed necessary to achieve an effective outcome.
As mentioned, a number of stakeholders took issue with the previous drafts of the bills. Google offered the following recommendations, some of which were accepted by the government:
- Government should acknowledge that there is a shared responsibility to foster online safety between industry, government, parents/carers, NGOs and civil society.
- The focus of Basic Online Safety Expectations (BOSE) should be on practical best efforts and overall processes, while avoiding being overly prescriptive.
- Any preemptive and preventative action recommended under the BOSE should be coupled with a ‘Good Samaritan’ framework that incentivises companies to take these proactive measures without risking liability for occasional missteps in that process.
- Transparency reporting requirements should be flexible, and, if there are to be any sanctions attached to them, they should focus on systemic failures.
- Any expansion to the scope of services subject to both the cyber bullying and cyber abuse schemes should be carefully limited and tailored, recognising relevant differences between services. Rules that make sense for social networks, for instance, do not necessarily make sense for other types of platforms or services.
- If the cyber abuse scheme were to be extended to adults, it is crucial that the definition of relevant content be tied to the Criminal Code.
- Regarding removal turnaround times, we strongly suggest that a more workable standard would be one that instructed online platforms to remove content “with all due speed,” “without undue delay,” or “expeditiously” and without a fixed 24 hour turnaround. We also call attention to the numerous comments made by the eSafety Commissioner that businesses typically do respond expeditiously to requests to remove content.
- The proposed accreditation scheme for safety tools does not provide clear utility. It would entail considerable resources to set up and administer, and would be very slow.
- On the subject of blocking terrorist and extreme violent material online, appropriate legislative instruments already exist to address these issues efficiently, and, to the extent any new instruments are introduced, it is essential that they be narrowly tailored to address only those ‘worst of the worst’ platforms and services that willfully and systematically fail to respond to valid legal removal requests regarding specific items of identified content.
- For ancillary services, any additional powers should specifically focus on notice-and-takedown of specific illegal material.
- In the context of governance, any increase in the powers and responsibilities of the Office of the eSafety Commissioner should be accompanied by a formal framework of multi-stakeholder oversight into the policy direction and decisions being made by the Office.
Australia’s Centre for Responsible Technology made these recommendations:
- Develop a more sophisticated and broader definition of online safety, working with civic groups, independent policy bodies, experts and academia to use devices like the Online Safety Spectrum to create more holistic and effective online safety solutions.
- Place more emphasis on the responsibility of the online platforms in addressing online harms.
- Allow the public to view reports and determinations on compliance with standards, releasing regular reports on current and past incidents from online platforms.
- Make full use of their ability to develop industry codes and standards.
- Specifically address the growing issue of disinformation as part of the online safety landscape, more actively promote and develop the codes and actions around disinformation, including the voluntary code for disinformation currently being developed.
The Australian Lawyers Alliance (ALA) stated:
The ALA submits that this Bill should not proceed in its current form as it invests excessive discretionary power in the eSafety Commissioner and also the Minister with respect to the considerations of community expectations and values in relation to online content. The ALA considers that the Bill does not strike the appropriate balance between protection against abhorrent material and due process for determining whether content comes within that classification.
Red Files Inc., “a not-for-profit charity that aims to prevent violence against and exploitation of sex workers in Australia,” argued:
- Part 9 of The Bill inadvertently contravenes its primary goals and purpose of protecting Australians from harmful online content. This is because Part 9 relates to, amongst other things, online pornography depicting consenting adults. Red Files fears Part 9 will result in a chain of unintended consequences that will do more harm than good – namely the removal of porn from all social media platforms or even blocking Australian users from internet services.
- Part 9 is inconsistent with the rest of The Bill, which relates to cyber bullying, revenge porn, cyber abuse material and online material that promotes violence, and as such should be removed entirely.
The Office of the Australian Information Commissioner (OAIC) asserted:
- Division 7 of Part 9 of the Bill sets out a framework for the development of industry codes and industry standards for sections of the online industry. ‘Sections of the online industry’ include social media services, relevant electronic services, providers of designated internet services, internet search engine services and app distribution services (see cl 135).
- Clause 138 of the Bill sets out examples of matters that may be managed through industry codes and industry standards. The examples include, but are not limited to, ‘procedures directed towards the achievement of the objective of ensuring that online accounts are not provided to children without the consent of a parent or responsible adult.’
- There is some alignment between these definitions and proposed content and the Government’s proposed binding online privacy code, which will apply to social media platforms and other online platforms that trade in personal information. The privacy code will require these entities to:
- be more transparent about data sharing
- meet best practice consent requirements when collecting, using and disclosing personal information
- stop using or disclosing personal information on request, and
- comply with specific rules to protect the personal information of children and vulnerable groups.
- The OAIC considers that these alignments present an opportunity to address issues of common concern through a coordinated online safety and data protection regime. Ongoing consultation and cooperation between the OAIC and the Office of the eSafety Commissioner will continue to ensure that the distinct but complementary roles of privacy and online safety work effectively and comprehensively to address online risks and harms.
In December 2020, the Department of Infrastructure, Transport, Regional Development and Communications (Department) published a draft “Online Safety Bill” for consultation that would modify four existing statutes that aim to protect people online and introduce a new regulatory scheme. The Department claimed in its press statement that the legislation includes:
- The provisions in the Enhancing Online Safety Act 2015 (EOSA) that are working well to protect Australians from online harms, such as the image-based abuse scheme;
- A set of core basic online safety expectations for social media services, relevant electronic services and designated internet services, clearly stating community expectations, with mandatory reporting requirements;
- An enhanced cyberbullying scheme for Australian children to capture a range of online services, not just social media platforms;
- A new cyber abuse scheme for Australian adults, to facilitate the removal of serious online abuse and harassment;
- A modernised online content scheme, to replace the schemes in Schedules 5 and 7 of the Broadcasting Services Act 1992 (BSA). The Bill will create new classes of harmful online content and will reinvigorate out of date industry codes to address such content;
- New abhorrent violent material blocking arrangements that allow the eSafety Commissioner to respond rapidly to an online crisis event such as the Christchurch terrorist attacks, by requesting internet service providers block access to sites hosting seriously harmful content; and
- Consistent take-down requirements for image-based abuse, cyber abuse, cyberbullying and harmful online content, requiring online service providers to remove such material within 24 hours of receiving a notice from the eSafety Commissioner.
In a Reading Guide, the Department asserted “[t]he Bill proposes five schemes to deal with different types of harmful online material. Four already exist in law (but are being appropriately updated)…[and] [o]ne is new – the adult cyber abuse scheme:
- Cyber-bullying Scheme – Provides for the removal of material that is harmful to Australian children. This scheme reflects the current regime in the Enhancing Online Safety Act (EOSA), however reduces the take-down time for such material from 48 hours to 24 hours and extends the scheme to more services.
- Adult Cyber-abuse Scheme – Provides for the removal of material that seriously harms Australian adults. This scheme is new. It extends similar protections in the cyber-bullying scheme to adults, however with a higher threshold of ‘harm’ to reflect adults’ higher levels of resilience.
- Image-based Abuse Scheme – Provides for the removal of intimate images shared without the depicted person’s consent. This scheme reflects the current regime in the EOSA, however reduces the take-down time for such material from 48 hours to 24 hours.
- Online Content Scheme – Provides for the removal of harmful material in certain circumstances. This scheme reflects and simplifies the current regime in Schedules 5 and 7 of the BSA, with some clarifications of material and providers of services captured by the scheme, and extending the eSafety Commissioner’s take-down powers for some material to international services in some circumstances.
- Abhorrent Violent Material Blocking Scheme – Provides for the blocking of abhorrent violent material, such as images or video of terrorist attacks. This scheme is new, but mirrors existing legislation in the Criminal Code Act 1995 (the Criminal Code).
Not surprisingly, under the bill, providers of online services and materials will have increased obligations. The Department stated “[t]he Basic Online Safety Expectations (BOSE) framework is an enhancement of the basic online safety requirements, coupled with new powers for the eSafety Commissioner to require service providers to report on compliance with the BOSE.” The Department explained that BOSE “will include, in legislation, core expectations that:
- End-users are able to access services in a safe manner;
- The extent of harmful material is minimised;
- Technological or other measures are in effect to prevent access by children to class 2 materials; and
- There are clear and readily identifiable mechanisms that enable end-users to report and make complaints about harmful material.
© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.