Online Safety (Basic Online Safety Expectations) Determination 2022
made under section 45 of the
Online Safety Act 2021
Compilation No. 1
Compilation date: 31 May 2024
Includes amendments: F2024L00590
Registered: 27 June 2024
About this compilation
This compilation
This is a compilation of the Online Safety (Basic Online Safety Expectations) Determination 2022 that shows the text of the law as amended and in force on 31 May 2024 (the compilation date).
The notes at the end of this compilation (the endnotes) include information about amending laws and the amendment history of provisions of the compiled law.
Uncommenced amendments
The effect of uncommenced amendments is not shown in the text of the compiled law. Any uncommenced amendments affecting the law are accessible on the Register (www.legislation.gov.au). The details of amendments made up to, but not commenced at, the compilation date are underlined in the endnotes. For more information on any uncommenced amendments, see the Register for the compiled law.
Application, saving and transitional provisions for provisions and amendments
If the operation of a provision or amendment of the compiled law is affected by an application, saving or transitional provision that is not included in this compilation, details are included in the endnotes.
Editorial changes
For more information about any editorial changes made in this compilation, see the endnotes.
Modifications
If the compiled law is modified by another law, the compiled law operates as modified but the modification does not amend the text of the law. Accordingly, this compilation does not show the text of the compiled law as modified. For more information on any modifications, see the Register for the compiled law.
Self‑repealing provisions
If a provision of the compiled law has been repealed in accordance with a provision of the law, details are included in the endnotes.
Contents
Part 1—Preliminary
1 Name
3 Authority
4 Definitions
Part 2—Basic online safety expectations
Division 1—Purpose of this Part
5 Purpose of this Part
Division 2—Expectations regarding safe use
6 Expectations—provider will take reasonable steps to ensure safe use
7 Expectations—provider will consult with Commissioner and refer to Commissioner’s guidance in determining reasonable steps to ensure safe use
8 Additional expectation—provider will take reasonable steps regarding encrypted services
8A Additional expectations—provider will take reasonable steps regarding generative artificial intelligence capabilities
8B Additional expectations—provider will take reasonable steps regarding recommender systems
9 Additional expectation—provider will take reasonable steps regarding anonymous accounts
10 Additional expectation—provider will consult and cooperate with other service providers to promote safe use
Division 3—Expectations regarding certain material and activity
11 Core expectation—provider will take reasonable steps to minimise provision of certain material
12 Core expectation—provider will take reasonable steps to prevent access by children to class 2 material
Division 4—Expectations regarding reports and complaints
13 Expectations—provider will ensure mechanisms to report and make complaints about certain material
14 Additional expectations—provider will ensure service has terms of use, certain policies etc.
15 Expectations—provider will ensure service has mechanisms to report and make complaints about breaches of terms of use
16 Additional expectation—provider will make accessible information on how to complain to Commissioner
Division 5—Expectations regarding making certain information accessible
17 Additional expectation—provider will make information on terms of use, policies and complaints etc. accessible
18 Additional expectation—provider will provide updates about changes in policies, terms and conditions etc.
Division 6—Expectations regarding record keeping
19 Additional expectation—provider will keep records regarding certain matters
Division 7—Expectations regarding dealings with the Commissioner
20 Expectations—provider will provide requested information to the Commissioner
21 Additional expectations—provider will have designated contact point
Endnotes
Endnote 1—About the endnotes
Endnote 2—Abbreviation key
Endnote 3—Legislation history
Endnote 4—Amendment history
Endnote 5—Editorial changes
Part 1—Preliminary
1 Name
This instrument is the Online Safety (Basic Online Safety Expectations) Determination 2022.
3 Authority
This instrument is made under section 45 of the Online Safety Act 2021.
4 Definitions
In this instrument:
Act means the Online Safety Act 2021.
Part 2—Basic online safety expectations
Division 1—Purpose of this Part
5 Purpose of this Part
For the purposes of subsections 45(1), (2) and (3) of the Act, this Part specifies the basic online safety expectations for the following:
(a) a social media service;
(b) a relevant electronic service of any kind;
(c) a designated internet service of any kind.
Note: Subsections 6(1) and 7(1), section 11, subsections 12(1), 13(1) and 15(1), and section 20 of this instrument are made in accordance with subsection 46(1) of the Act (core expectations).
Division 2—Expectations regarding safe use
6 Expectations—provider will take reasonable steps to ensure safe use
Core expectation
(1) The provider of the service will take reasonable steps to ensure that end‑users are able to use the service in a safe manner.
Additional expectation
(2) The provider of the service will take reasonable steps to proactively minimise the extent to which material or activity on the service is unlawful or harmful.
Additional expectation
(2A) The provider of the service will take reasonable steps to ensure that the best interests of the child are a primary consideration in the design and operation of any service that is likely to be accessed by children.
Examples of reasonable steps that could be taken
(3) Without limiting subsection (1), (2) or (2A), reasonable steps for the purposes of those subsections could include the following:
(a) developing and implementing processes to detect, moderate, report and remove (as applicable) material or activity on the service that is unlawful or harmful;
(b) if a service or a component of a service (such as an online app or game) is likely to be accessed by children (the children’s service) – ensuring that the default privacy and safety settings of the children’s service are robust and set to the most restrictive level;
(c) ensuring that persons who are engaged in providing the service, such as the provider’s employees or contractors, are trained in, and are expected to implement and promote, online safety;
(d) continually improving technology and practices relating to the safety of end‑users;
(e) ensuring that assessments of safety risks and impacts are undertaken (including child safety risk assessments), identified risks are appropriately mitigated, and safety review processes are implemented, throughout the design, development, deployment and post‑deployment stages for the service;
(f) assessing whether business decisions will have a significant adverse impact on the ability of end‑users to use the service in a safe manner and in such circumstances, appropriately mitigating the impact;
(g) having staff, systems, tools and processes to action reports and complaints within a reasonable period of time in accordance with subsection 14(3);
(h) investing in systems, tools and processes to improve the prevention and detection of material or activity on the service that is unlawful or harmful;
(i) having processes for detecting and addressing hate speech which breaches a service’s terms of use and, where applicable, breaches a service’s policies and procedures and standards of conduct mentioned in section 14;
(j) preparing and publishing regular transparency reports that outline the steps the service is taking to ensure that end‑users are able to use the service in a safe manner, including:
(i) the use of online safety tools and processes;
(ii) providing metrics on the prevalence of material or activity on the service that is harmful;
(iii) the service’s responsiveness to reports and complaints; and
(iv) how the service is enforcing its terms of use, policies and procedures and standards of conduct mentioned in section 14.
Additional expectation
(5) The provider of the service will take reasonable steps to make available controls that give end‑users the choice and autonomy to support safe online interactions.
Examples of reasonable steps that could be taken
(6) Without limiting subsection (5), reasonable steps for the purposes of that subsection could include the following:
(a) making available blocking and muting controls for end‑users;
(b) making available opt‑in and opt‑out measures regarding the types of content that end‑users can receive;
(c) enabling end‑users to make changes to their privacy and safety settings.
7 Expectations—provider will consult with Commissioner and refer to Commissioner’s guidance in determining reasonable steps to ensure safe use
Core expectation
(1) In determining what are reasonable steps for the purposes of subsection 6(1), the provider of the service will consult the Commissioner.
Additional expectation
(2) In addition, in determining what are reasonable steps for the purposes of subsection 6(1), the provider of the service will have regard to any relevant guidance material made available by the Commissioner.
Note: The Commissioner may, from time to time, publish specific guidance issued to all service providers. Guidance material published by the Commissioner may include information disclosed to it under subsection 7(2), but will not include information that is commercial‑in‑confidence or which the disclosing provider does not consent to being published.
8 Additional expectation—provider will take reasonable steps regarding encrypted services
(1) If the service uses encryption, the provider of the service will take reasonable steps to develop and implement processes to detect and address material or activity on the service that is unlawful or harmful.
(2) Subsection 8(1) does not require the provider of the service to undertake steps that could do the following:
(a) implement or build a systemic weakness, or a systemic vulnerability, into a form of encrypted service;
(b) build a new decryption capability in relation to encrypted services; or
(c) render methods of encryption less effective.
8A Additional expectations—provider will take reasonable steps regarding generative artificial intelligence capabilities
(1) If the service uses or enables the use of generative artificial intelligence capabilities, the provider of the service will take reasonable steps to consider end‑user safety and incorporate safety measures in the design, implementation and maintenance of generative artificial intelligence capabilities on the service.
(2) If the service uses or enables the use of generative artificial intelligence capabilities, the provider of the service will take reasonable steps to proactively minimise the extent to which generative artificial intelligence capabilities may be used to produce material or facilitate activity that is unlawful or harmful.
Examples of reasonable steps that could be taken
(3) Without limiting subsection (1) or (2), reasonable steps for the purposes of this section could include the following:
(b) providing educational or explanatory tools (including when new features are integrated) to end‑users that promote understanding of generative artificial intelligence capabilities on the service and any risks associated with the capabilities;
(c) ensuring, to the extent reasonably practicable, that training material for generative artificial intelligence capabilities and models does not contain unlawful or harmful material;
(d) ensuring, to the extent reasonably practicable, that generative artificial intelligence capabilities can detect and prevent the execution of prompts that generate unlawful or harmful material.
8B Additional expectations—provider will take reasonable steps regarding recommender systems
(1) If the service uses recommender systems, the provider of the service will take reasonable steps to consider end‑user safety and incorporate safety measures in the design, implementation and maintenance of recommender systems on the service.
(2) If the service uses recommender systems, the provider of the service will take reasonable steps to proactively minimise the extent to which recommender systems amplify material or activity on the service that is unlawful or harmful.
Examples of reasonable steps that could be taken
(3) Without limiting subsection (1) or (2), reasonable steps for the purposes of this section could include the following:
(c) enabling end‑users to make complaints or enquiries about the role recommender systems may play in presenting material or activity on the service that is unlawful or harmful;
(d) where technically feasible, enabling end‑users to opt‑out of receiving recommended content, or providing alternative curation options.
9 Additional expectation—provider will take reasonable steps regarding anonymous accounts
Additional expectation
(1) If the service permits the use of anonymous accounts, the provider of the service will take reasonable steps to prevent those accounts being used to deal with material, or for activity, that is unlawful or harmful.
Examples of reasonable steps that could be taken
(2) Without limiting subsection (1), reasonable steps for the purposes of that subsection could include the following:
(a) having processes, including proactive processes, that prevent the same person from repeatedly using anonymous accounts to post material, or to engage in activity, that is unlawful or harmful;
(b) having processes that require verification of identity or ownership of accounts.
10 Additional expectation—provider will consult and cooperate with other service providers to promote safe use
(1) The provider of the service will take reasonable steps to:
(a) consult and cooperate with providers of other services; and
(b) ensure consultation and cooperation occurs between all relevant services provided by that provider, in order to promote the ability of end‑users to use all of those services in a safe manner.
Examples of reasonable steps that could be taken
(2) Without limiting subsection (1), reasonable steps for the purposes of that subsection could include the following:
(a) working with other service providers and between all relevant services provided by a service provider to detect high volume, cross‑platform attacks (also known as volumetric or ‘pile‑on’ attacks);
(b) sharing information with other service providers and between all relevant services provided by a service provider on material or activity on the service that is unlawful or harmful, for the purpose of preventing and dealing with such material or activity.
Division 3—Expectations regarding certain material and activity
11 Core expectation—provider will take reasonable steps to minimise provision of certain material
The provider of the service will take reasonable steps to minimise the extent to which the following material is provided on the service:
(a) cyber‑bullying material targeted at an Australian child;
(b) cyber‑abuse material targeted at an Australian adult;
(c) a non‑consensual intimate image of a person;
(d) class 1 material;
(e) material that promotes abhorrent violent conduct;
(f) material that incites abhorrent violent conduct;
(g) material that instructs in abhorrent violent conduct;
(h) material that depicts abhorrent violent conduct.
12 Core expectation—provider will take reasonable steps to prevent access by children to class 2 material
Core expectation
(1) The provider of the service will take reasonable steps to ensure that technological or other measures are in effect to prevent access by children to class 2 material provided on the service.
Examples of reasonable steps that could be taken
(2) Without limiting subsection (1) of this section, reasonable steps for the purposes of that subsection could include the following:
(a) implementing appropriate age assurance mechanisms;
(b) conducting child safety risk assessments;
(c) continually seeking to develop, support or source, and implement improved technologies and processes for preventing access by children to class 2 material.
Division 4—Expectations regarding reports and complaints
13 Expectations—provider will ensure mechanisms to report and make complaints about certain material
Core expectation
(1) The provider of the service will ensure that the service has clear and readily identifiable mechanisms that enable end‑users to report, and make complaints about, any of the following material provided on the service:
(a) cyber‑bullying material targeted at an Australian child;
(b) cyber‑abuse material targeted at an Australian adult;
(c) a non‑consensual intimate image of a person;
(d) class 1 material;
(e) class 2 material;
(f) material that promotes abhorrent violent conduct;
(g) material that incites abhorrent violent conduct;
(h) material that instructs in abhorrent violent conduct;
(i) material that depicts abhorrent violent conduct.
Additional expectation
(2) The provider of the service will ensure that the service has clear and readily identifiable mechanisms that enable any person ordinarily resident in Australia to report, and make complaints about, any of the following material provided on the service:
(a) cyber‑bullying material targeted at an Australian child;
(b) cyber‑abuse material targeted at an Australian adult;
(c) a non‑consensual intimate image of a person;
(d) class 1 material;
(e) class 2 material;
(f) material that promotes abhorrent violent conduct;
(g) material that incites abhorrent violent conduct;
(h) material that instructs in abhorrent violent conduct;
(i) material that depicts abhorrent violent conduct.
14 Additional expectations—provider will ensure service has terms of use, certain policies etc.
(1) The provider of the service will ensure that the service has:
(a) terms of use; and
(b) policies and procedures in relation to the safety of end‑users; and
(c) policies and procedures for dealing with reports and complaints mentioned in section 13 or 15; and
(d) standards of conduct for end‑users (including in relation to material that may be posted using the service by end‑users, if applicable), and policies and procedures in relation to the moderation of conduct and enforcement of those standards.
Note 1: See section 17 in relation to making this information accessible to end‑users.
Note 2: For paragraph (b), the policies and procedures might deal with the protection, use and selling (if applicable) of end users’ personal information.
(1A) The provider of the service will take reasonable steps (including proactive steps) to detect breaches of its terms of use and, where applicable, breaches of policies and procedures in relation to the safety of end‑users, and standards of conduct for end‑users.
(2) The provider of the service will take reasonable steps (including proactive steps) to ensure that any penalties specified for breaches of its terms of use, policies and procedures in relation to the safety of end‑users, and standards of conduct for end‑users, are enforced against all accounts held or created by the end‑user who breached the terms of use and, where applicable, breached the policies and procedures, and standards of conduct, of the service.
(3) The provider of the service will, within a reasonable period of time:
(a) review and respond to reports and complaints mentioned in sections 13 and 15; and
(b) take reasonable steps to provide feedback on the action taken.
(4) For the purposes of subsection (3), in determining ‘a reasonable period of time’, the provider must have regard to:
(a) the nature and impact of the harm that is the subject of the report or complaint;
(b) the complexity of investigating the report or complaint; and
(c) any other relevant matters.
(5) For the purposes of paragraph (3)(a):
(a) review means considering a report or complaint from when it is first made; and
(b) respond means taking and implementing a decision to have content removed and reported, have an end‑user banned, or other content moderation decisions, or a decision to take no action.
15 Expectations—provider will ensure service has mechanisms to report and make complaints about breaches of terms of use
Core expectation
(1) The provider of the service will ensure that the service has clear and readily identifiable mechanisms that enable end‑users to report, and make complaints about, breaches of the service’s terms of use.
(2) The provider of the service will ensure that the service has clear and readily identifiable mechanisms that enable any person ordinarily resident in Australia to report, and make complaints about, breaches of the service’s terms of use and, where applicable, breaches of the service’s policies and procedures and standards of conduct mentioned in section 14.
16 Additional expectation—provider will make accessible information on how to complain to Commissioner
The provider of the service will ensure that information and guidance on how to make a complaint to the Commissioner, in accordance with the Act, about any of the material mentioned in section 13 provided on the service, is readily accessible to end‑users.
Division 5—Expectations regarding making certain information accessible
17 Additional expectation—provider will make information on terms of use, policies and complaints etc. accessible
(1) The provider of the service will ensure that the information specified in subsection (2) is:
(a) readily accessible to end‑users; and
(b) in relation to the information mentioned in paragraph (2)(b)—accessible at all points in the end‑user experience, including, but not limited to, point of purchase, registration, account creation, first use and at regular intervals (as applicable); and
(c) regularly reviewed and updated; and
(d) written in plain language.
(2) For the purposes of subsection (1), the information is the following:
(a) the terms of use, policies and procedures and standards of conduct mentioned in section 14;
(b) information regarding online safety and parental control settings, including in relation to the availability of tools and resources published by the Commissioner.
18 Additional expectation—provider will provide updates about changes in policies, terms and conditions etc.
The provider of the service will ensure that end‑users receive updates written in plain language in relation to changes in the information specified in subsection 17(2), including through targeted in‑service communications.
Division 6—Expectations regarding record keeping
19 Additional expectation—provider will keep records regarding certain matters
The provider of the service will keep records of reports and complaints about the material mentioned in section 13 provided on the service for 5 years after the making of the report or complaint to which the record relates.
Division 7—Expectations regarding dealings with the Commissioner
20 Expectations—provider will provide requested information to the Commissioner
Core expectations
(1) If the Commissioner, by written notice given to the provider of the service, requests the provider to give the Commissioner a statement that sets out the number of complaints made to the provider during a specified period (not shorter than 6 months) about breaches of the service’s terms of use, the provider will comply with the request within 30 days after the notice of request is given.
(2) If the Commissioner, by written notice given to the provider of the service, requests the provider to give the Commissioner a statement that sets out, for each removal notice given to the provider during a specified period (not shorter than 6 months), how long it took the provider to comply with the removal notice, the provider will comply with the request within 30 days after the notice of request is given.
(3) If the Commissioner, by written notice given to a provider of the service, requests the provider to give the Commissioner specified information relating to the measures taken by the provider to ensure that end‑users are able to use the service in a safe manner, the provider will comply with the request within 30 days after the notice of request is given.
Additional expectation
(4) If the Commissioner, by written notice given to a provider of the service, requests the provider to give the Commissioner a report on the performance of online safety measures that relevant providers have announced publicly or reported to the Commissioner, the provider will comply with the request within 30 days after the notice of request is given.
Additional expectation
(5) If the Commissioner, by written notice given to a provider of the service, requests the provider to give the Commissioner a report on the number of active end‑users of the service in Australia (disaggregated into active end‑users who are children and those who are adult end‑users) during a specified period, the provider will comply with the request within 30 days after the notice of request is given.
21 Additional expectations—provider will have designated contact point
(1) The provider of the service will ensure that there is an individual who is:
(a) an employee or agent of the provider; and
(b) designated as the service’s contact point for the purposes of the Act.
Note: The provider of the service is expected to have a designated contact point regardless of whether the service has staff physically located in Australia.
(2) The provider will ensure that the following contact details of the contact point are notified to the Commissioner:
(a) an email address; and
(b) a phone number or voice chat address.
(3) If there is a change to the identity or contact details of the individual designated as the service’s contact point for the purposes of the Act, the provider will give the Commissioner written notice of the change within 14 days after the change.
Endnotes
Endnote 1—About the endnotes
The endnotes provide information about this compilation and the compiled law.
The following endnotes are included in every compilation:
Endnote 1—About the endnotes
Endnote 2—Abbreviation key
Endnote 3—Legislation history
Endnote 4—Amendment history
Abbreviation key—Endnote 2
The abbreviation key sets out abbreviations that may be used in the endnotes.
Legislation history and amendment history—Endnotes 3 and 4
Amending laws are annotated in the legislation history and amendment history.
The legislation history in endnote 3 provides information about each law that has amended (or will amend) the compiled law. The information includes commencement details for amending laws and details of any application, saving or transitional provisions that are not included in this compilation.
The amendment history in endnote 4 provides information about amendments at the provision (generally section or equivalent) level. It also includes information about any provision of the compiled law that has been repealed in accordance with a provision of the law.
Editorial changes
The Legislation Act 2003 authorises First Parliamentary Counsel to make editorial and presentational changes to a compiled law in preparing a compilation of the law for registration. The changes must not change the effect of the law. Editorial changes take effect from the compilation registration date.
If the compilation includes editorial changes, the endnotes include a brief outline of the changes in general terms. Full details of any changes can be obtained from the Office of Parliamentary Counsel.
Misdescribed amendments
A misdescribed amendment is an amendment that does not accurately describe how an amendment is to be made. If, despite the misdescription, the amendment can be given effect as intended, then the misdescribed amendment can be incorporated through an editorial change made under section 15V of the Legislation Act 2003.
If a misdescribed amendment cannot be given effect as intended, the amendment is not incorporated and “(md not incorp)” is added to the amendment history.
Endnote 2—Abbreviation key
ad = added or inserted | o = order(s)
am = amended | Ord = Ordinance
amdt = amendment | orig = original
c = clause(s) | par = paragraph(s)/subparagraph(s)/sub‑subparagraph(s)
C[x] = Compilation No. x | pres = present
Ch = Chapter(s) | prev = previous
def = definition(s) | (prev…) = previously
Dict = Dictionary | Pt = Part(s)
disallowed = disallowed by Parliament | r = regulation(s)/rule(s)
Div = Division(s) | reloc = relocated
ed = editorial change | renum = renumbered
exp = expires/expired or ceases/ceased to have effect | rep = repealed
F = Federal Register of Legislation | rs = repealed and substituted
gaz = gazette | s = section(s)/subsection(s)
LA = Legislation Act 2003 | Sch = Schedule(s)
LIA = Legislative Instruments Act 2003 | Sdiv = Subdivision(s)
(md) = misdescribed amendment can be given effect | SLI = Select Legislative Instrument
(md not incorp) = misdescribed amendment cannot be given effect | SR = Statutory Rules
mod = modified/modification | Sub‑Ch = Sub‑Chapter(s)
No. = Number(s) | SubPt = Subpart(s)
 | underlining = whole or part not commenced or to be commenced
Endnote 3—Legislation history
Name | Registration | Commencement | Application, saving and transitional provisions
Online Safety (Basic Online Safety Expectations) Determination 2022 | 23 Jan 2022 (F2022L00062) | 24 Jan 2022 (s 2(1) item 1) |
Online Safety (Basic Online Safety Expectations) Amendment Determination 2024 | 30 May 2024 (F2024L00590) | 31 May 2024 (s 2) | —
Endnote 4—Amendment history
Provision affected | How affected
Part 1 |
s 2..................... | rep LA s 48D
Part 2 |
Division 2 |
s 6..................... | am F2024L00590; ed C1
s 8..................... | am F2024L00590
s 8A.................... | ad F2024L00590
s 8B.................... | ad F2024L00590
s 9..................... | am F2024L00590
s 10.................... | am F2024L00590
Division 3 |
s 12.................... | am F2024L00590
Division 4 |
s 14.................... | am F2024L00590
s 15.................... | am F2024L00590
Division 7 |
s 20.................... | am F2024L00590; ed C1
s 21.................... | am F2024L00590
Endnote 5—Editorial changes
In preparing this compilation for registration, the following kinds of editorial change(s) were made under the Legislation Act 2003.
Subsection 6(3)
Kind of editorial change
Give effect to the misdescribed amendment as intended and change to capitalisation
Details of editorial change
Schedule 1 item 2 of the Online Safety (Basic Online Safety Expectations) Amendment Determination 2024 instructs to omit “without limiting subsection (1) or (2), reasonable steps for the purposes of this section” and substitute “without limiting subsection (1), (2) or (2A), reasonable steps for the purposes of those subsections” in subsection 6(3).
The text “without limiting subsection (1) or (2), reasonable steps for the purposes of this section” does not appear in subsection 6(3). However, the text “Without limiting subsection (1) or (2), reasonable steps for the purposes of this section” does appear.
The substituted text appears at the start of a sentence and should begin with a capital letter.
This compilation was editorially changed to omit “Without limiting subsection (1) or (2), reasonable steps for the purposes of this section” and substitute “Without limiting subsection (1), (2) or (2A), reasonable steps for the purposes of those subsections” in subsection 6(3) to give effect to the misdescribed amendment as intended and to correct the capitalisation.
Subsection 20(5)
Kind of editorial change
Give effect to the misdescribed amendment as intended
Details of editorial change
Schedule 1 item 17 of the Online Safety (Basic Online Safety Expectations) Amendment Determination 2024 provides as follows:
Additional expectation
(5) If the Commissioner, by written notice given to a provider of the service, requests the provider to give the Commissioner a report on the number of active end‑users of the service in Australia (disaggregated into active end‑users who are children and those who are adult end‑users) during a specified period, the provider will comply with the request within 30 days after the notice of request is given.
The instruction to insert the subsection is missing.
This compilation was editorially changed to insert subsection 20(5) after subsection 20(4) to give effect to the misdescribed amendment as intended.