Issued by the authority of the Minister for Communications
Online Safety Act 2021
Online Safety (Age-Restricted Social Media Platforms) Rules 2025
Authority
The Online Safety (Age-Restricted Social Media Platforms) Rules 2025 (the Rules) are made by the Minister for Communications (the Minister) under the Online Safety Act 2021 (the Act).
Paragraph 240(1)(a) of the Act provides that the Minister may, by legislative instrument, make rules prescribing matters required or permitted by the Act to be prescribed by legislative rules.
Paragraph 63C(6)(b) of the Act provides that the Minister may, by legislative instrument, make rules that exclude specified electronic services from the definition of an ‘age-restricted social media platform’.
Purpose and operation
Section 63D of the Act, which requires age-restricted social media platforms to take reasonable steps to prevent users under 16 years of age from having accounts, was introduced by the Online Safety Amendment (Social Media Minimum Age) Act 2024 (the SMMA Act).
The Rules specify classes of electronic services that are not age-restricted social media platforms.
The Rules are a legislative instrument for the purposes of the Legislation Act 2003 and are subject to the default sunsetting requirements and disallowance.
Details of the instrument are set out in Attachment A.
Context
The SMMA Act, passed on 29 November 2024 with bipartisan support, introduces a minimum age for having an account on certain social media platforms. This landmark reform reflects Australians’ expectations for a strong regulatory response to addressing online harms experienced by children and young people on these services.
There is a growing evidence base indicating an association between social media use and harms to health, particularly due to features designed to induce users to spend increasing amounts of time on platforms, and features that expose children and young people to inappropriate content such as highly idealised and unhealthy social and body image comparisons. A range of design features is associated with these harms, including personalised and algorithmically recommended content, endless content feeds (such as infinite scroll and auto-play), engagement prompts (such as notifications), quantifiable social metrics (such as the “like” feature), ephemeral content and time-sensitive rewards (such as stories and streaks), and emerging AI-driven features such as content modification tools.
A recent study by the eSafety Commissioner examined a subset of data from more than 2,600 children and young people aged 10 to 15, drawn from its Keeping Kids Safe Online survey, to understand the types of online harms children and young people experience. The study found that around 7 in 10 children and young people said they had encountered content associated with harm, including exposure to sexist, misogynistic or hateful material, dangerous online challenges, violent fight videos and content promoting disordered eating. Additionally, 75 per cent of this content associated with harm was most recently encountered on social media.
The American Psychological Association (APA) outlines several key indicators of addictive and problematic social media use, such as a persistent need to engage with social media despite a desire to engage less, excessive effort to maintain continuous access to social media, strong cravings for social media interaction and disruption to daily activities when social media access is unavailable.[1] These behaviours, driven by persistent notifications and targeted algorithmic features, disrupt critical physiological functions and impair children and young people’s ability to complete daily tasks and routines. Social media use has also been explicitly linked to poor sleep and, subsequently, higher depressive symptoms in children and young people.[2]
Children and young people exhibiting problematic social media use may also consistently exceed the duration of time they intend to use social media, engage in deceptive behaviours to secure access, and experience negative impacts on significant relationships or educational pursuits due to their social media engagement.[3] This behavioural pattern exhibits parallels with other forms of behavioural dependency, creating concern for young minds that are particularly susceptible to habit formation.
A study that followed nearly 12,000 children and young people for three years, starting at ages 9 to 10, has linked an increase in social media use to a rise in depressive symptoms. Daily social media use among study participants surged from 7 minutes per day at age 9 to 74 minutes per day by age 13. This increase paralleled a reported 35 per cent jump in depressive symptoms in study participants.[4] These findings suggest that more time spent on social media during early adolescence may contribute to increased depressive symptoms over time.
The APA’s health advisory also explicitly correlates ‘using social media for social comparisons related to physical appearance’ with diminished body image and heightened depressive symptoms, particularly within female adolescent populations.[5] This constant self-evaluation has the potential to erode self-esteem and life satisfaction, and contribute to feelings of inadequacy in children and young people.
This is backed by the expansive UK Millennium Cohort Study, which surveyed almost 11,000 14-year-olds and established a correlation between increased social media use and dissatisfaction with body weight and depressive symptoms.[6] Specifically, the study indicated that adolescents engaging with social media for 5 or more hours a day were 31 per cent more likely to express dissatisfaction with their body weight. For children and young people undergoing significant physical and psychological development, the barrage of ‘thinspo’ content present on social media has been shown to lead to mental health challenges.
The social media minimum age framework is intended to address harmful impacts such as addictive behaviours caused by persuasive or manipulative design features, social isolation, sleep interference, poor mental and physical health (including unhealthy social comparisons and negative body image), low life-satisfaction and exposure to inappropriate and harmful content.
The social media minimum age framework will play a critical role in safeguarding young Australians from online harms, but it is not a be-all and end-all solution. The framework is designed to complement existing online safety regulatory schemes under the Act, including the cyberbullying and image-based abuse reporting schemes, industry codes and standards, and the Online Safety (Basic Online Safety Expectations) Determination 2022 (the BOSE).
These schemes will continue to play a central role in combating cyberbullying, exposure to child-inappropriate online content (including pornography and violent material), child abuse and sexual exploitation material. Instances of these harms on age-restricted social media platforms are likely to be reduced by the SMMA Act.
While children and young people are exposed to a range of harms on social media, social media platforms can and do provide beneficial experiences, particularly when they are grounded in connection, learning, health, and support. Recognising this, the SMMA Act includes a rule-making power under paragraph 63C(6)(b) to carve out services from the social media minimum age obligation.
The Rules, made under this power, are intended to focus the social media minimum age obligation on platforms known to be associated with the types of harms that are the subject of the SMMA Act, while excluding those services that pose fewer harms and that help children and young people to thrive. In doing so, the Rules will reduce children and young people’s exposure to the harms that are the subject of the SMMA Act, including excessive screen-time, social isolation, sleep interference, poor mental and physical health, and low life-satisfaction.
The exclusion of those services from the social media minimum age obligation also serves to reduce privacy and data impacts on the broader public. This is particularly the case for the Rules’ exclusions for professional and technical skills development platforms, and product review services.
No platform is absolutely free from harm. Even platforms specifically designed with children and young people’s safety in mind will have risks, whether inadvertent or as a result of exploitation by other users. In establishing these Rules, the Minister does not endorse those excluded platforms and services as ‘safe’. A critical view of the digital environment, informed by media literacy and education, will continue to play an important role for young Australians as they explore the online world.
The classes of services that have been specified in these Rules have been excluded from the social media minimum age obligation on the basis that they pose fewer harms to children and young people.
Importantly, these Rules are not ‘set and forget’. The rule-making power is intentionally flexible, allowing the Minister to be responsive to technological evolutions and changes in the digital ecosystem. This means that where the Rules are found no longer to serve the objectives of the SMMA Act and of the rule-making power, namely reducing the risk of harm to children and young people, they will be updated.
Regulatory Matters
Age-restricted social media platform
The social media minimum age obligation applies to an ‘age-restricted social media platform’, under section 63C of the Act. The definition of this term is modelled on the meaning of ‘social media service’ in section 13, with a modification to expand the ‘sole or primary purpose’ test to a ‘significant purpose’ test when examining whether a service enables online social interactions between 2 or more users.
The effect is that the scope of ‘age-restricted social media platform’ is wider than ‘social media service’. This is made clear by the note under subsection 63C(1) which provides that an age-restricted social media platform may be, but is not necessarily, a social media service under section 13 of the Act.
A service that is an age-restricted social media platform under section 63C may also be a relevant electronic service under section 13A or a designated internet service under section 14.
‘Sole’, ‘primary’ and ‘significant’ purpose
As with the primary law under which this instrument is made, the Rules rely on ‘sole’, ‘primary’ and ‘significant’ purpose tests in defining the relevant services. This serves two important purposes. Firstly, it mitigates the risk of regulated platforms expanding their services with the specific intent of escaping the scope of the regulation. For example, a video-sharing platform that adds a new messaging function should not be permitted to claim an exclusion as a ‘messaging service’ unless the messaging component becomes the sole or primary purpose. Secondly, and conversely, the purpose tests mean that excluded platforms that fundamentally change their service offerings could fall within the remit of the social media minimum age obligation, where those changes also change the platform’s sole or primary purpose. The purpose tests are therefore built to be responsive to changes and evolutions in the social media ecosystem.
In determining the sole, primary or significant purpose of a platform, regard should be had to, amongst other things: the features and functions of the platform; how those features and functions are deployed, and how they influence user engagement, behaviour and experiences; and the actual, rather than simply stated, use of the platform. The platform’s espoused objectives may also be relevant, but should be given less weight than the platform’s features, functions and user experiences, and cannot be considered in isolation from other factors. This is because the way a particular service classifies or markets itself may or may not reflect community understanding and usage, and may not be consistent across various contexts or forums.
The eSafety Commissioner is primarily responsible for oversight of the implementation, monitoring and enforcement of the social media minimum age framework. The Information Commissioner also has a role in overseeing compliance with privacy aspects of the law. A range of powers are available to allow for effective monitoring and enforcement by the regulators. This includes information gathering powers, which would allow the eSafety Commissioner to seek information from platforms when considering whether the platform is an age-restricted social media platform. This is in addition to other sources of information available to the Commissioner through other regulatory mechanisms, such as the reporting schemes, industry codes and standards, and the BOSE, as well as research and other insights.
Review
The social media landscape, and digital technology more broadly, is fast moving and constantly evolving. In a rapidly changing environment, regulatory settings can quickly become outdated. Ongoing monitoring is therefore required to ensure the Rules remain fit-for-purpose and responsive to the risks to children and young people online.
Separately, section 239B of the Act requires a review of the social media minimum age framework within 2 years of effective commencement. The Rules will be considered as part of this review process.
Impact Analysis
The Office of Impact Analysis (OIA) has been consulted in relation to the Rules and an Impact Analysis is not required, as the Rules do not create any additional impact beyond what has already been assessed in the Impact Analysis for the SMMA Act. OIA reference number: OIA24-08210.
Commencement
The Rules commence on the day after they are registered on the Federal Register of Legislation.
Consultation
Targeted Consultation
The Department of Infrastructure, Transport, Regional Development, Communications, Sport and the Arts (the Department) undertook targeted stakeholder consultation between 14 February and 21 March 2025. The Department received feedback from 104 stakeholders from industry, young Australians, parents and carers, mental health organisations, civil society, legal experts and child-development experts. Feedback received from 34 individual meetings, 6 roundtables and 38 written submissions informed advice to the Minister for Communications. The Department accepted every request to meet one-on-one with any stakeholder regarding the draft Rules.
Feedback on the draft Rules was mixed, but key themes in comments included:
Concern that the naming of YouTube in the draft Rules would create significant competition issues;
Calls for a ‘safety-by-design’ approach of setting a threshold for exclusion from the social media minimum age obligation based on the standard of safety features and functions of a service;
Calls to include a rule that captures services that support professional and technical skill development, and product review and technical support services.
The Department considered all relevant issues raised during targeted stakeholder consultation and provided in written submissions before finalising advice to the Minister for Communications.
Advice of the eSafety Commissioner
The Act requires the Minister to seek, and have regard to, the advice of the eSafety Commissioner before making the Rules.
The Hon Anika Wells MP, Minister for Communications, wrote to Ms Julie Inman Grant, eSafety Commissioner, on 12 June 2025, formally seeking advice on the draft Rules.
The eSafety Commissioner provided written advice on 19 June 2025. This advice was consistent with feedback provided by participants of the targeted consultation.
The eSafety Commissioner recommended 5 options in order of priority:
Option 1: That YouTube be removed from the draft Rules, and that no other specific services be named, to future-proof the Rules;
Option 2: That the explanatory statement to the Rules provide guidance to support a shared understanding of the Government’s intention and avoid future enforcement challenges;
Option 3: That consideration is given to amending the draft Rules so they reflect both the purpose of the service, as well as its risk of harm;
Option 4: That consideration is given to introducing a new Rule to exclude lower-risk services that are appropriate for young children; and
Option 5: That implementation is monitored to identify any emerging challenges which should be addressed through further Rules.
The Rules and this explanatory statement give effect to the Government’s agreement to options 1 and 2. The Government agrees with option 5, and considers that the statutory review of the framework will not only fulfil this function, but also provide an appropriate opportunity to address options 3 and 4.
YouTube
Based on feedback received during consultation and advice of the eSafety Commissioner, YouTube is not included in the Rules and is therefore subject to the social media minimum age obligation in section 63D of the Act.
It is important to note that the social media minimum age obligation is framed as preventing age-restricted users from ‘having an account’. This places an obligation on platforms to stop Australian children and young people under 16 from creating and holding an account in their own right. It does not stop them from accessing content on the platform, if the content can be accessed in a ‘logged out’ state (i.e. without logging into an account or profile). Further, the obligation does not preclude a parent, carer, or educator from allowing a child to use an account held by that parent, carer or educator. Australians under the age of 16 will therefore retain the ability to access YouTube content for education and entertainment purposes.
YouTube Kids is unlikely to fall within scope of the definition of ‘age-restricted social media platform’ in its current form as it operates more like a video streaming service, without the same interactive features as YouTube. YouTube Kids is therefore unlikely to satisfy the ‘online social interaction’ criteria of the definition.
Statement of compatibility with human rights
The Rules are compatible with the human rights and freedoms recognised or declared in the international instruments listed in section 3 of the Human Rights (Parliamentary Scrutiny) Act 2011. A full statement of compatibility is set out at Attachment B.
Attachment A
Details of the Online Safety (Age-Restricted Social Media Platforms) Rules 2025
Section 1 – Name
This section provides that the name of the instrument is the Online Safety (Age-Restricted Social Media Platforms) Rules 2025.
Section 2 – Commencement
This section provides for the instrument to commence on the day after it is registered on the Federal Register of Legislation.
Section 3 – Authority
This section provides that the instrument is made under the Online Safety Act 2021.
Section 4 – Definitions
This section provides that any reference in the instrument to the Act is a reference to the Online Safety Act 2021.
Section 5 – Classes of services that are not age-restricted social media platforms
This section provides that specified classes of services are not age-restricted social media platforms for the purposes of paragraph 63C(6)(b) of the Act. The effect of this section is that the electronic services in each of the specified classes are not subject to the social media minimum age obligation in section 63D of the Act.
Paragraph (a): messaging, email, voice calling or video calling
Paragraph 5(1)(a) provides that services with the sole or primary purpose of enabling communication through messaging, email, voice calling or video calling are not age-restricted social media platforms.
The sole or primary purpose of enabling communication through messaging, email, voice calling or video calling should be determined by considering the experience of end-users of the service, rather than how its purpose might be characterised by the service provider.
It is possible for a service to have additional purposes alongside the sole or primary purpose. However, only the sole or primary purpose would be used to assess whether a service is not subject to the social media minimum age obligation in section 63D of the Act.
For example, if a service has additional purposes such as location sharing, public chat rooms or public forum channels, then messaging, email, voice calling or video calling is unlikely to be the sole or primary purpose of the service.
While there are risks of harm to children and young people from messaging, email, voice calling and video calling, the intent of section 63D of the Act is to mitigate harms such as addictive behaviours caused by manipulative design features, social isolation, sleep interference, poor mental and physical health, and low life-satisfaction. These harms are not as prevalent on messaging, email, voice calling and video calling services as they are on age-restricted social media platforms.
A service will not be a service with the sole or primary purpose of enabling communication through messaging, email, voice calling or video calling to the extent to which it is:
The exclusions of SMS and MMS services are included to avoid doubt. The social media minimum age obligation under section 63D of the Act is not intended to apply to SMS and MMS services.
Paragraph (b): online games
Paragraph 5(1)(b) provides that services with the sole or primary purpose of enabling users to play online games with other users are not age-restricted social media platforms.
The sole or primary purpose of enabling users to play online games with other users should be determined by considering the experience of end-users of the service, rather than how its purpose might be characterised by the service provider.
It is possible for a service to have additional purposes alongside the sole or primary purpose. However, only the sole or primary purpose would be used to assess whether a service is not subject to the social media minimum age obligation in section 63D of the Act.
For example, if a service primarily enables end-users to post material on the service but also enables users to play online games with other users, the service would be considered an age-restricted social media platform for the purposes of section 63D of the Act, because hosting the game is not its sole or primary purpose.
However, if a service primarily enables users to play an online game with other users but also enables online social interaction between 2 or more end-users, the service would not be considered an age-restricted social media platform for the purposes of section 63D of the Act, because hosting the game is its primary purpose.
Additionally, if a service contains ancillary third-party features or functions within an online game, such as using social media applications as third-party logins for cloud saves or matchmaking, but the primary purpose is still to enable users to play online games with other users, then the service would not be considered an age-restricted social media platform for the purposes of section 63D of the Act.
While there are risks of harm to children and young people from online games, the intent of section 63D of the Act is to mitigate harms caused by persuasive design choices that may drive excessive engagement on social media platforms, and may undermine a child’s autonomy or control of their digital experience.
Online games are currently regulated under the National Classification Scheme and industry codes and standards, which provide information on the age suitability of online games through a combination of the classification regulatory regime and relevant consumer advice. Applying additional restrictions to online games would result in an overly burdensome regulatory approach. There is also less risk of harm associated with age-restricted users having accounts with online games, because children and their parents or carers can more readily decide to avoid potentially harmful content when using the services, and certain potentially harmful content is restricted from being provided to children under the age for which the content has been assessed as appropriate.
Gaming consoles and platforms also provide ‘parental controls’ which would allow parents or carers to set limits on screen time, restrict in-game purchases, block inappropriate or harmful content and monitor online interactions. This further reduces the risk of harm to age-restricted users from having accounts with online gaming services when compared to other kinds of social media platforms.
Paragraph (c): services that enable users to share information about products or services
Paragraph 5(1)(c) provides that services with the sole or primary purpose of enabling users to share information about products or services are not age-restricted social media platforms. This information could include, but is not limited to, reviews, technical support and advice.
The sole or primary purpose of enabling users to share information about products or services should be determined by considering the experience of end-users of the service, rather than how its purpose might be characterised by the service provider.
It is possible for a service to have additional purposes alongside the sole or primary purpose. However, only the sole or primary purpose would be used to assess whether a service is not subject to the social media minimum age obligation in section 63D of the Act.
Features of these services typically include discussion forums that enable users to post technical support, advice and reviews about a specific product or service. For example, a service may primarily feature forums where representatives from hardware vendors provide technical support on how to use a product from that vendor.
However, if a service features discussion forums that primarily enable users to discuss news, entertainment and other types of content in addition to sharing information about products or services, the service would be considered an age-restricted social media platform for the purposes of section 63D of the Act.
This class of services poses limited risks to children and young people, and it was not the intent of section 63D of the Act to include these services.
Paragraph (d): services that enable engagement in professional networking or professional development
Paragraph 5(1)(d) provides that services with the sole or primary purpose of enabling end-users to engage in professional networking or professional development are not age-restricted social media platforms.
The sole or primary purpose of a service for professional networking or professional development should be determined by considering the experience of end-users of the service, rather than how its purpose might be characterised by the service provider.
Features of these services typically include facilitating connections between professionals and/or mentors that offer professional insights, including a focus on collaboration, sharing knowledge, career development and/or growth. For example, a service may enable end-users to create a profile that outlines their professional background and career goals, allowing them to connect with potential employers or professional connections.
As these services are primarily used to build professional networks, and the posting of material generally does not take place anonymously, it is less likely that inappropriate or harmful content will be published. As such, this class of services poses limited risks to children and young people, and it is not the intent of section 63D of the Act to include these services.
Paragraph (e): services that have the sole or primary purpose of supporting the education of end-users
Paragraph 5(1)(e) provides that services with the sole or primary purpose of supporting the education of users are not age-restricted social media platforms.
A service will not be an age-restricted social media platform if the sole or primary purpose of the service is to allow children and young people to access tools that support learning and education.
The sole or primary purpose of supporting the education of users should be determined by considering the experience of end-users of the service, rather than how its purpose might be characterised by the service provider.
Features of these services typically enable educators to distribute course materials, manage and track assignments and facilitate communication through announcements and discussion forums. Children and young people may also be able to use these services to access resources, submit work, collaborate with peers, and receive feedback on their work.
While these services are often integrated with other tools such as video conferencing, messaging and the ability to post material on the service, if their sole or primary purpose is to support the education of users, they are not intended to be captured by section 63D of the Act.
Paragraph 5(1)(e) is also not intended to capture services that merely contain educative content, rather than having the sole or primary purpose of supporting the education of users. For example, a video platform may host an array of content that includes tutorial-style videos covering history, science and maths. While the service contains educational content, supporting the education of users is unlikely to be its sole or primary purpose, and the service would be considered an age-restricted social media platform for the purposes of section 63D of the Act.
Paragraph (f): services that have the sole or primary purpose of supporting the health of end-users
Paragraph 5(1)(f) provides that services with the sole or primary purpose of supporting the health of users are not age-restricted social media platforms. The sole or primary purpose of these services is to support users in managing and improving health. Paragraph 5(1)(f) is intended to capture access to both physical and mental health support.
The sole or primary purpose of supporting the health of users should be determined by considering the experience of end-users of the service, rather than how its purpose might be characterised by the service provider.
These services are distinct from age-restricted social media platforms in their explicit focus on physical and mental health and are designed with dedicated features for health and wellbeing outcomes. For example, their features may enable users to track personal goals using activity trackers and in-app journals, access curated health information from evidence-based articles and connect with peers through moderated forums.
These features improve access to physical and mental health support and pose limited risk to children and young people. It is not intended that services with the sole or primary purpose of supporting the health of users be captured by the social media minimum age obligation in section 63D of the Act.
Paragraph (g): services that have a significant purpose of facilitating communication between educational institutions and students or students’ families
Paragraph 5(1)(g) provides that services with a significant purpose of facilitating communication between educational institutions and students or students’ families are not age-restricted social media platforms. Paragraph 5(1)(g) is intended to capture early childhood, primary, secondary and tertiary education.
The significant purpose of facilitating communication between educational institutions and students or students’ families should be determined by considering the experience of end-users of the service, rather than how its purpose might be characterised by the service provider.
These services are distinct from other services as the significant purpose of the service is to streamline administrative and educational interactions, enabling children and young people to get the educational support they need. Features may include messaging, announcements and integrated calendar and event scheduling capabilities to keep all parties informed of academic deadlines, and allow children and young people to easily connect with teachers and stay updated on school events and their academic progress.
A service may have additional purposes, such as providing a platform for payments. However, if the service is primarily used to facilitate communication between educational institutions and students or students’ families, it is not intended to be captured by the social media minimum age obligation in section 63D of the Act. These features improve access to educational support for children and young people and their families, and pose limited risk.
Paragraph (h): services that have a significant purpose of facilitating communication between providers and users of health care services
Paragraph 5(1)(h) provides that services with a significant purpose of facilitating communication between providers and users of health care services are not age-restricted social media platforms. Paragraph 5(1)(h) is intended to capture physical and mental health care services.
The significant purpose of facilitating communication between providers and users of health care services should be determined by considering the experience of end-users of the service, rather than how its purpose might be characterised by the service provider.
Such services typically incorporate a range of features to support interactions between patients and health care professionals. For example, these services may provide telehealth consultation tools to facilitate virtual appointments via video or audio calls. Users may also be able to access prescription management features, allowing them to request refills virtually and receive electronic prescriptions. Additional features may include secure messaging for confidential exchanges of text, images and documents regarding appointments, test results and health advice. A service may have additional purposes, such as providing a platform for payments; however, these services are distinct for their explicit focus on facilitating communication between providers and users of health care services. It is not intended that these services be captured by the social media minimum age obligation in section 63D of the Act. Features of these services improve access to physical and mental health care services for children and young people and pose limited risk.
Subsection 5(2): provision and generation of advertising material
Subsection 5(2) provides that a platform that would otherwise meet a class specified in subsection 5(1) does not fall within the exclusion on the basis of an argument that the purpose of the service is to sell advertising or to generate revenue from advertising sales. This aligns with subsection 63C(3) of the Act.
Attachment B
Prepared in accordance with Part 3 of the Human Rights (Parliamentary Scrutiny) Act 2011
Online Safety (Age-Restricted Social Media Platforms) Rules 2025
The Online Safety (Age-Restricted Social Media Platforms) Rules 2025 (the Rules) are compatible with the human rights and freedoms recognised or declared in the international instruments listed in section 3 of the Human Rights (Parliamentary Scrutiny) Act 2011.
Overview of the Rules
The Rules support the operation of the Online Safety Amendment (Social Media Minimum Age) Act 2024 (SMMA Act) by allowing children and young people under the age of 16 years to continue to have and create accounts with the specified services that predominantly provide experiences grounded in connection, education, health, professional development, and support. In particular, the Rules provide exemptions for services that:
a) have the sole or primary purpose of enabling end-users to communicate by means of messaging, email, voice calling or video calling
b) have the sole or primary purpose of enabling end‑users to play online games with other end‑users
c) have the sole or primary purpose of enabling end-users to share information (such as reviews, technical support or advice) about products or services
d) have the sole or primary purpose of enabling end-users to engage in professional networking or professional development
e) have the sole or primary purpose of supporting the education of end-users
f) have the sole or primary purpose of supporting the health of end-users
g) have a significant purpose of facilitating communication between educational institutions and students or students’ families
h) have a significant purpose of facilitating communication between providers of health care and people using those providers’ services
Human rights implications
The Rules engage the following rights:
The principle that the best interests of a child shall be a primary consideration in actions concerning children in Article 3 of the Convention on the Rights of the Child (CRC)
The right of the child to engage in play and recreational activities and to participate freely in cultural and artistic life in Article 31 of the CRC
The right to the highest attainable standard of physical and mental health, enshrined in Article 12 of the International Covenant on Economic, Social and Cultural Rights (ICESCR)
Principle that the best interests of the child shall be a primary consideration
Article 3(1) of the CRC provides that in all actions which concern children, the best interests of the child shall be a primary consideration. The principle requires legislative, administrative and judicial bodies to take active measures to protect children’s rights, promote their wellbeing and consider how children’s rights and interests are or will be affected by their decisions and actions. The Rules support the best interests of the child by not unduly restricting access to services, while safeguarding children from harms and ensuring children and young people have continued access to beneficial online activities, including connection with friends and family, access to community and support services, and participation in public life.
Right of the child to engage in play and recreational activities and to participate freely in cultural and artistic life
Article 31 of the CRC recognises the right of children to rest and leisure, to engage in play and recreational activities, and to participate freely in cultural life and the arts. States should support appropriate and equal opportunities for cultural, artistic, recreational and leisure activity. Importantly, Article 31 provides that the right of engagement in recreational activities should be appropriate to the age of the child. The Rules maintain opportunities for children and young people to connect with each other, by directing them towards specified classes of services that have lower risks of harm, including harm from the addictive nature of social media algorithms, and exposure to harmful content.
Right to the highest attainable standard of physical and mental health
Article 12(1) of the ICESCR recognises the right of everyone to the enjoyment of the highest attainable standard of physical and mental health. Article 12(2)(d) provides for the creation of conditions which would assure to all medical service and medical attention in the event of sickness. The Rules support children and young people’s access to medical services and health care by allowing continued access to any online services which have the sole or primary purpose of supporting the health of end-users or facilitating communication between providers of health care and people accessing their services. This includes access to mental health, counselling and other medical support.
The Rules also maintain children and young people’s access to less harmful digital services (e.g. messaging and gaming) that promote connection without exploiting the vulnerabilities of young users, reducing the risk of isolating children and young people from their family or friends.
Conclusion
The Rules are compatible with the human rights and freedoms recognised or declared in the international instruments listed in section 3 of the Human Rights (Parliamentary Scrutiny) Act 2011, because they promote the protection of human rights, particularly in consideration of the best interests of the child. This includes the right to engage in play and recreational activities and to participate freely in cultural and artistic life, and the right to the highest attainable standard of physical and mental health.
Any interference with human rights occasioned by the Rules is in pursuit of a legitimate objective. To the extent that the Rules may limit human rights, those limitations are reasonable, necessary and proportionate to achieve the legitimate aims of the SMMA Act.
The Honourable Anika Wells MP
Minister for Communications
[1] American Psychological Association, Health Advisory on Social Media Use in Adolescence (2023) <https://www.apa.org/topics/social-media-internet/health-advisory-adolescent-social-media-use.pdf> 7.
[2] Yvonne Kelly et al, ‘Social Media Use and Adolescent Mental Health: Findings from the UK Millennium Cohort Study’ (2019) 3(9) EClinicalMedicine 59-61.
[3] American Psychological Association, above n 1, 7.
[4] Jason M Nagata et al, ‘Social Media Use and Depressive Symptoms During Early Adolescence’ (2025) 8(5) JAMA Network Open 1.
[5] American Psychological Association, above n 1, 8.
[6] Yvonne Kelly et al, above n 2, 62.