Australia’s New Social Media Age Limits: Twitch Joins the Restricted Platforms
In a significant move to strengthen online safety for young users, Australia is set to enforce new social media age restrictions affecting those under 16 years old. The country’s eSafety regulator has recently added Twitch to the list of platforms subject to these rules. Starting December 10, Australians younger than 16 will be prohibited from creating new accounts on Twitch, and existing accounts belonging to underage users are scheduled for deactivation by January 9.
Defining Australia’s Social Media Minimum Age Framework
The Australian government’s Social Media Minimum Age (SMMA) policy targets digital services that enable direct social interaction and real-time engagement between users. Due to its interactive live-streaming features, Twitch falls squarely within this category. In contrast, platforms like Pinterest have been exempted because they primarily function as visual discovery tools rather than spaces for active social networking.
How Australian Rules Differ from Global Platform Policies
Internationally, Twitch permits individuals aged 13 and above to join its community, though minors often require parental consent depending on local laws. Australia’s updated regulations raise the minimum age threshold specifically within its borders, imposing stricter limitations than the platform’s global standards.
List of Platforms Affected by Australia’s Under-16 Social Media Ban
- Snapchat
- TikTok
- X (formerly Twitter)
- YouTube (excluding YouTube Kids and Google Classroom)
- Kick (Australian streaming service)
- Twitch (recently added)
This legislation mandates that these platforms prevent account creation or access by anyone under sixteen starting December 10. The initiative stems from laws enacted roughly a year ago aimed at shielding children from harmful online content through enforced age verification processes.
A Global Viewpoint: Comparing Online Safety Regulations for Minors
Nations worldwide are intensifying efforts to regulate children’s access to digital environments. In the United States, as of mid-2025, twenty-four states have passed legislation requiring some form of online age verification before granting minors access or downloads. Utah notably led this trend by mandating that app stores verify user ages before allowing downloads targeted at younger audiences.
The United Kingdom introduced its Online Safety Act in July 2025, which compels platforms hosting sensitive content, such as material related to self-harm or eating disorders, to implement stringent age checks protecting users under eighteen or face significant penalties.
The Importance of Self-Assessment Tools in Regulatory Compliance
The eSafety Commissioner provides a self-assessment tool designed for digital services worldwide so they can determine whether they fall within SMMA requirements. This resource helps companies understand their obligations regarding user age restrictions well before enforcement deadlines arrive.
The Growing Need for Stricter Controls Amid Changing Digital Habits Among Youths
Youth participation on interactive platforms such as live-streaming services has surged dramatically in recent years, raising concerns about risks like cyberbullying and exposure to inappropriate content. Recent research indicates that nearly 70% of teenagers worldwide engage with some form of live video streaming weekly, a figure that underscores why regulators prioritize tighter controls around highly interactive environments compared with less socially intensive services like Pinterest, where user interaction is minimal.