November 26, 2024:
I rise to speak on the Online Safety Amendment (Social Media Minimum Age) Bill 2024. I'm a mum to three teenage children, and I know that parents across Mackellar find themselves in a similar predicament to me: trying to help our children navigate their childhood and teen years in the digital era—not an easy task. I'm also a member of a generation which is so pleased to have lived through its formative years without a screen, and without social media in particular. Social media encourages us to constantly compare ourselves to others and can serve up unsolicited, harmful content, neither of which is conducive to good mental health. It's difficult for everyone. Technology changes far more quickly than policy, academic analysis, legislation and governments do. It's hard for everyone to keep up, but federal governments are elected to do hard things, and this should be no different.
Before getting to the bill itself, I would like to commend the government for tackling this challenging but incredibly important topic in its first term, as the need is urgent. I think it's important to recognise that the intention here is to protect our children and young people from harm. This, of course, is commendable. Let's be clear: there are significant harms being dished up to our children every day as they scroll through social media. There is eating disorder content, violent pornography, the normalisation of misogyny, and gambling, alcohol and junk food promotions specifically targeted to them. A 2023 eSafety Commissioner report found that 75 per cent of all 16- to 18-year-olds surveyed in Australia had seen online pornography, and almost one-third of those had encountered it before the age of 13. It found that 60 per cent of young people were exposed to pornography on social media, often unintentionally, on popular sites such as TikTok, Instagram and Snapchat. All the while, young people's online activity and data are being harvested by these large corporations, with thousands of labels being attached to them, and this data is sold to innumerable companies for the purpose of targeted advertising. I recently conducted a social media survey in the electorate of Mackellar to get a feel for what people were thinking about this social media ban for those under the age of 16. It was split down the middle for and against, but basically everyone agreed on one thing: something must be done to protect young people online. However, many felt that the ban that we are debating today was too blunt an instrument, would not be effective and could be easily circumvented. Some argued that, rather than introducing a ban, other measures should be introduced to make the online environment safer. One constituent summed up this issue by writing:
Nice idea in principle—but would more likely end up cutting off vulnerable youth from support services. Unfortunately, there are some mental health and medical resources that only exist on social media. The EU is considering legislation that would prohibit algorithms dictating your feed, instead having social media only give information from users you actively follow. This may be a better compromise.
I now seek to introduce the second reading amendment circulated in my name. This second reading amendment highlights the problems with this bill. They can be summarised quite easily. Firstly, there are no grandfathering provisions. It is entirely unclear how platforms will be required to manage the many millions of existing users who are now set to be excluded and deplatformed—those children under 16 who are already on the designated platforms, who have been on them for years and who will now, in theory, be required to step away from them.
Secondly, the legislation is far too vague in stipulating how social media platforms are to comply with their obligation to prevent under-16s from having an account, stating only that it will likely involve:
… some form of age assurance, as a means of identifying whether a prospective or existing account holder is an Australian child under the age of 16 years.
For what is supposed to be world-leading legislation, this doesn't seem adequate.
Thirdly, under-16s will still be able to watch videos on YouTube and see content on Facebook. The ban is only designed to stop them from creating an account. Without legislation in place that imposes a duty of care on social media platforms to ensure that harmful content cannot be shared, young people will still be at risk of harm whilst supposedly relaxing in their bedrooms. I do acknowledge that work has commenced to create duty-of-care legislation to compel social media platforms to prevent harm from occurring on their sites. But, of course, as with all things political and legislative, there is no guarantee that this legislation will come to fruition for years, if at all. A case in point is the gambling advertising ban bill, which, despite being so urgent and necessary, has been kicked into the long grass by the government this week as being all too hard and all too scary with the looming election.
More generally, however, I remain concerned that this bill may in fact harm children by severing their access to vital mental health resources and social supports, which are critical for the very large number of young people already experiencing mental health challenges. There are also significant privacy implications: population-wide age verification raises serious concerns about data security and privacy for minors and adults alike.
We also know that age bans trialled in 10 other countries have failed. I acknowledge that technology has moved on since many of these trials were conducted, but it is also inevitable that many children under 16 will be able to bypass the age restrictions, thereby finding themselves inhabiting unregulated, darker spaces that they would otherwise not have encountered. The ban may also have a chilling effect on young people: being on these platforms illegally may mean they don't seek the support and help they need because they're worried about the legal repercussions—that is, getting into trouble.
Fundamentally, though, the problem with simply evicting children from the platforms is that it absolves the social media companies of their responsibility to create safe, well-designed services for young users. Our laws should demand that platforms take reasonable steps to resolve issues of safety and addictive design while also offering better experiences for children. That's where the danger of this legislation's superficial appeal becomes most clear. By simply banning those under 16 from using social media, this legislation means that social media companies won't be compelled to address and solve the actual causes of harm on their platforms—both the algorithms that serve up harmful content to teenagers and the design features that foster addiction.
According to Daniel Angus, a professor of digital communication and director of the Queensland University of Technology's Digital Media Research Centre, there are several things the government should be doing to better avert the harms I have listed. The first recommendation is to impose a duty of care on digital platforms. The government has announced it is looking into this, which is welcome news.
My crossbench colleague the member for Goldstein introduced a private member's bill just this week which would establish a digital duty of care. I commend her for doing this. The member for Goldstein's contention, which many digital experts share, is that rather than age gating, we need safety by design. That means that, instead of simply shutting kids out, we need to address and prevent the harm being caused in the first place. This is especially important when the government is far from clear on how age gating will work.
This leads to the second recommendation from Daniel Angus, which is to enact regulations which will require platforms to give people more control over what content they see. Users should be able to change, reset or turn off their personal algorithm. The third recommendation is to create a children's online privacy code to better protect children's information online. This could be administered by the Children's Commissioner or the eSafety Commissioner or both.
These issues were comprehensively canvassed by a group of over 140 Australian and international experts who wrote an open letter to the Prime Minister last month. The view of that group is that an age-based ban on accessing social media is too blunt an instrument to address the risks effectively.
I understand that this bill has been referred to the Senate Environment and Communications Legislation Committee. In my short time here, I've always found that Senate inquiries, which provide the opportunity for further expert scrutiny, inevitably result in improved legislation and outcomes. I particularly hope that the committee is actively seeking input from the children this bill seeks to regulate. The experts are telling us that children want safe online environments in which to socialise and be entertained. I look forward to reading the committee's report, which is actually due out today, and I will carefully review its recommendations, as well as the input from my community of Mackellar, as ever, in making my final decision about whether to support this bill.
I move:
That all words after "House" be omitted with a view to substituting the following words:
"does not decline to give the bill a second reading and notes:
(1) the Joint Select Committee on Social Media and Australian Society tabled its final report on 18 November 2024 and carefully considered, but did not recommend, an age ban on social media;
(2) the Australian Human Rights Commission has serious reservations about the ban;
(3) in October 2024, over 140 Australian and international experts wrote an open letter to the Prime Minister and State and Territory leaders explaining their view that an age-based ban on accessing social media is too blunt an instrument to address the risks effectively;
(4) an age-based social media ban could have unintended consequences, including increased isolation and children being pushed to unregulated and dangerous platforms; and
(5) the imposition of a 'duty of care' on digital platforms is more likely to reduce online harm for all users regardless of age and should be implemented as a matter of urgency".