The Australian eSafety Commissioner (eSafety) has commenced civil penalty proceedings against social media giant X, formerly known as Twitter, for failing to comply with government requirements regarding child sexual exploitation materials. In February, eSafety issued a transparency notice to several social media companies, requiring them to provide information about how they were addressing child sexual exploitation and abuse materials and activities on their platforms. eSafety alleges that X failed to comply with the notice by not preparing its report in the required manner and form, and by failing to respond truthfully and accurately to some of the notice’s questions. X also inadequately disclosed the number of safety and public policy staff remaining after tech billionaire Elon Musk acquired the platform in October 2022 and implemented several rounds of job cuts.
While other social media platforms, including Google, also fell short in responding to their transparency notices, eSafety judged X’s non-compliance to be the most serious. In September, eSafety issued X a $610,500 infringement notice and gave the company 28 days to pay the penalty or request the notice’s withdrawal. X neither paid the fine nor requested a withdrawal, opting instead to seek judicial review of eSafety’s transparency and infringement notices. eSafety is now pursuing civil penalty proceedings against X in parallel with that judicial review.
Other Social Media Platforms Not Doing Enough
While X was the only tech company fined, eSafety found that other social media platforms were also falling short in tackling child sexual exploitation materials in Australia. “Our first report featuring Apple, Meta, Microsoft, Skype, Snap, WhatsApp, and Omegle uncovered serious shortfalls in how these companies were tackling this issue,” eSafety Commissioner Julie Inman Grant said in an October statement. “This latest report also reveals similar gaps in how these five tech companies are dealing with the problem and how they are tackling the rise in sexual extortion, and we need them all to do better. What we are talking about here are serious crimes playing out on these platforms committed by predatory adults against innocent children, and the community expects every tech company to be taking meaningful action.” Among the platforms, Discord took no measures to detect child sexual exploitation in live streams, citing “prohibitively expensive” costs.
Google used such detection technology only on YouTube, not on Chat, Gmail, Meet, or Messages. Furthermore, neither Google nor Discord blocked links to known child sexual exploitation materials, nor did they use technology to detect grooming in some or all of their services.