X Pays Hefty Price for Lack of Transparency in Australia

Samanta Blumberg

Oct-16-2023


Australia's eSafety Commissioner, the country's online safety regulator, recently fined X, the platform formerly known as Twitter, $386,000. The penalty stems from X's failure to adequately explain its efforts to detect and report child sexual abuse material (CSAM), a serious lapse for a platform that, under Elon Musk's direction, continues to present user safety as a priority.

Following Musk's acquisition, X adopted a streamlined content moderation strategy billed as enhancing user safety. Yet how the platform actually counters CSAM remains opaque, and that opacity raised the regulator's suspicion. Musk has repeatedly described CSAM detection as a top priority and pointed to progress on that front, but when the Commissioner asked for specifics, X offered little of substance.

The investigation surfaced troubling figures. In the three months after the change in ownership, X's proactive detection of CSAM fell 15 percentage points, from roughly 90% to 75%. X countered that its detection rate improved in 2023, yet it sidestepped questions about how quickly it responds to CSAM reports and what measures it has in place to detect child exploitation in live streams. Its answers about the detection tools it uses, how well they work, and the number of public policy staff remaining after the acquisition were likewise judged inadequate.

These gaps, and the dismissive tone of X's responses, fed the narrative that the platform is cutting corners on its moderation obligations, and they ultimately led to the fine imposed by the Australian regulator.

The immediate implications for X and its reputation remain uncertain. Because the penalty reflects a lack of transparency rather than a proven detection failure, it may not directly damage X's standing with users. It does, however, cast a shadow over the platform's revised moderation agenda under Musk, which could deter potential advertisers and hurt the financial prospects of his ambitious project. Nor is X alone: Google faces a similar, though lighter, reproach from the same regulator over its detection tools, a reminder that the road ahead on online safety will be a tough one for every platform.