Power over online content is currently concentrated in the hands of a few major tech companies, according to the Lowy Institute. To address this, the Institute suggests establishing platform councils, made up of both everyday citizens and tech experts, to moderate online content, including decisions on de-platforming.
In the report's executive summary, the Institute raised concerns about the significant influence a handful of multinational corporations hold over most digital platforms in democracies. The report's author, researcher Lydia Khalil, argued that when power is concentrated in the hands of a select few, there is minimal accountability to the population, and instead suggested involving ordinary citizens in regulatory decisions.
The report, published on Feb. 21, was funded by the New South Wales government as part of its Digital Threats to Democracy Project. It outlines how ordinary users and tech experts could serve on platform councils, allowing for a more legitimate consensus on the uses and governance of digital platforms while sharing the responsibility and risk of content moderation and user access among technology companies, governments, and the public.
The report further emphasizes the need for ordinary citizens to contribute to regulatory decisions and suggests that a similar approach could inform government regulation of AI and other emerging technologies. The Institute argues that technocratic solutions alone are insufficient, and that digital deliberative democracy has proven both legitimate and popular.
Conversation around online regulation is already underway in Australia, where the federal government is moving to crack down on social media companies. The eSafety Commissioner fined Elon Musk's X in September 2023 over the company's alleged failure to address child sexual abuse material online. However, the company applied for judicial review of the fine and has yet to pay it.
The Australian government is also considering legislation to combat misinformation and disinformation, with a proposed bill attracting significant public scrutiny. The government has delayed introducing the legislation until 2024 while it refines the bill to address concerns, including potential protections for religious expression. The opposition has declared itself fundamentally opposed to the bill, which includes provisions for imposing fines over alleged misinformation and disinformation. These ongoing developments reflect the complexity and sensitivity of regulating online content in democratic societies.