Meta’s (META.O) Oversight Board announced Friday that it will examine two cases concerning how the social media giant handled potentially misleading posts shared ahead of the Australian Voice referendum last year. Meta, formerly known as Facebook, Inc., owns and operates Facebook and Instagram.
In October 2023, two Facebook users separately posted screenshots of information the Australian Electoral Commission (AEC) had shared on X (formerly known as Twitter), according to the Oversight Board. The screenshots concerned individuals voting more than once and included the following language: “If someone votes at two different polling places within their electorate, and places their formal vote in the ballot box at each polling place, their vote is counted.” The information shared also concerned the secrecy of the ballot. However, the posts in both cases contained only part of a longer series of interconnected AEC posts, which included the statement that multiple voting is an offense of electoral fraud.
In the first case, the Facebook user accompanied the screenshot with a caption stating, “So it is official. Go out, vote early, vote often and vote NO.” In the second case, the user shared similar AEC information with a text overlay stating, “[t]hey are setting us up for a ‘Rigging’… smash the voting centres people it’s a NO, NO, NO, NO, NO.”
Meta said the posts were proactively identified, sent for human review, and subsequently removed for violating Meta’s Coordinating Harm and Promoting Crime policy. The policy prohibits “statements that advocate, provide instructions or show explicit intent to illegally participate in a voting or census process.” It also prohibits “facilitating, organizing, or admitting to certain criminal or harmful activities” and disallows threats of violence against a place if they could “lead to death or serious injury of any person that could be present at the targeted place.”
In examining these cases, the Board stated that it is seeking public comments addressing:
The socio-historical context of the 2023 Indigenous Voice to Parliament Referendum in Australia;
Any relevant context or history of voter fraud in Australia;
The spread of voter fraud-related content, and of false or misleading information about voting, elections and constitutional referenda, across social media platforms;
Content moderation policies and enforcement practices, including fact-checking, on misleading, contextualised and/or voter fraud-related content.
The cases were selected “to examine Meta’s content moderation policies and enforcement practices on false or misleading voting information and voter fraud, given the historic number of elections in 2024,” said the Oversight Board.
In October last year, Australians rejected a proposal to recognize the country’s First Nations peoples in the Australian Constitution by establishing an Aboriginal and Torres Strait Islander Voice. Since the referendum’s defeat, concerns have been raised that the vote was affected by a wave of misinformation and disinformation in the lead-up to polling day.
Human rights advocates have appealed for stronger regulation of social media platforms to address the spread of disinformation and misinformation. The Human Rights Law Centre (HRLC) responded to the outcome of the referendum by “calling for strong laws to prevent an exponential spread of disinformation and misinformation from taking over our democracy.”
The Oversight Board’s decisions to uphold or reverse Meta’s content rulings will be binding, and the Board may also issue policy recommendations, to which Meta must respond within 60 days. The Board will deliberate on the cases over the coming weeks and will publish its final decisions on its website.