Facebook parent company Meta on Thursday announced that it had taken down three state-linked “domestic influence operations” in the past quarter. Each operation used networks of fake accounts to influence domestic politics and target opposition parties.
Operations linked to ruling parties and governments in Cuba, Serbia and Bolivia were found by Meta investigators to be engaging in “coordinated inauthentic behavior” (CIB). Each group operated across a number of social media platforms, including Meta’s own Facebook, Instagram and WhatsApp.
In Cuba, fake accounts with AI-generated profile pictures ran pages and groups on Facebook and posted content across a slew of platforms, including memes of opposition figures and critics referring to them as “worms,” a derogatory term used by the government for dissidents.
A similar CIB network in Serbia, linked to employees of the Serbian Progressive Party known as the party’s Internet Team, likewise used fake account networks and coordinated posts to “create a perception of widespread and authentic grassroots support for Serbian President Aleksandar Vučić and the Serbian Progressive Party.”
In Bolivia, coordinated adversarial activities linked to the current government and the Movimiento al Socialismo (MAS) party formed a “blended operation” of CIB tactics and mass reporting, wherein account networks abused platforms’ moderation tools to flag, remove and suppress opposition figures. Ben Nimmo, Meta’s global threat intelligence lead, disclosed to AFP that one Facebook page reached two million followers before it was shut down.
Meta’s increasing efforts to moderate and limit content that promotes harm and violates human rights come amid the intense criticism the company has faced in recent years over its perceived inability, or unwillingness, to police similar networks in the past.
Meta has faced lawsuits alleging that the company was culpable for the dissemination of hateful and dangerous misinformation, most notably over its role in the genocide of Rohingya Muslims in Myanmar. Damning whistleblower reports also asserted that Meta (then Facebook) had for years been warned internally that its failure to police abusive content would result in violence.