Germany, Sept 28 (US): Days before Germany’s federal elections, Facebook took what it described as an unprecedented step: removing a series of accounts that had worked together to spread misinformation about COVID-19 and encourage violent responses to COVID restrictions.
The crackdown, announced on September 16, was the first use of Facebook’s new “coordinated social harm” policy, aimed at stopping not state-sponsored disinformation campaigns but ordinary users who have mounted increasingly sophisticated efforts to bypass rules on hate speech or disinformation.
In the case of the German network, nearly 150 accounts, pages and groups were linked to the so-called Querdenken movement, a loose coalition that has protested against lockdown measures in Germany and includes vaccine and mask opponents, conspiracy theorists and some far-right extremists.
Facebook described the move as an innovative response to potentially harmful content; far-right commentators condemned it as censorship. But a review of the removed content — as well as the many Querdenken posts still available — reveals that Facebook’s action was modest at best. At worst, critics say, it could have been a ploy to counter complaints that it’s not doing enough to stop harmful content.
So concluded researchers at Reset, a United Kingdom-based non-profit that has criticized the role of social media in democratic discourse.
Facebook regularly updates journalists about the accounts it removes under policies that ban “coordinated inauthentic behavior,” a term it coined in 2018 to describe groups or people working together to mislead others. It has since removed thousands of accounts, most of which it said were bad actors trying to interfere in elections and politics in countries around the world.
But that policy had limitations, because not all malicious behavior on Facebook is “inauthentic”; plenty of real groups use social media to incite violence and spread misinformation and hate. So the company was constrained by its own rules in what it could remove.
But even with the new rule, there’s still a problem with removals: they don’t reveal what harmful material remains on Facebook, making it hard to judge what the social network is actually accomplishing.
Case in point: the Querdenken network. Reset was already monitoring the accounts that Facebook removed and issued a report concluding that only a small portion of Querdenken-related content had been deleted, while many similar posts were allowed to remain.
The dangers of COVID-19 extremism were underlined days after Facebook’s announcement, when a young gas station worker in Germany was shot dead by a man who had refused to wear a mask. The suspect followed several far-right users on Twitter and had expressed negative opinions about immigrants and the government.
Facebook initially declined to provide examples of the Querdenken content it had removed, but eventually released four posts to the Associated Press that were not markedly different from content still available on Facebook. They included a post falsely claiming that vaccines create new viral variants and another wishing death on police officers who had broken up violent protests against COVID restrictions.
Reset’s analysis of the comments removed by Facebook found that many were actually written by people trying to refute Querdenken’s arguments and contained no misinformation.
Facebook defended the action, saying the account removals were never intended to be a blanket ban on Querdenken, but instead a carefully considered response to users who were working together to break its rules and post malicious content.
Facebook plans to fine-tune and expand its use of the new policy in the future, according to David Agranovich, Facebook’s director of Global Threat Disruption.
“This is a start,” he told The Associated Press on Monday. “We are expanding our network disruption model to address new and emerging threats.”
The approach seeks to strike a balance between allowing diverse opinions and preventing the spread of harmful content, Agranovich said.
The new policy could represent a significant change in the platform’s ability to counter harmful speech, according to Cliff Lampe, an information professor at the University of Michigan who studies social media.
“In the past they have tried to stamp out individual cockroaches, but there are always more,” he said. “You could spend all day stomping and you wouldn’t get anywhere. Going after networks is a smart try.”
Simon Hegelich, a professor of political science at the Technical University of Munich, said that while the removal of the Querdenken network might be justified, it should raise questions about Facebook’s role in democratic debate.
Hegelich said Facebook appears to be using Germany as a “test case” for the new policy.
“Facebook is really interfering in German politics,” Hegelich said. “The COVID situation is one of the biggest issues in the election. They may be right that there is a lot of misinformation on these sites, but it is nonetheless a very political issue, and Facebook is interfering in it.”
Members of the Querdenken movement reacted angrily to Facebook’s decision, but many also said they weren’t surprised.
“The big deletion continues,” a supporter wrote in a still-active Querdenken Facebook group. “See you on the street.”