Zuckerberg’s Meta comes under attack from human rights campaigners
Mark Zuckerberg’s Meta has been accused by human rights campaigners of “neglecting” a key online safety initiative designed to catch fake news and misinformation.
The social network’s parent company was said to have left its Trusted Partner programme “significantly under-resourced and understaffed”.
Internews, a non-profit group that works to boost media freedoms, said Meta’s lack of resources had led to “operational failures” and high response times when taking down misinformation campaigns.
The tech giant has axed more than 20,000 staff since Mr Zuckerberg embarked on a “year of efficiency” to cut costs and boost the company’s flagging share price.
The mass job cuts prompted fears the company’s content moderation could suffer as a result, although Internews said many of the issues predated the layoffs.
The report said: “Given the issues… cost-saving cuts would further jeopardise the programme and worsen the Trusted Partners experience.”
Internews said complaints about misinformation on conflicts, such as the Tigray War in Ethiopia, sometimes went unanswered for months, although warnings about content spreading in Ukraine were typically dealt with quickly.
“Meta’s partners are deeply frustrated with how the programme has been run,” said Rafiq Copeland of Internews. “We and other Trusted Partners stand ready to work productively with Meta to address the issues raised in this review.”
The tech giant’s Trusted Partners programme includes 645 charities, news groups and fact-checking organisations.
In response to the report, Meta said some findings did not “represent a full or accurate picture”.
In its responses included in the report, Meta said it was “working to develop new methods of sharing information about the overall impact and performance” of the programme, the Financial Times reported.
The Facebook owner also recognised the need for “clear reporting guidelines and tracking mechanisms for Trusted Partner reports”, and said it was developing “standard reporting templates, tailored for different harmful content types” to further facilitate feedback from partners.
Meta said: “We welcome constructive dialogue with any of our Trusted Partners.”
Internews launched a review of the programme after growing frustrated with Meta’s response times. It said Meta took 200 days to respond to its list of questions.
Meta said the Covid pandemic had increased its response times from 2020 to 2021, and that it expected reports to be handled within one to five days.
The tech giant faced criticism over its failure to stop the spread of misinformation and hate speech in Myanmar in 2017, amid a surge of violence against the Rohingya minority group.
Meta has previously said it has contributed $100m to create a global fact-checking network, and it employs an estimated 15,000 content reviewers.