Facebook VIP ‘cross-check’ program slammed by Oversight Board : NPR

Meta, Facebook’s parent company, seems more concerned with avoiding “provoking” VIPs than with balancing complex questions of free speech and safety, its watchdog said.

Josh Edelson/AFP via Getty Images


According to Meta’s Oversight Board, a Facebook and Instagram program that gives celebrities, politicians and other high-profile users special treatment serves the business interests of parent company Meta rather than its stated purpose of protecting users’ right to free expression.

“Meta’s cross-check program privileges powerful and influential users who have commercial value to Meta, and because its structure fails to meet Meta’s human rights responsibilities and company values, it has profound implications for users and civil society around the world,” Thomas Hughes, director of the Oversight Board, said in the report.

The board said Meta seemed more concerned with avoiding “provoking” VIPs and evading accusations of censorship than with balancing complex questions of free expression and safety.

It called for an overhaul of the “flawed” program in a report published Tuesday that included wide-ranging recommendations to bring the program in line with international principles and Meta’s stated values.

Meta said in a statement that it would review and respond to the board’s recommendations – which are non-binding – within 90 days.

The report came out more than a year after Meta asked the board — a panel of law, human rights and journalism experts from around the world, convened and funded by the company through an independent trust — to review the program, called “cross-check,” after it was highlighted in whistleblower disclosures published by The Wall Street Journal.

Under the program, Meta maintains a list of users — including politicians, celebrities and business partners such as advertisers, health organizations and news publishers — whose posts are eligible for additional review if they are suspected of violating the company’s rules against violence, hate speech, misinformation, nudity and other content. In some cases, their posts are exempted from Meta’s rules entirely.

Materials provided to The Wall Street Journal revealed that former President Donald Trump and his son Donald Trump Jr., Democratic Sen. Elizabeth Warren of Massachusetts, conservative activist Candace Owens, Brazilian soccer star Neymar and even Meta CEO Mark Zuckerberg were among those on the VIP list.

The board found “cross-check” purports to protect the vulnerable but in fact benefits the powerful

Meta says the program aims to solve a conundrum.

Facebook and Instagram users make billions of posts every day, which means the company relies on a combination of human reviewers and automated systems to find and remove content that violates its rules. At that scale, it is inevitable that some posts will be wrongly flagged as violating policy — what the company calls “false positives.”

Cross-check aims to minimize those mistakes in cases where the risk and potential harm of an error are greatest.

Cross-check systems were “built to prevent potential over-enforcement mistakes and to double-check cases where a decision could require more understanding or there could be a higher risk of error,” Nick Clegg, Meta’s vice president of global affairs, wrote last year when the company asked the board to review the program. As examples, he cited activists raising awareness of violence, journalists reporting from conflict zones, and posts from “high-visibility” pages or profiles seen by many people.

But the board found in its report that “while Meta describes cross-check as a program to protect vulnerable and important voices, it appears to be structured and calibrated more directly to satisfy business concerns.”

For example, the board said that based on its review, the main reason Meta gives posts from users on its cross-check list extra scrutiny is to avoid provoking people who might complain to senior executives or “create public controversy” for Meta.

“The correlation of the highest tiers of cross-check with concerns about managing business relationships suggests that the consequences Meta wanted to avoid were primarily business-related, not human rights-related,” the report said.

Additionally, the company appears to prioritize avoiding “the perception of censorship” over its other human rights responsibilities, the board said.

Essentially, the board concluded, the cross-check program treats users unequally, despite Meta’s claim that everyone is subject to the same rules.

“Cross-check grants certain users greater protection than others,” the report said.

Furthermore, the board said, the program overrepresents users and content from the United States and Canada — the markets where Meta earns the most revenue per user — even though the vast majority of Facebook’s nearly 3 billion monthly users live elsewhere.

“Through cross-check’s design, users in lucrative markets with a high risk of public relations consequences for Meta enjoy greater protection of their content and expression than users elsewhere,” the board said.

That inequality is exacerbated by a lack of transparency about who is in the program. The board said Meta would not share which profiles, pages and accounts — which the company calls “entities” — are subject to cross-check, citing legal obligations to protect users’ privacy.

“The board cannot adequately assess the extent to which the company is meeting its human rights responsibilities under the program, or which entities warrant enhanced review, without knowing how the program is implemented and exactly who benefits from it,” the report said.

The board noted that over the past year, Meta has expanded cross-check to examine content at high risk of wrongful removal, even when the people posting it are not on the cross-check list.

But it said the company’s limited review capacity means many of those posts never receive the same level of additional review as those from the program’s famous users.

56 million views before soccer star Neymar’s rule-breaking post was taken down

The report also criticized Meta’s policy of leaving potentially rule-breaking posts from high-profile users visible while they are under review. On average, cross-check reviews take more than five days. In the longest case Meta shared with the board, a review took seven months.

“This means that, because of cross-check, content identified as violating Meta’s rules is left up on Facebook and Instagram when it is most viral and could cause harm,” the board wrote.

One notorious example the board cited was a video Neymar posted in 2019 that included images of a woman who had accused him of rape. The video was viewed 56 million times on Facebook and Instagram before it was removed, according to The Wall Street Journal’s reporting.

The board blamed insufficient resources for reviewing posts flagged under the cross-check program, as well as Meta’s failure to act urgently on potential “high severity” violations.

“In the Neymar case, it is difficult to understand how non-consensual intimate imagery posted on an account with more than 100 million followers would not have risen to the front of the queue for rapid, high-level review if any prioritization system had been in place,” the report said.

“Given the seriousness of the policy violation and the impact on victims, this case shows that Meta needs to take a different approach to content pending review and to shorten its review times,” it said.

Furthermore, the board found that Meta had failed to apply its usual rules to Neymar following the incident.

“The company ultimately disclosed that the only consequence was removal of the content, while the usual penalty would have been disabling the account,” the report said. It noted that in December 2021, Meta announced an exclusive streaming deal with the soccer star.

The board calls for more transparency

Ultimately, the board said, Meta does not track metrics that would show whether the additional layers of review the cross-check program provides lead to more accurate decisions about whether a post should stay up than the company’s usual enforcement process.

“Meta does not consider whether its final decisions were correct,” the report said.

The board made 32 recommendations for how Meta should improve the program to address the flaws it found and align the program with its goal of protecting human rights.

Those include publicizing the criteria for inclusion in the program, allowing anyone to apply, and publicly labeling the accounts of certain participants, including government officials, political candidates and business partners.

The company should also remove accounts from cross-check if they repeatedly break the rules, and devote more resources to reviewing content, the board said.

“Meta has a responsibility to address its content moderation challenges in a way that benefits all users, not just a select few,” the report said.

Meta says that over the past year it has improved its cross-check system, including by developing “standardized” governance principles and criteria, limiting the number of employees who can add users to the program, and creating a process for reviewing and removing users.
