Meta Oversight Board calls for changes to VIP moderation system


Facebook’s Meta logo sign is seen at the company’s headquarters in Menlo Park, California, on October 28, 2021. The quasi-independent oversight board of Facebook parent company Meta said Tuesday, Dec. 6, 2022, that an internal system that exempted prominent users, including former President Donald Trump, from some or all of its content rules needs a major overhaul. Credit: AP Photo/Tony Avelar, File

The quasi-independent oversight board of Meta, Facebook’s parent company, said on Tuesday that an internal system that exempts prominent users, including former US President Donald Trump, from some or all of its content moderation rules needs a major overhaul.

The Oversight Board’s report, which was more than a year in the making, says the system “has shortcomings in key areas that the company has to solve.”

Meta asked the board to review the system after The Wall Street Journal reported last year that it was being abused by many elite users, who posted material that would have resulted in penalties for ordinary users, including harassment and incitement to violence.

According to the Journal article, Facebook’s rules didn’t appear to apply to some VIP users, while others faced reviews for posts that violated the rules. The article said the system had at least 5.8 million exempted users as of 2020.

The system, known as “XCheck” or cross-check, was revealed in leaked Facebook documents by Frances Haugen, a former product manager turned whistleblower whose disclosures, alleging that the social media company prioritized profits over online safety, drew worldwide attention and galvanized regulators into cracking down on hate speech and misinformation.

Nick Clegg, Meta’s president of global affairs, tweeted that the company had requested a review of the system “so we can continue our work to improve the program.”

To fully address the board’s recommendations, “we agreed to respond within 90 days,” he added.

The company says cross-check, which applies to Facebook and Instagram, is designed to prevent “over-policing,” or mistakenly removing content believed to break the platform’s rules.

The Oversight Board’s report said the cross-check system resulted in users being treated unequally and led to delays in taking down rule-breaking content because posts could go through as many as five separate checks. Decisions took more than five days on average.

For content posted by US users, decisions took 12 days on average; for Afghanistan and Syria, 17 days. In some cases it took far longer: one piece of content waited 222 days, more than seven months, for a decision, the report said, without giving further details.

Among its 32 recommendations, the board said Meta “should prioritize expression that is important to human rights, including expression of particular importance to the public.”

Human rights defenders, advocates for marginalized communities, public officials and journalists should be given higher priority than users who are included in cross-check because of their business importance, such as big companies, political parties, musicians, celebrities and artists, the report said.

“If users included due to their commercial importance regularly post violating content, then they will no longer benefit from special protection,” the board said.

Addressing other flaws, the board also urged Meta to remove or hide potentially violating content while it is under review, and said the company should “radically enhance transparency around cross-check and how it works,” such as by setting out “clear, public criteria” for who gets on the list.

The board upheld Facebook’s decision to ban Trump last year over concerns that he incited violence leading to the riot at the US Capitol. But it said the company failed to mention the cross-check system when it asked the board to rule on the case. The company has until January 7 to decide whether to allow Trump back onto its platforms.

Clegg said in a blog post that Meta has made changes to cross-check, including standardizing it so that it “runs in a more consistent way,” opening up the system to content from all of Facebook’s 3 billion users, and holding an annual review to verify its list of elite users and entities.

After widespread criticism that it did not respond quickly and effectively to misinformation, hate speech and harmful influence campaigns, Facebook set up the oversight board to act as the ultimate arbiter of the thorny content issues it faces. Its members include a former prime minister of Denmark and a former editor-in-chief of the British newspaper The Guardian, as well as legal scholars and human rights experts.

Some critics have previously questioned the board’s independence and said its narrow content decisions appeared to distract from broader problems within Facebook and from concerns about government regulation.

© 2022 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed without permission.

Citation: Meta oversight board urges changes to VIP moderation system (2022, December 6), retrieved December 7, 2022 from https://techxplore.com/news/2022-12-meta-oversight-board-urges-vip.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.
