On Tuesday, Meta Platforms' Oversight Board recommended that the company overhaul its system exempting high-profile users from its rules, saying the practice privileges the powerful and allows business interests to influence decisions about content.
The arrangement, known as cross-check, adds an extra layer of review to the Facebook and Instagram accounts of millions of celebrities, politicians, and other influential users, giving them extra leeway to post content that violates company policies.
Cross-check “prioritizes users of commercial value to Meta and as structured does not meet Meta’s human rights responsibilities and company values,” Oversight Board director Thomas Hughes said in a statement announcing the decision.
The board has been reviewing the cross-check program since last year, when whistleblower Frances Haugen exposed the extent of the system by leaking internal company documents to the Wall Street Journal.
Those documents revealed that the program was both larger and more forgiving of influential users than Meta had previously told the board, which is funded by the company through a trust and operates independently.
With few controls on eligibility or governance, cross-check swelled to include nearly anyone with a significant online following, though even with millions of members it covers only a fraction of Meta’s 3.7 billion users.
In 2019, the system prevented the company’s moderators from deleting a nude photo of a woman posted by Brazilian soccer star Neymar, even though the post violated Meta’s rules against “nonconsensual intimate imagery,” the WSJ reported.
At the time of the report, the board rebuked Meta for not being “fully forthcoming” in its disclosures about cross-check.
In its opinion issued on Tuesday, the board said it agreed that Meta needed mechanisms to address enforcement errors, given the enormous volume of user content the company moderates every day.
However, it added, Meta “has a responsibility to address these larger issues in a way that benefits all users and not just a select few.”
It made 32 recommendations that it said would structure the program more equitably, including transparency requirements, audits of the system’s impact, and a more systematic approach to eligibility.
It said public figures should remain eligible for inclusion in the program, but based solely on publicly available criteria, with no other special treatment.
The Oversight Board’s policy recommendations are not binding, but Meta is required to respond to them, usually within 60 days.
A spokesperson for the Oversight Board said the company had requested and received an extension in this case, giving it 90 days to respond.
© Thomson Reuters 2022