Facebook Inc.’s Oversight Board said it is reviewing the company’s practice of holding high-profile users to separate sets of rules, citing apparent inconsistencies in the way the social-media giant makes decisions.
The inquiry follows an investigation by The Wall Street Journal into the system, known internally as “cross-check” or “XCheck.” The Oversight Board, an outside body that Facebook created to ensure the accountability of the company’s enforcement systems, said it has reached out to the company and expects a briefing in coming days.
The XCheck program was initially intended as a quality-control measure for actions taken against high-profile accounts, including celebrities, politicians and journalists. It grew to include millions of accounts, according to documents viewed by the Journal. In addition, some users were “whitelisted,” rendering them immune from enforcement actions, the documents showed.
A 2019 internal Facebook review found that the practice of whitelisting was “not publicly defensible.”
The company had previously told the Oversight Board in writing that its system for high-profile users was only used in “a small number of decisions.”
In a blog post on Tuesday, the board said it was looking into whether Facebook has “been fully forthcoming in its responses in relation to cross-check, including the practice of whitelisting.”
The post continued: “This information came to light due to the reporting of the Wall Street Journal, and we are grateful to the efforts of journalists who have shed greater light on issues that are relevant to the Board’s mission. These disclosures have drawn renewed attention to the seemingly inconsistent way that the company makes decisions, and why greater transparency and independent oversight of Facebook matter so much for users.”
A Facebook spokesman previously said that criticism of how the company executed the system was fair, but added that it was designed “for an important reason: to create an additional step so we can accurately enforce policies on content that could require more understanding.” He said the company is phasing out the practice of whitelisting.
A Facebook spokesperson on Tuesday declined to comment further on the topic.
The details about the XCheck program and whitelisting were part of a series of articles published in the Journal last week detailing how Facebook’s platforms have negative effects on teen mental health, how its algorithm fosters discord, and how drug cartels and human traffickers use its services openly.
In response, Facebook vice president of global affairs Nick Clegg on Saturday published a blog post saying the articles “have contained deliberate mischaracterizations of what we are trying to do, and conferred egregiously false motives to Facebook’s leadership and employees.”
The post didn’t cite any factual inaccuracies.
In a separate post on Tuesday, the company highlighted that it now has 40,000 employees working on safety and security and that it has invested more than $13 billion in these areas since 2016.
“How technology companies grapple with complex issues is being heavily scrutinized, and often, without important context,” said the unsigned post. “What is getting lost in this discussion is some of the important progress we’ve made as a company and the positive impact that it is having across many key areas.”
Separately, Sen. Richard Blumenthal (D., Conn.) said Tuesday that lawmakers are seeking a high-ranking representative of Facebook to testify at a Sept. 30 hearing, in part to respond to the Journal’s reporting on the company’s internal research about the effects of Instagram on teen girls.
“The simple fact of the matter is that Facebook has known for years that Instagram is directly involved in an increase in eating disorders, mental-health issues, and suicidal thoughts, especially for teenage girls,” Mr. Blumenthal said at a Senate Judiciary Committee hearing on privacy and antitrust issues, adding that he felt Facebook had misled Congress in previous statements about the impact of its platforms on mental health.
Steve Satterfield, a Facebook privacy and public-policy vice president, disagreed with Mr. Blumenthal’s characterization of the company’s statements to Congress and said it would “follow up promptly” about the request for testimony. Facebook understands “the frustration and concern that we are hearing about these reports,” he said. “The safety and well-being of teens on our platform is a top priority for the company,” he added. “This was important research.”
The criticism of Facebook at Tuesday’s hearing was bipartisan. Sen. Mike Lee (R., Utah) said he felt the Journal’s reporting showed Facebook lacked competition.
“This too looks like the behavior of a monopolist, a monopolist that is so sure that its customers have nowhere else to go that it expresses a reckless disregard for quality assurance, for its own brand image, and even just being honest with its users about the obvious safety risks,” Mr. Lee said.
Mr. Satterfield said Facebook faces intense competition.
The Oversight Board, in its blog post, said it planned to release details on what it heard from Facebook in October as part of its quarterly transparency report.
“The choices made by companies like Facebook have real-world consequences for the freedom of expression and human rights of billions of people across the world,” the group said. “By having clear rules and enforcing them consistently, platforms can give users the confidence that they’ll be treated fairly. Ultimately, that benefits everyone.”
The Oversight Board has previously requested information on the company’s XCheck program, asking Facebook to explain how the system works, to share the criteria for adding pages and accounts to the system, and to disclose its error rates.
In its response, Facebook provided an explanation but didn’t elaborate on criteria for adding pages and accounts to the system, and declined to provide reporting on error rates.
Write to Ryan Tracy at ryan.tracy@wsj.com
"board" - Google News
September 22, 2021 at 04:31AM
https://ift.tt/3lIzH1j
Facebook Oversight Board Launches Review of Company’s XCheck System - The Wall Street Journal
"board" - Google News
https://ift.tt/2KWL1EQ
https://ift.tt/2YrjQdq
Bagikan Berita Ini
0 Response to "Facebook Oversight Board Launches Review of Company’s XCheck System - The Wall Street Journal"
Post a Comment