The Facebook Papers comprise tens of thousands of internal documents from social media giant Facebook, Inc. (FB) that were shared with The Wall Street Journal by Frances Haugen, a former product manager at the company. These documents became the basis of the Facebook Files, a series of WSJ articles exploring how the company’s platforms can harm users and society at large. Haugen has also aired her concerns about Facebook on the CBS News investigative program 60 Minutes and filed whistleblower complaints with the U.S. Securities and Exchange Commission (SEC).
Haugen’s appearance on 60 Minutes was followed, in short order, by her testimony before the U.S. Senate subcommittee on Consumer Protection, Product Safety and Data Security on Oct. 5, 2021. She also shared key selections from these documents with Congress.
Key Takeaways
- The Facebook Papers are internal documents made public by whistleblower Frances Haugen.
- They generally cast the company as profit-driven and socially irresponsible.
- CEO Mark Zuckerberg’s public statements often differ from what these documents reveal.
Key Revelations
According to the documents themselves, as well as researchers, people familiar with Facebook, and current and former employees, the social media platform is beset by a number of critical issues. These include hate speech, incitements to violence, and false news, all of which are more widespread on the platform than the company publicly acknowledges.
The company dedicates limited staffing and resources to identifying and removing such potentially harmful content. Moreover, its content-monitoring efforts tend to be concentrated in English-speaking Western nations, largely ignoring developing countries where such posts may have greater potential to cause harm. According to the documents, just 16% of Facebook’s efforts against negative content are directed outside the U.S., partly due to the complexity of dealing with a vast array of languages and dialects around the world. Facebook’s core products may actually be assisting the spread of harmful content.
Political considerations may be limiting the company’s efforts against misinformation. For example, Facebook CEO Mark Zuckerberg personally agreed to censor dissidents in Vietnam when faced with a threat from that country’s communist government to block Facebook.
In an attempt to deflect or diminish antitrust and legislative scrutiny, Facebook deliberately issues public assertions that downplay its market dominance. Zuckerberg often makes public statements at variance with the company’s own internal findings. For example, he told Congress in 2020 that Facebook removes 94% of hate speech before a human reports it, but researchers estimate that the true figure is under 5%. Another example is that, while Zuckerberg claims that his company does not try to induce users to spend more time on its platforms, internal documents indicate the opposite.
Facebook’s XCheck, or cross check, program exempts certain high-profile persons from the sorts of sanctions that might be invoked against less influential users posting similar questionable content. While Zuckerberg insists that Facebook is a platform, not an “arbiter of truth,” the real reason for these exemptions may be that Facebook does not want to make enemies of persons who could retaliate in a meaningful way.
While senior citizens have been Facebook’s fastest-growing demographic group in the U.S. over the past decade, young adults find the company’s platforms overcrowded with boring, misleading, negative, or irrelevant content, and they are also concerned about privacy.