The Wall Street Journal just published a series of scathing articles on Facebook

New York (CNN Business) This week, the Wall Street Journal released a series of scathing articles about Facebook, citing leaked internal documents that detail in remarkably frank terms how the company is not only well aware of its platforms' negative effects on users, but also how it has repeatedly failed to address them.

There's a lot to unpack from the Journal's investigation. But one thing that stands out is just how systematically Facebook's problems are documented, in the kind of simple, observational prose not often found in internal communications at multinational corporations.

Here are some of the more jaw-dropping moments in the Journal's series.

In its report on Instagram's impact on teens, the Journal cites a slide deck from Facebook's own research stating that the app harms mental health.

The app makes body image problems worse for one in three teen girls, according to the WSJ.

Another slide: "Teens blame Instagram for increases in the rate of anxiety and depression. This reaction was unprompted and consistent across all groups." Those slides are particularly significant because Facebook has often referenced its own research studies, rather than external ones, to argue that there is little correlation between social media use and depression.

In response, Instagram's head of public policy, Karina Newton, said that Facebook's internal research demonstrates the company's commitment to understanding the complex and difficult issues young people may struggle with, and that it informs all the work the company does to help those experiencing these issues.

The Journal also reported on Facebook's practice of whitelisting certain high-profile users, exempting them from some of the company's normal enforcement. An internal review of the practice was blunt: "We are not actually doing what we say we do publicly," it said, according to the paper. "Unlike the rest of our community, these people [those on the whitelist] can violate our standards without any consequences."

Via email, Facebook spokesman Andy Stone told the Journal that criticism of the practice was fair, but that it was designed for an important reason: "to create an additional step so that we can accurately enforce policies on content that could require more understanding."

And on the changes Facebook made to its News Feed algorithm, a team of data scientists put it bluntly: "Misinformation, toxicity and violent content are inordinately prevalent among reshares," they said, according to the Journal's report.

"Our approach has had unhealthy side effects on important slices of public content, such as politics and news," the scientists wrote. "This is an increasing liability," one of them wrote in a later memo cited by the WSJ.

The following year, the problem persisted. One Facebook data scientist, according to the WSJ, wrote in an internal memo in 2019: "While the FB platform gives people the opportunity to connect, share and engage, an unfortunate side effect is that harmful content can go viral, often before we can catch it and mitigate its effects."

Lars Backstrom, vice president of engineering at Facebook, told the Journal in an interview that, "like any optimization, there's going to be some ways that it gets exploited or taken advantage of. That's why we have an integrity team that is trying to track those down and figure out how to mitigate them as efficiently as possible."