Facebook Calls '60 Minutes' Whistleblower Piece 'Misleading'
Says segment impugns the company's motivations in lengthy rebuttal
Lena Pietsch, director of policy communications for Facebook, fired back Sunday night (Oct. 3) against a 60 Minutes segment that revealed the identity of a company whistleblower, former product manager Frances Haugen.
“On Sunday, CBS 60 Minutes ran a segment that used select company materials to tell a misleading story about the research we do to improve our products,” Pietsch said. “The segment also disregards the significant investments we make to keep people safe on our platform and seeks to impugn the motivations of our company.”
In the piece, Haugen, who leaked reams of internal research, alleged that the company’s algorithms “amplify polarizing and hateful content” for the sake of profit, a motive partly responsible for “tearing societies apart.”
Pietsch, in a statement e-mailed to Broadcasting+Cable and Multichannel News, rebutted the piece point by point, asserting, among other things, that “it is not accurate that leaked internal research demonstrates Instagram is ‘toxic’ for teen girls,” one of the hot-button allegations that has gained considerable currency since The Wall Street Journal first broke the story of the internal research.
She went on to lay out Facebook's rebuttal:
"To the claim presented on 60 Minutes that internal research shows the company is not doing enough to eradicate harmful content on the platform:
“We've invested heavily in people and technology to keep our platform safe, and have made fighting misinformation and providing authoritative information a priority. If any research had identified an exact solution to these complex challenges, the tech industry, governments, and society would have solved them a long time ago. We have a strong track record of using our research — as well as external research and close collaboration with experts and organizations — to inform changes to our apps.”
"To the claim presented on 60 Minutes that the company’s desire for profit outweighs our efforts to keep the platform safe:
“The growth of people or advertisers using Facebook means nothing if our services aren't being used in ways that bring people closer together — that’s why we are investing so much in security that it impacts our bottom line. Protecting our community is more important than maximizing our profits. To say we turn a blind eye to feedback ignores these investments, including the 40,000 people working on safety and security at Facebook and our investment of $13 billion since 2016.”
"To the claim presented on 60 Minutes that we mislead the public and our regulators:
"We stand by our public statements and are ready to answer any questions regulators may have about our work."
"To the claim presented on 60 Minutes that the industry needs regulation:
“We agree it’s time for updated internet regulations and have been calling for it ourselves for two and a half years. Every day, we make difficult decisions on where to draw lines between free expression and harmful speech, privacy, security, and other issues, and we use both our own research and research from outside experts to improve our products and policies. But we should not be making these decisions on our own which is why for years we’ve been advocating for updated regulations where democratic governments set industry standards to which we can all adhere.”
"To the claim presented on 60 Minutes that Instagram’s internal research shows harmful impacts on teens:
“We do internal research to ask hard questions and find out how we can best improve the experience for teens and we will continue doing this work to improve Instagram and all of our apps. It is not accurate that leaked internal research demonstrates Instagram is ‘toxic’ for teen girls. The research actually demonstrated that many teens we heard from feel that using Instagram helps them when they are struggling with the kinds of hard moments and issues teenagers have always faced. This research, like external research on these issues, found teens report having both positive and negative experiences with social media.”
"To the claim presented on 60 Minutes that the Meaningful Social Interactions ranking change amplified polarizing content on the platform:
"The goal of the Meaningful Social Interactions ranking change is in the name: improve people's experience by prioritizing posts that inspire interactions, particularly conversations, between family and friends — which research shows is better for people’s well-being — and deprioritizing public content. Research also shows that polarization has been growing in the United States for decades, long before platforms like Facebook existed, and that it is decreasing in other countries where Internet and Facebook use has increased. We have our role to play and will continue to make changes consistent with the goal of making people’s experience more meaningful, but blaming Facebook ignores the deeper causes of these issues — and the research."
Contributing editor John Eggerton has been an editor and/or writer on media regulation, legislation and policy for over four decades, including covering the FCC, FTC, Congress, the major media trade associations, and the federal courts. In addition to Multichannel News and Broadcasting + Cable, his work has appeared in Radio World, TV Technology, TV Fax, This Week in Consumer Electronics, Variety and the Encyclopedia Britannica.