Facebook pushed back on reports that the company was aware of the negative impact of its products, claiming that the allegations don’t tell the whole story.
The issues of content moderation, mental health risks and misinformation are complex and defy simple policy solutions, according to a statement from Nick Clegg, Facebook’s head of global affairs, posted Saturday. He said the series of articles published by the Wall Street Journal last week is based on incomplete information about difficult subjects.
The Journal’s reporting ignited another round of outrage in Washington, especially focused on what Facebook knew about the mental health impact that its photo-sharing platform Instagram has on teen girls. Several lawmakers have pledged to investigate the company and called on Facebook to scrap plans for an Instagram product aimed at children.
“Facebook understands the significant responsibility that comes with operating a global platform,” Clegg said. “We take it seriously, and we don’t shy away from scrutiny and criticism.”
The articles detailed how Facebook’s content moderation system takes a light touch with millions of politicians and celebrities, even when they violate the platform’s user guidelines. The reporting also revealed how human traffickers, drug cartels and political leaders take advantage of the platform’s global reach and growth in developing countries.
The Journal series cites leaked documents about Facebook’s own internal research. Clegg said those studies are designed to “hold up a mirror to ourselves and ask the difficult questions about how people interact at scale with social media.” He said the Journal’s claims are based on selective quotes and don’t show the whole picture of a company trying to improve its products.
“I wish there were easy answers to these issues, and that choices we might make wouldn’t come with difficult trade-offs,” Clegg said. “That is not the world we live in.”