In documents, employees fretted about ‘darker, more divisive content’ and boosting far-right sites
In November 2018, the staff of Facebook’s fledgling Civic Integrity department got a look at some eye-opening internal research — presented under an image of two goats locking horns.
The report examined articles shared on Facebook from the New York Times, BuzzFeed, Fox News, and a dozen other media outlets and found that the more negatively slanted comments a story drew, the more likely Facebook’s algorithms were to promote it widely.
“Outrage gets attention,” the researchers concluded. They ruefully compared the strategy to “feeding users fast food” — an irresistibly effective way to hook an audience, but one sure to prove harmful down the road.