Facebook’s Fear Machine: How The Social Network Enables Racist News

By giving preference to the posts that draw the most reactions from users, Facebook can amplify fear and hate, a reality that contradicts comments Mark Zuckerberg made last year. A HuffPost Germany analysis of the “10000 Flies” ranking, which tracks the articles that trigger the most interactions on social media each day, shows that fear-based and often misleading stories are appearing among the most successful articles more often and are drawing more interactions from users. In January 2017, 36 percent of the articles in the top 20 of the ranking were fear-based; the figure rose to 38 percent in July 2017 and jumped to an astounding 50 percent in January 2018, even though January 2018 was a relatively weak news month in Germany. Many of those top stories concerned crimes committed by refugees, a favorite topic of right-wing sites.

For her Munich-based startup Crowdalyzer, Wild analyzes the misinformation and hate-based news that Facebook users read, comment on and share. Wanting to understand how the followers of different parties tick, she spent a week living in their world. To do this, she sequestered herself in Facebook groups where fake news is known to spread.

One of those pages is “Freie Medien” (in English, Free Media), where Facebook fans of Germany’s liberal-conservative political party, the Christian Democratic Union (CDU), are known to share conspiracy theories. There you’ll find posts claiming shampoo ingredients are “Trojan horses with dangerous invaders,” or that the United Nations allows food products to slowly be poisoned.

“I was living in a parallel world that filled me with hate and made me feel like there’s one simple solution to all of the big problems in this world,” she said.

And, when the experiment was over, Wild realized, “That’s a world that takes a lot of effort to get back out of.”

How we got here: Facebook’s algorithm changes

The customized newsfeed that all users now see when they open Facebook works in a radically different way than it used to. Users are seeing more personal updates from friends and family members, as well as the news those people find important enough to comment on and share.

It is one of the most fundamental changes to the world’s largest social networking site since it was founded in 2004, and one that Zuckerberg said he hopes will better people’s lives.

“The research shows that when we use social media to connect with people we care about, it can be good for our well-being,” he said. “We can feel more connected and less lonely, and that correlates with long-term measures of happiness and health. On the other hand, passively reading articles or watching videos ― even if they’re entertaining or informative ― may not be as good.”

Meanwhile, Facebook is selling the change to brands ― many of which have seen a dramatic decrease in engagement since it began to roll out ― as a major quality initiative.

“We want to define what quality news looks like and give it a boost,” Facebook manager Campbell Brown promised at a mid-February publisher conference.

But if Facebook prioritizes posts from friends and relatives ― and if those people regularly share misinformed or racist views ― the new algorithm would only bolster a dangerous echo chamber.

And by giving preference to the posts that get the most reactions from users, Facebook could amplify fear and hate, a reality that contradicts comments that Zuckerberg made last year.

“It’s important that Facebook is a place where people with different views can share their ideas,” he said. “Debate is part of a healthy society. But when someone tries to silence others or attacks them based on who they are or what they believe, that hurts us all and is unacceptable. There is no place for hate in our community.”

HuffPost’s research shows how Facebook’s plan to boost quality backfired

A HuffPost Germany analysis shows that, despite Zuckerberg’s comments, content that triggers emotions like hate, anger and fear is gaining in significance in Germany. The finding is the result of extensive research by HuffPost using data from the website “10000 Flies”.

The company measures interactions triggered by articles on social media and lists the most successful articles every day. “Interactions” refers to likes, reactions, shares and comments on Facebook, as well as tweets, retweets and likes on Twitter, though Twitter accounts for a comparatively small share of the total.

The analysis focused on the 20 most successful posts of each day. To trace how these posts developed over the course of a year, three sample months were compared: January 2017, July 2017 and January 2018.

A total of 1,860 posts were included in the analysis, 620 for each month. The central question: What proportion of the 20 most successful articles contains content that plays on and amplifies emotions such as hate, anger, indignation and fear?

Admittedly, which posts trigger negative emotions varies from user to user. Nevertheless, judging by the comments below the articles, stories covering refugees and crime in particular trigger strong negative emotions among readers.

Emotional posts are gaining in significance

Articles that stir emotions like hate and fear…
