
Facebook study disputes the ‘echo chamber’ effect

(Source: IDG)

8 May 2015

On Facebook, your US friends will see a fair number of news articles supporting the different candidates in next year's race to the White House, but what turns up in their News Feeds is unlikely to depend solely on their personal politics.

The site’s News Feed ranking, which can control the posts users see based on their personal information and activity on the site, has produced what some have called a ‘filter bubble’ or ‘echo chamber’: a homogenous stream composed primarily of like-minded posts, leaving little room for other points of view. Questions around the diversity of content in News Feed are not going away, given that many people now get their news on Facebook.

But it may not be as bad as it seems. The results of a new study by data scientists at Facebook, published Thursday in the journal Science, suggest that while much of the content people see on Facebook is aligned with their own ideology, a fair amount represents opposing viewpoints.

The results, Facebook says, quantify for the first time the extent to which people are exposed to ideologically diverse news and information in social media.

Nearly 30% of all the news content that people see in the News Feed cuts across ideological lines, Facebook said: it was shared by users who identified themselves as conservative on Facebook but seen by users who identified themselves as liberal, or vice versa.

Counting just the content shared by people’s friends reveals about the same percentage that cuts across ideological lines, the study said.

The algorithm that ranks posts in the News Feed is designed to surface content aligned with what users are interested in, based on their activity. The results of this study help to show how much Facebook users actually engage with content that reflects different points of view.

Nearly 25% of the news articles that people click on cut across ideological lines, Facebook said.

Method and limits
For the study, Facebook researchers developed a system that identified more than 226,000 news articles shared at least 100 times during the second half of 2014. The company wanted to see how often liberal and conservative audiences were exposed to stories about politics, world affairs and the economy that it had classified as either conservative or liberal.
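
To make the 'cross-cutting' measure concrete, here is a minimal sketch of how such a share could be computed, assuming each News Feed impression is reduced to a pair of self-reported ideologies for the sharer and the viewer. The sample records, labels and function below are hypothetical illustrations, not the study's actual pipeline.

from collections import Counter

# Hypothetical exposure log: one (sharer ideology, viewer ideology)
# pair per News Feed impression of a hard-news article. The records
# and labels are invented for illustration.
exposures = [
    ("conservative", "liberal"),
    ("liberal", "liberal"),
    ("conservative", "conservative"),
    ("liberal", "conservative"),
    ("liberal", "liberal"),
]

def cross_cutting_share(records):
    # An impression counts as 'cross-cutting' when the sharer's and
    # the viewer's self-reported ideologies differ.
    counts = Counter(sharer != viewer for sharer, viewer in records)
    return counts[True] / len(records)

print(f"Cross-cutting share: {cross_cutting_share(exposures):.0%}")
# -> Cross-cutting share: 40% (for this toy sample)

On the study's real data, the equivalent statistic came out near 30% for News Feed exposures and near 25% for clicks, as the figures above report.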

The study has its limitations. It only looked at Facebook users who identified themselves as conservative or liberal in some way, which is less than 1% of the company’s total user base. The study also did not look at whether the articles that were shared changed people’s political views or habits.

The effect of algorithms like Facebook’s on what people see online has generated controversy lately.

The Federal Trade Commission is looking at the issue of ‘algorithmic transparency,’ to assess the deeper incentives behind algorithms on sites like Google and Facebook, and how they affect people.

Zach Miners, IDG News Service
