If your Facebook News Feed lacks variety and diverse viewpoints, most of the blame lies with you, not with the Facebook algorithm, says a new study by data scientists at Facebook.
The researchers analysed the accounts of 10 million users over six months and concluded that the so-called "echo chamber" is not as impermeable as it is often thought to be.
The study found that liberals and conservatives are regularly exposed to at least some content that does not conform to their political views. Almost 29 percent of the stories displayed by Facebook's News Feed present views that conflict with the individual's own ideology.
"You would think that if there was an echo chamber, you would not be exposed to any conflicting information, but that's not the case here," Eytan Bakshy, a data scientist at Facebook who led the study, was quoted as saying by NYT.
The researchers found that individuals' choices about which stories to click on had a larger effect than Facebook's filtering mechanism in determining whether people encountered news that conflicted with their professed ideology.
Facebook's algorithm serves users stories based in part on the content they have clicked on in the past.
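Neither the study nor Facebook has published the ranking code itself, so the following is only a minimal Python sketch of the general click-history idea described above: a toy ranker that favours sources the user has clicked before. Every name in it (Story, rank_feed, click_history) is hypothetical and not Facebook's actual API.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class Story:
    source: str
    headline: str

def rank_feed(stories, click_history):
    """Rank stories so that sources the user clicked most often come first.

    click_history is a list of source names the user has clicked in the past.
    """
    clicks = Counter(click_history)  # source -> number of past clicks (0 if never clicked)
    return sorted(stories, key=lambda s: clicks[s.source], reverse=True)

feed = [
    Story("liberal_daily", "Op-ed in favour of policy X"),
    Story("conservative_times", "Op-ed against policy X"),
]
history = ["conservative_times", "conservative_times", "liberal_daily"]
print(rank_feed(feed, history))  # the conservative_times story is ranked first
```

In this toy model, a user who mostly clicks stories from one side of the spectrum will see that side ranked first, which is the self-reinforcing effect the researchers set out to measure.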
The researchers found that people's networks of friends and the stories they see are skewed toward their ideological preferences.
But that effect is more limited than the worst case that some theorists had predicted, in which people would see almost no information from the other side.
On average, about 23 percent of users' friends are of an opposing political affiliation, according to the study.
However, some observers argued that the Facebook study was flawed because of sampling problems and interpretation issues.
The study appeared in the journal Science.