According to new research, the algorithms Facebook uses to filter news posts do affect the information users see, but not nearly as much as the choices users make themselves.
The researchers examined the news stories users shared with friends, classifying each as liberal or conservative, and then determined which of the stories posted by users' friends actually reached those users via the site's ranking algorithms.
The study concluded that individuals' own choices about what to read did more to limit exposure to ideologically cross-cutting content than algorithmic ranking did.
The study, titled 'Exposure to ideologically diverse news and opinion on Facebook', appears in Science Express.