A recent Facebook study claims that personal choice matters more than
algorithms in determining what users see in the News Feed. But critics
say there is no solid evidence to support this claim.
Facebook has long faced criticism over the "Filter Bubble": the idea that
its algorithmically filtered News Feed can shape users' perception of the
world by prioritizing certain content. A recent study, however, claims
that the algorithm is not what determines what users see. If a Filter
Bubble exists at all, the company argues, it exists only because of
users' own choices about what to view, not because of Facebook's
algorithmic filters.
Filter Bubble Facts: What does the study prove?
⇒ The study claims to establish that individual choices matter more than the algorithm, but social scientists in the field argue that it offers no evidence for this claim.
“Individual users choosing news they agree with and Facebook’s algorithm providing what those individuals already agree with is not either-or but additive. That people seek that which they agree with is a pretty well-established social-psychological trend… what’s important is the finding that [the newsfeed] algorithm exacerbates and furthers this filter bubble.”
⇒ The biggest issue is that the Facebook Filter Bubble study pretends that individuals choosing to limit their exposure to different topics is a completely separate thing from the Facebook algorithm doing so.
⇒ The Filter Bubble study makes it seem like the two are disconnected and can be compared to each other on some kind of equal basis.
“Comparing the individual choice to algorithmic suppression is like asking about the amount of trans fatty acids in french fries, a newly-added ingredient to the menu, and being told that hamburgers, which have long been on the menu, also have trans-fatty acids.”
⇒ In other words, Facebook’s algorithmic filter magnifies the already human tendency to avoid news or opinions that we don’t agree with.
⇒ In addition to the framing of the research, which tries to claim that being exposed to differing opinions isn't necessarily a positive thing for society, the conclusion that user choice is the big problem just doesn't ring true.
“The tobacco industry might once have funded a study that says that smoking is less dangerous than coal mining, but here we have a study about coal miners smoking…. there is no scenario in which user choices vs. the algorithm can be traded off, because they happen together. Users select from what the algorithm already filtered for them. It is a sequence.”
Filter Bubble: Study Conclusion
Facebook’s attempt to argue that its
algorithm is somehow unbiased or neutral and that the big problem is
what users decide to click on and share is disingenuous. The whole
reason why some are so concerned about algorithmic filtering is that
users’ behavior is ultimately determined by that filtering. The two
processes are symbiotic, so arguing that one is worse than the other
makes no sense.
In other words, not only does the study
fail to prove what it claims to prove, but the argument the
site is making in defense of its algorithm also isn't supported by the
facts and, in fact, can't be proven by the study as it currently
exists. And as Eli Pariser points out in his piece on Medium about the
research, the study also can’t be reproduced because the only people
who are allowed access to the necessary data are researchers who work
for Facebook.
If you have any queries regarding the content of this article, leave a
comment in the section below. Your feedback and comments are welcome. For
daily updates on technology, do visit us again. You can also catch us on Facebook, Twitter, Google+ and Pinterest for timely updates.
For more information, check out http://techliveinfo.com