#Facebook News Bias-Gate#

Author: Jerome (杰罗姆) | Published 2016-05-17 08:09 · Read 60 times

    Fears of Facebook Bias Seem to Be Overblown

    Focus should be on service’s news feed, rather than its trending topics

    Of the 1,500 or so posts pumped out by the average Facebook user’s friends every day, that user only looks at about 300. PHOTO: JAAP ARRIENS/ZUMA PRESS

    By CHRISTOPHER MIMS

    Updated May 16, 2016 12:17 a.m. ET

    Two competing narratives emerged last week after an article in tech blog Gizmodo accused Facebook of suppressing conservative news in its “trending topics” feature. Both are distractions from what I believe is the real issue.

    The first narrative is based on the allegations in the Gizmodo article: that Facebook’s news curators, who select the trending-topics items, are consciously or unconsciously biased against conservative news services and topics.

    In response, Sen. John Thune asked Facebook for more details about how it picks trending topics in the blink-and-you’ll-miss-it box in the upper right-hand corner of its home page. Then, Facebook released its guidelines for picking these topics, a 28-page document seemingly designed to eliminate bias, but revealing just how much human editors shape the process.

    While Facebook has denied allegations that its "trending topics" feature is biased, the social network acknowledges it uses human curators to complement algorithms in delivering news to users. Here are some of the key factors that impact what posts you see. Photo: Getty Images

    All in, it’s an interesting story about the unexpected presence of humans in a process that Facebook had suggested was algorithmically driven, with the possibility that bias seeps in—but it hardly seems to warrant the attention it generated.

    Others argued that concern over trending topics was misplaced, because the feature occupies little real estate on Facebook’s Web service, and doesn’t initially appear on mobile devices. Thus was born the second narrative: To find whether real bias exists on Facebook, examine the news feed—that river of baby pictures, jokes, updates from your friends and occasional links to news stories where people spend the vast majority of their time on Facebook.

    Here, those people said, is where the algorithm that determines what appears in your personalized feed, and in what order, does its dastardly work. You see, of the 1,500 or so posts pumped out by the average Facebook user’s network of friends every day, that user only looks at about 300. The trick is which ones.
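
    Facebook has never published that ranking formula, but the mechanics the column describes (score every candidate post for a given user, keep only the top slice) can be sketched briefly. The Python below is a hypothetical illustration: the signals (friend affinity, engagement, recency), the weights, and the Post/score/build_feed names are all assumptions made for the example, not Facebook's actual algorithm.

        # Hypothetical sketch only: Facebook's real news-feed ranking is not public.
        import random
        from dataclasses import dataclass

        @dataclass
        class Post:
            author_affinity: float  # how often the user interacts with this friend, 0..1
            engagement: float       # normalized likes/comments/shares, 0..1
            hours_old: float        # age of the post in hours

        def score(post: Post) -> float:
            # Assumed hand-picked weights; a production system would learn these from data.
            recency = 1.0 / (1.0 + post.hours_old)
            return 0.5 * post.author_affinity + 0.3 * post.engagement + 0.2 * recency

        def build_feed(candidates: list, limit: int = 300) -> list:
            # Rank every candidate post and surface only the top `limit` of them.
            return sorted(candidates, key=score, reverse=True)[:limit]

        # Example: 1,500 synthetic candidate posts cut down to the ~300 a user actually sees.
        posts = [Post(random.random(), random.random(), random.uniform(0, 48)) for _ in range(1500)]
        feed = build_feed(posts)
        assert len(feed) == 300

    Whatever the real signals are, the selection step is the point: choosing 300 posts out of 1,500 is where any bias, deliberate or emergent, would show up.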

    Many people are concerned that Facebook has created a so-called filter bubble, in which it shows users only what they want to see, to entice them to spend more time on the network. Such concerns are heightened because little is known about Facebook’s algorithm.

    Facebook CEO Mark Zuckerberg with the social network’s news feed in 2013. An algorithm determines what is seen. PHOTO: JEFF CHIU/ASSOCIATED PRESS

    That people like to see things that conform to their pre-existing notions is well known—it’s a part of what psychologists call confirmation bias.

    Claiming that Facebook is contributing to our age of hyper-partisanship by only showing us things that fit our own personal slant is, ironically, an example of confirmation bias, because the evidence around it is mixed.

    After an exhaustive search of the literature around filter bubbles, Frederik J. Zuiderveen Borgesius, a researcher at the Personalised Communication project at the University of Amsterdam, and five co-authors concluded that the concerns might be overblown. “In spite of the serious concerns voiced, at present there is no empirical evidence that warrants any strong worries about filter bubbles,” Mr. Zuiderveen Borgesius wrote in an email.

    The authors examined not only Facebook but other online services, including Google search. Mr. Zuiderveen Borgesius’s conclusion: We don’t have enough data to say whether Facebook is biasing the news its readers see, or—and this is even more important—whether it affects their views and behavior.

    Facebook’s opacity aside, where does the hand-wringing come from? Two places, I think: the first is that everyone in the media is terrified of Facebook’s power to determine whether individual stories and even entire news organizations succeed or fail. The second is an ancient fear that, by associating only with people like ourselves, and being selective in what we read, we are biasing ourselves unduly.

    Before the filter bubble, there was the so-called echo chamber.

    A search of Google’s Ngram—a service that tracks the frequency with which words or phrases appear in books—reveals that “echo chamber” first gained popularity in the late 1930s. I asked the Journal’s resident etymologist, columnist Ben Zimmer, about the earliest use of the term “echo chamber” in its modern sense. He found this, from Blackwood’s Edinburgh Magazine, published in 1840 in Scotland:

    “Since the year 1813, the interest in things German, both in this country and in France, has been steadily on the increase; foreign criticism has become now something better than an echo-chamber for the bandying about of mutual misunderstandings.”

    Facebook didn’t exist in 1840. “Mass media” meant pamphlets and thin, poorly circulated newspapers. But even then, the combination of humanity’s natural tendency to associate with like minds, and to seek voices that echo our own, was a source of consternation.
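
    The frequency check described above can be reproduced with the public Google Books Ngram Viewer, which is driven entirely by URL query parameters. The short Python sketch below only builds such a URL for the phrase "echo chamber"; the chosen year range and smoothing value are assumptions, not part of the original search.

        from urllib.parse import urlencode

        # Construct a Google Books Ngram Viewer query for the phrase "echo chamber".
        params = {
            "content": "echo chamber",
            "year_start": 1800,   # assumed range; adjust as needed
            "year_end": 2008,
            "smoothing": 3,       # assumed smoothing window
        }
        print("https://books.google.com/ngrams/graph?" + urlencode(params))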

    Here’s the question we should be asking about “bias” in Facebook’s news feed: Is it substantially worse than in the heyday of newspapers and magazines, when readers in major cities could choose to get their news from among a dozen or more publications tailored to their biases, or in the age of cable news, in which enormous profit has been reaped from continuing this tradition?

    Is it possible that Facebook’s algorithm produces a news feed that might even be a less-biased news source than what came before? Facebook, after all, is simply performing the same function gossip and social stratification have accomplished since the dawn of civilization, by allowing us to filter what we hear to precisely the degree we please.

    —Follow Christopher Mims on Twitter @Mims or write to christopher.mims@wsj.com.
