
Inability to Clamp Down Hatred a

Author: 思观堂 | Published 2020-06-30 06:59

Facebook, one of the world's largest social media companies, has been in hot water for nearly four years. After the 2016 US presidential election, which was plagued by fake news, misinformation, conspiracy theories and hate speech, the company, led by Mark Zuckerberg, tried to patch the platform's problems by making candid public appeals, testifying before Congress, and even paying a 5 billion dollar settlement to the FTC over the Cambridge Analytica scandal.

[Image: Richard Drew / AP]

After Twitter, another social media giant and a long-time hub for political advertising and misleading information, decided in 2019 to pull political advertisements from all parties, Facebook came under intense scrutiny for not following the same path. Twitter also began attaching warnings to tweets that either violate policy but "remain accessible in the public's interest" or contain unfounded conspiracy claims (most commonly about the coronavirus and voting), a feature Facebook has not matched either. Like most other social media companies, Facebook generates revenue by running advertisements, and a not insignificant amount of that, an estimated 400 million dollars or about 0.5% of revenue, comes from politicians' ads.

Mark Zuckerberg, and Facebook in general, have a coy relationship with political agitators like Trump. In November 2019, NBC News revealed that Zuckerberg had dined at the White House in October with Trump and Peter Thiel, a Facebook board member and one of Trump's major tech donors in 2016. A few days before the secret dinner took place, the Trump reelection campaign's Twitter account had openly discouraged the company from banning political advertisements as Twitter had done.

[Image: Twitter]

Before this report, the dinner was not known to the public, and its disclosure caused considerable worry among activists and journalists over what had been discussed. Zuckerberg was defiant when the news broke, saying in an interview days later that "people should be able to see for themselves what politicians are saying." He dodged when the interviewer asked about the dinner, saying only that they discussed "a number of things that were on his mind," and rejected the notion that Trump had lobbied him.

[Image: David Ramos / Getty]

Zuckerberg's denial that the meeting influenced him sits uneasily with a Washington Post report about an idea he floated for how political advertisements should be regulated. Citing anonymous sources, the Post reported that Zuckerberg wanted to limit the number of ads a single candidate could run at one time, without fact-checking the veracity of each ad, and to prohibit a candidate from posting advertisements in the 72 hours before an election. When the ideas were floated to lawmakers in both the Democratic and Republican parties, they were met with rejection and disapproval, largely because they still do not address the prevalence of misinformation in individual advertisements.

Peter Thiel, the other attendee at the secret dinner, is a driving force behind Facebook's "not the arbiter of truth" policy on political ads. Thiel is a libertarian who spoke in support of Trump at the 2016 Republican National Convention and has donated millions of dollars to the Republican Party. According to the Wall Street Journal, he repeatedly advised Zuckerberg to maintain the status quo on political ads, under which most are not fact-checked. In response to recent protests, Facebook and Instagram, which Facebook owns, will begin labelling content from elected officials that they deem newsworthy.

[Image: Facebook]

Facebook's muddled response to misinformation is all the more dangerous considering that 69% of American adults use Facebook and 74% of those users (51% of the US adult population) use it daily. The company's lack of effort has resulted in open revolt inside its offices: in early June, hundreds of employees staged a walkout by refusing to sign up for work, in solidarity with the Black Lives Matter movement and in protest against the company's insistence that it not be the "arbiter of truth." Specifically, they were infuriated by the decision not to hide or fact-check Trump posts that smeared George Floyd protesters. Facebook did remove a Trump campaign advertisement that paired the fearmongering, false claim that a loose anarchist movement known as antifa was behind coordinated riots with an upside-down red triangle, a symbol notoriously used by the Nazis to identify political prisoners. Before their removal, Trump advertisements against antifa, most offering scant evidence, ran across the platform over 2,000 times between June 3rd and June 17th.

The new measures to label newsworthy political posts came largely in response to demands put forth by an online campaign known as "Stop Hate for Profit." It was organized by anti-racism NGOs after a meeting between the leaders of these organizations and Zuckerberg failed to yield any consensus. The campaign persuaded a significant number of companies, including The North Face, Verizon, Unilever and Honda, to stop placing ads on Facebook's platforms until the company announces fundamental changes to how it handles political advertisements. Its demands include establishing support for people targeted by hate, cutting off revenue from purveyors of misinformation, and increasing the safety of private groups to prevent harassment. The advertiser pullout hurt Facebook almost immediately: the company shed about 56 billion dollars in market value, roughly 8% of its market capitalization, and Mark Zuckerberg personally lost 7.2 billion dollars.

[Image: DPA]

The struggle to respond to misinformation and harassment is obviously not Facebook's alone, and other social media companies have moved to tighten their speech policies. In March 2019, Facebook and Instagram announced that they would ban content related to white nationalism and white separatism. According to Vice News, the new policy bans "content that includes explicit praise, support, or representation of white nationalism or separatism." YouTube has likewise revised its policies on hate speech and harassment and deleted conspiracy and white-nationalist channels, such as those of David Icke, Richard Spencer and Stefan Molyneux. Both companies stated that implicit content espousing these ideas is harder to remove because it is difficult to detect.

(Links: https://youtube.googleblog.com/2019/06/our-ongoing-work-to-tackle-hate.html and https://youtube.googleblog.com/2019/12/an-update-to-our-harassment-policy.html)

While Facebook finds itself in a political bind over censoring political ads that contain misinformation, it has also, along with nearly every other social media platform including YouTube, Twitter and TikTok, become a hotbed for new forms of violent right-wing organizing. Despite the policy changes, most social media companies have yet to formally address a new far-right extremist movement known as the boogaloo. According to NBC News, the movement gained momentum when gun-rights supporters and organized militiamen began using the term "boogaloo" (the name comes from a 1984 movie about breakdancing) as code for an armed rebellion and second civil war, encouraging mass violence against people or government entities that try to take their guns. The movement is now tied to multiple real acts of violence, including the murder of two law enforcement officers and the indictment of three men on terrorism-related charges for conspiring to destroy government property during a racial justice protest in Las Vegas.

[Image: AFP / Logan Cyrus]

It was not until May, more than five months after anti-racism organizations had pointed out the existence of these groups, that Facebook deplatformed some of the boogaloo groups. A report on the issue argued that terminology unrelated to known hate symbols has become a perfect blind spot in tech companies' detection of extremism: "the use of comical-meme language permits the network to organize violence secretly behind a mirage of inside jokes and plausible deniability." In a radio interview, journalist Ben Collins sobbed as he condemned the companies' failure to take meaningful action, saying that he had alerted Facebook to boogaloo activity but many groups remained active and contributed to fomenting chaos during the George Floyd protests.

Social media's sheer popularity makes it easy to spread dangerous information not just on ethnic and racial issues but also on science. Facebook groups, YouTube channels and Twitter personalities have long been among the largest hubs for anti-science misinformation, from anti-vaccine claims to climate change denial.

A psychiatry study titled "Coronavirus conspiracy beliefs, mistrust, and compliance with government guidelines in England" examined online survey data on coronavirus conspiracies and their influence on compliance with government guidelines in England, and its findings are pessimistic. The survey asked people about their level of skepticism toward official information, the cause of the virus, the supposed purpose behind any "deliberate attempt" to spread it, and the reasons for the lockdown enforced in Britain. It found that 36.2% of respondents agree, from moderately to completely, that they do not trust information from scientific experts; 38.4% agree to some degree that the virus is "manmade"; and 23.9% believe that the lockdown was imposed to "control every aspect of our lives."

Questionnaires conducted in the UK between April and May suggest that social media plays a significant role in this distrust of official explanations and government guidelines. They show a vexing trend of more people coming to believe coronavirus conspiracies, including the hoax claim that the virus was created in a lab. Belief in the "5G causes coronavirus" narrative was also markedly more common among people who get their information primarily from Facebook and YouTube than among those who rely mainly on television. There appears to be a party division on this conspiracy as well: of those who believed it in May 2020, 39% were Conservative voters and 23% Labour voters.

In a surprising turn, TikTok, the short-video app that has surged in popularity since 2018 and is used largely by people aged 18 to 34, has become new ground for conspiracy theories as well. An early worrying trend was videos of young people dancing to music under captions like "People that were never tested are added to coronavirus death toll" and "Google the number us6506148b2", a reference to a patent that anti-vaxxers often claim is a method of government mind control.

Since coronavirus conspiracy theories began proliferating on social media platforms, governments have launched inquiries into how to better counter rumors that can cause significant damage in the middle of a pandemic. Although TikTok is one of the largest hubs for the "5G causes coronavirus" conspiracy theory, which led to vandalism of cell towers in Britain, London lawmakers did not invite executives from the company to April's hearings on social media accountability. TikTok did, however, use its algorithms to add warning labels pointing users to factual information on every post related to the coronavirus.
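
TikTok has not published how this labelling works; purely to illustrate the general idea, a minimal keyword-matching sketch in Python might look like the following (the term list, banner text and function name are all hypothetical, not TikTok's actual system):

import re

# Hypothetical term list; a real system would be far broader and multilingual.
COVID_TERMS = [
    r"covid[- ]?19",
    r"coronavirus",
    r"sars[- ]?cov[- ]?2",
    r"5g.*(virus|corona)",
    r"plandemic",
]
COVID_PATTERN = re.compile("|".join(COVID_TERMS), re.IGNORECASE)

BANNER = "Learn the facts about COVID-19 from your local health authority."

def label_post(caption, hashtags):
    """Return an informational banner if the post appears to mention the coronavirus."""
    text = " ".join([caption] + list(hashtags))
    return BANNER if COVID_PATTERN.search(text) else None

# The second post gets the banner, the first does not.
print(label_post("just vibing", ["#dance"]))
print(label_post("they don't want you to know", ["#coronavirus"]))

A production system would of course rely on far more than keyword matching (classifiers, human review, multilingual coverage), but the sketch shows why labels can be applied broadly to any post that mentions the topic rather than only to posts judged false.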

[Image: TikTok]

A more dangerous conspiracy theory strangely revived by teenage users on TikTok is Pizzagate. The original conspiracy was spread by the conspiracy media outlet Infowars and fringe far-right Internet personalities who claimed, based on emails from Hillary Clinton's campaign released by WikiLeaks, that a left-leaning pizzeria owner's restaurant hosted child sexual exploitation involving Democratic lawmakers. Many of the theory's original promoters were forced to distance themselves and apologize to the owner after an armed man barged in and fired shots inside the restaurant (without causing injuries). Like most modern Internet-based conspiracies, Pizzagate has believers who communicate in clandestine codewords and symbols.

One video that obsessed these TikTok conspiracists was Justin Bieber's music video "Yummy." They believe its imagery and lyrics are coded to suggest Bieber is a survivor of elite sex abuse, because the video shows him singing and dancing surrounded by older people while children play the music. Near the end, it also shows a picture of Bieber as a child while a cake appears and immediately disappears.

[Image: YouTube / Justin Bieber]

The renewed popularity of Pizzagate is undeniable: in recent months, TikTok posts tagged #Pizzagate have been viewed 82 million times. Google searches for the term have also spiked, driven largely by videos of teenage influencers urging viewers to "search up and read it all." In an interview with journalist Will Sommer, conspiracy theory researcher Mark Andrejevic argued that these videos accumulate such reach because they are "short, cryptic, with an aura of the sensationalism that doesn't need to be cashed out in the form of actual explanations" on a platform whose "short, quick-hit format doesn't require coherence or explanation."

Researchers across fields have been studying how to restrain the spread of such harmful content. It is widely acknowledged in the academic literature that spurious content generally attracts more public engagement than mainstream news. One paper, titled "Effects of Credibility Indicators on Social Media News Sharing Intent," reported convincing results on the effectiveness of fact-checking and found that information-sharing behavior differs by party affiliation.

The study asked about 1,500 individuals of diverse backgrounds whether they would share headlines accompanied by one of four "credibility indicators," attributed to fact-checkers, news media, the public, or AI; the fact-checker indicator proved the most effective at disputing spurious information. With it, Democrats intended to share 61% fewer false headlines, compared with 40% fewer for Independents and 19% fewer for Republicans. The study closes by noting the practical limits of the approach: it may not apply to all cultural contexts, the experiment did not use a real interface like Facebook's or Twitter's, and the sharing carried no actual social consequences.
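
To make the reported percentages concrete, here is a small illustrative Python calculation; the 100-headline baseline is invented for the example and does not come from the study:

# Illustrative arithmetic only: apply the reported reductions in sharing intent
# for the fact-checker indicator to a made-up baseline of 100 false headlines
# that each group would otherwise intend to share.
REDUCTION_BY_GROUP = {"Democrats": 0.61, "Independents": 0.40, "Republicans": 0.19}
BASELINE = 100  # hypothetical baseline, not a figure from the study

for group, reduction in REDUCTION_BY_GROUP.items():
    remaining = BASELINE * (1 - reduction)
    print(f"{group}: {BASELINE} -> {remaining:.0f} false headlines still shared")

Under these assumptions, Democrats would still share about 39 of the 100 headlines, Independents about 60, and Republicans about 81, which is what a "61%, 40%, 19% reduction" means in practice.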

Under fire for these responses, Zuckerberg has once again tried to use money to show that he understands the problems with the platform's inaction, donating 10 million dollars to racial justice organizations. Media journalists like Kevin Roose are not buying it and are calling for greater accountability from these platforms. Roose argues that the gap between actual public opinion and perceived public opinion is especially potent on social networks like Facebook: data show that seven of the ten most-shared Facebook posts on topics like Black Lives Matter are critical of the movement, while nearly 60% of US adults support the protests. He wrote: "in this moment of racial reckoning, these executives owe it to their employees, their users and society at large to examine the structural forces that are empowering racists on the internet, and which features of their platforms are undermining the social justice movements they claim to support."

Political science researchers like Joseph Uscinski and Adam Enders disagree with blaming social media for conspiracies and other problematic content, arguing that the evidence does not establish causation. Conspiracy theories were widely popular before the age of the Internet and social media: in a 1975 survey, 80% of Americans believed in one form of Kennedy assassination conspiracy theory or another, a level of agreement that most newer conspiracies, including those about the coronavirus, do not approach. And episodes of mass hysteria such as the Salem witch trials and antisemitic pogroms no longer command majorities of almost any group, regardless of education, ethnicity, sex or age. People's willingness to believe conspiracy theories has evolved in the age of social media; it is a complicated problem with no quick fix, and it certainly would not end with the disappearance of social media apps, which in fact make fact-checking more accessible.
