Facebook's common good

It is hard to keep up with the stream of scandals, big and small, involving social networks such as Facebook and Twitter. From unwittingly aiding Russian efforts to subvert elections to finding themselves exploited by extremists and pornographers, they are constantly in trouble.
The latest is YouTube failing to stop videos of children being commented on by paedophiles, while letting advertisements appear alongside them. Only months after Alphabet’s video platform faced an advertiser boycott over extremist videos and had to apologise humbly, companies such as Diageo and Mars are again removing ads.
Each scandal produces fresh calls for the networks to be treated like news publishers, which are responsible for everything that appears under their names. Each one forces them to tighten their "community standards" further and to hire more content checkers. By next year, Facebook intends to employ 20,000 people in "community operations", its censorship division.
Trying hard to be good. Mark Zuckerberg at Facebook’s Social Good forum in New York this week. PHOTO: AP
Tempting as it is for publications that have lost much of their digital advertising to internet giants to believe the networks should be treated as exact equivalents, the argument is flawed: Facebook is not just a newspaper with 2.1 billion readers. But being a platform does not absolve them of responsibility. The opposite, in fact: it makes their burden heavier.
A better way to think of Russian political ads, extremist videos, fake news and all the rest is as pollution of common resources, albeit ones that are privately owned. The term for this is the tragedy of the commons: open ecosystems that are freely shared by entire communities tend to get despoiled.
Garrett Hardin, the US ecologist and philosopher who coined the phrase in 1968, warned that "the inherent logic of the commons remorselessly generates tragedy", adding gloomily: "Ruin is the destination toward which all men rush, each pursuing his own best interest in a society that believes in the freedom of the commons."
His prime example was the overgrazing of common land, which occurred when the number of farmers and shepherds seeking free feed for their animals became too high. He also cited companies polluting the environment with sewage, chemicals and other waste rather than cleaning up their own mess. Rational self-interest led to the commons becoming barren or dirty.
Here lies the threat to social networks. They set themselves up as commons, offering open access to hundreds of millions of people to publish "user-generated content" and share photos with others. That in turn produced a network effect: people needed to use Facebook or its rivals to communicate.
But they attract bad actors as well: people and organisations that exploit free resources for money or out of perverted motives. These are the polluters of the digital commons, and with them come over-grazers, those guilty of lesser sins such as shouting loudly to gain attention or attacking others.
As Hardin noted, this is inevitable. The digital commons fosters communal benefits that go beyond what a publisher in the traditional sense can offer. The fact that YouTube is open and free allows all kinds of creativity to flourish in ways the entertainment industry does not enable. The tragedy is that it also empowers pornographers and propagandists for terror.
So when Mark Zuckerberg, Facebook's founder, denounced Russia's fake news factory ("What they did is wrong and we're not going to stand for it"), he sounded like the police chief in Casablanca who professes to be shocked that gambling is going on in a casino. Mr Zuckerberg's mission of "bringing us all together as a global community" is laudable, but it invites trouble.
Hardin was a pessimist about commons, arguing that there was no technical solution and that the only remedy was "mutual coercion, mutually agreed upon by the majority". The equivalent for Facebook, Twitter and YouTube would be to become much more like publishers, imposing tight rules about entry and behaviour rather than maintaining their current openness.
They resist this partly because it would bring stricter legal liability and partly because they want to remain as commons.
But every time a scandal occurs, they have to reinforce their editorial defences and come closer to the kind of content monitoring that would change their nature.
It would cross the dividing line if they reviewed everything before allowing it to be published, rather than removing offensive material when alerted. Defying Hardin, they aspire to a technical solution: using artificial intelligence to identify copyright infringements and worse before their users or other organisations flag them for review.
More than 75 per cent of the extremist videos taken down by YouTube are identified by algorithms, while Facebook now automatically finds 99 per cent of the ISIS and al-Qaeda material it removes. It is like having an automated fence around a territory to sort exploiters from legitimate entrants.
Machines cannot solve everything, though. If they could exclude all miscreants, the commons would turn into something else. The vision of an unfettered community is alluring, but utopias are always vulnerable.