"Fake news": Facebook and Google accused of negligence after the Las Vegas massacre



The two online platforms automatically relayed false information and far-right propaganda after the Las Vegas shooting, the deadliest mass shooting in US history.

For months, Facebook and Google have been making announcement after announcement about their fight against fake news. These good intentions were swept away in a matter of hours. After the Las Vegas shooting, which left at least 59 people dead and more than 500 injured, the two web giants were accused of promoting propaganda and false information about the killing.
Google News, Google's module for indexing media content, for example, promoted threads from the US forum 4chan that accused an innocent person of being responsible for the shooting. Safety Check, the Facebook feature meant to provide information during dramatic events or disasters, was overrun by disinformation sites, spam, and even people soliciting donations in the virtual currency Bitcoin.
The flaws of algorithms

This case is deeply embarrassing for Google and Facebook, which were already accused of allowing the spread of "fake news" (deliberately false content or propaganda) during the American presidential election. The two online giants made many promises at the time and announced better tools to control the quality of the content promoted on their platforms. The Las Vegas massacre nevertheless shows that they still do not know how to fight their biggest weakness: themselves.
Facebook and Google are huge platforms, largely governed by algorithms. These automatically surface the content deemed most relevant to users in a given location. Google News, for example, is supposed to display articles from reliable sources that match current events. Facebook's news feed orders the posts of our friends and of the pages we follow, relying on many signals to guess what we would most like to see first. Safety Check is a tool launched in 2015 by the social network so that people caught in dangerous situations can signal to their relatives that they are safe. Facebook recently added to it a page that, in theory, gathers publications useful to victims, tools for collecting donations, and press articles on the ongoing disaster. It was this system that was overrun by fake sites, which managed to get picked up and promoted by the social network's algorithms.
This is not the first time that Facebook has had problems with Safety Check. The feature was originally activated by hand by the social network's employees. It was automated after much criticism accusing Facebook of turning the tool on only for disasters in Western countries, and not elsewhere in the world. That automation has made the platform vulnerable to disinformation and spam sites. In 2016, Safety Check told its Thai users that a bomb had exploded in Bangkok; the alert was false, triggered by old articles that had been mass-republished by malicious Internet users.
Platform Responsibility

Asked by US media about these latest errors, Google apologized, saying that "this result [4chan] should not have appeared." Facebook said it "terribly regrets the confusion caused" by the appearance of false information on its Safety Check platform. These problems are anything but trivial. Platforms such as Google News and Safety Check have massive audiences, and that audience is a crucial element in the economy of "fake news" sites, which live off clicks and the people who will view their ads.
The platforms built by Google and Facebook are automated, but not out of control. The web giants make choices when they shape their algorithms. Some of those choices are surprising, such as the fact that 4chan, a non-journalistic and regularly controversial forum, is not excluded from Google News. "Google and Facebook shift responsibility onto their algorithms, as if they did not control their own code," writes the American site The Outline. Despite recurring controversies, the online platforms are still struggling to improve. Last month, Facebook was accused of allowing the purchase of advertisements targeted at people who "hate Jews". "This is what happens when you leave your quality standards to algorithms," commented Walt Mossberg, co-founder of Recode and a respected figure in the technology world. The stakes differ, but the problem has not changed.
