The decision comes after a campaign group found that roughly 40 percent of coronavirus claims debunked by fact-checkers remained on the platform.
By MARK SCOTT

LONDON — Millions of Facebook users will soon be told if they saw online posts containing misinformation about the coronavirus pandemic, after the social networking giant announced Thursday its latest plans to contain the spread of rumors, half-truths and lies connected to the public health crisis.
The move, which will start over the next three weeks, represents a major step by Facebook: an acknowledgment that its efforts to scrub the platform of falsehoods related to the coronavirus have not been sufficient to stop millions of people from sharing, liking and engaging with misinformation.
"Through this crisis, one of my top priorities is making sure that you see accurate and authoritative information across all of our apps," Mark Zuckerberg, the company's chief executive, wrote on his Facebook page.
The decision comes, in part, after the campaign group Avaaz discovered that more than 40 percent of the coronavirus-related misinformation it found on Facebook remained on the platform even after fact-checking organizations working with the tech giant had debunked the posts and told the company they were false.
In total, Avaaz said that these fake social media posts, everything from advice about bogus medical remedies for the virus to claims that minority groups were less susceptible to infection, had been shared, collectively, 1.7 million times on Facebook in six languages.
"Facebook, given its scale, is the epicenter for misinformation," Fadi Quran, Avaaz's campaign director, told POLITICO, adding that the company's efforts to combat the problem had steadily improved since the social network announced it would do all it could to stop the spread of such life-threatening falsehoods.
Facebook said Thursday that its existing steps, including pinning government public health warnings to the top of people's news feeds, had led to 350 million people worldwide clicking through to authoritative sources in search of accurate information.
"Facebook should be proud of this step," Quran added, referring to the company's decision to retroactively notify people that they had seen misinformation. "But the step doesn't reflect the full gamut of what we would like to see them do."
This smacks of Big Brother: "We're doing this for your own good."
I say, "Let the fools suffer the consequences of their foolishness. It's the only way they'll learn."