The social network will start displaying a pop-up that appears every time a user shares a link to content about COVID-19.
The notification will include a link to Facebook’s coronavirus information center, along with details about the article’s age and when it was first distributed. Facebook released a similar update last month that shows comparable warnings when users share older news articles.
Facebook hopes this will slow the spread of outdated or less credible information. “The notification will help people understand how recent the news is, but also the source of the content before they share it,” Facebook wrote in an update.
Facebook has struggled to contain viral misinformation about the pandemic. The company revealed that it removed 7 million posts about the coronavirus and added fact-checking labels to 98 million more.
Facebook’s effort to combat misinformation
Between April and June, Facebook deleted no fewer than 7 million Covid-19-related posts from the world’s largest social network. In addition to the posts removed from the platform, roughly 98 million others were classified as false. Because they were not necessarily dangerous, these posts received only a warning label directing readers to seek information on the subject elsewhere.
The statistics were provided alongside the new set of standards published by Facebook, available here, which explains when posts are deleted. Notably, in previous years the company avoided detailing misinformation in its reports, but it made an exception for the coronavirus because it represents an “imminent danger”.