Facebook has said it will reduce the reach of all posts from people who repeatedly share misinformation, extending penalties that already apply to Pages, Groups, Instagram accounts, and domains under its fact-checking program to individual accounts.
The social network said late Wednesday that the new policy applies to false or misleading content about Covid-19 and vaccines, climate change, elections, and other topics, with the goal of reducing misinformation across its family of apps.
“Starting today, we will reduce the distribution of all posts in News Feed from an individual’s Facebook account if they repeatedly share content that has been rated by one of our fact-checking partners. We already reduce a single post’s reach in News Feed if it has been debunked,” Facebook said.
The company already notifies users when they share content that is later rated by a fact-checker, and it has redesigned these notifications to make it clearer when this happens.
The notification links to the fact-checker’s article disputing the claim and prompts users to share the article with their followers.
“It also contains a warning that users who consistently publish misleading information may have their posts pushed down in the News Feed, making them less visible to other users,” the social network said.
In late 2016, the business started its fact-checking initiative.
“We’ve strengthened our enforcement against Pages, Groups, Instagram accounts, and websites that spread misinformation, and we’re now extending some of these efforts to include penalties against individual Facebook accounts,” the company added.