Facebook To Take Action Against Users Who Share Misinformation Repeatedly

by Dikhyaa Mohanty

Facebook will take strong action against users who repeatedly share misinformation on the social media platform.

According to a Reuters report, the social media platform will take serious action against users who repeatedly share misinformation on Facebook.

In a blog post, Facebook stated that it will reduce the distribution of all posts in its news feed from a user account if it is found to be frequently sharing content that has been flagged as false by one of the company’s fact-checking partners.

The report further stated that the platform will also introduce ways to inform people when they are interacting with content that has been rated by a fact-checker.

Various social media platforms have been used to spread false news and claims, and such content has multiplied since the outbreak of the COVID-19 pandemic last year. False claims, conspiracies, and similar content have circulated on platforms like Facebook and Twitter.

Facebook's statement added that whether the false or misleading content concerns COVID-19 and vaccines, climate change, elections, or other topics, the company will make sure fewer people see misinformation on its apps.

Between October and December 2020, the social media platform took down 1.3 billion fake accounts ahead of an inspection by the US House Committee on Energy and Commerce into how technology platforms are tackling misinformation.

