MENLO PARK (California) • Facebook said last Wednesday it would take “stronger” action against people who repeatedly share misinformation on the platform.

Facebook will reduce the distribution of all posts in its news feed from a user account if it frequently shares content that has been flagged as false by one of the company’s fact-checking partners, the social media giant said in a blog post.

The company already does this for Pages and Groups that post misinformation, but it had not previously extended the same policy to individual users. Facebook declined to specify how many times a user’s posts have to be flagged before the new penalty kicks in.

The California-based company will also start showing users a pop-up message if they click to “like” a page that routinely shares misinformation, alerting them that fact-checkers have previously flagged that page’s posts. “This will help people make an informed decision about whether they want to follow the Page,” the company said.

False claims and conspiracies have proliferated on social media platforms, including Facebook and Twitter, amid the coronavirus pandemic. “Whether it’s false or misleading content about Covid-19 and vaccines, climate change, elections or other topics, we’re making sure fewer people see…