Facebook Will Stop Recommending Health Groups

Facebook CEO Mark Zuckerberg is applauded as he delivers the opening keynote introducing new Facebook, Messenger, WhatsApp, and Instagram privacy features at the Facebook F8 Conference at McEnery Convention Center in San Jose, California on April 30, 2019.
AMY OSBORNE/Getty

The Verge reports that Facebook plans to stop recommending health-related groups to users in an attempt to slow the spread of misinformation on its platform.

In a recent article, The Verge claims that social media giant Facebook plans to add new rules to its Groups feature aimed at stopping the spread of misinformation on its platform. One of these new rules is that Facebook will no longer suggest health-related groups to users.

The Verge reports:

Facebook is adding new rules intended to slow the spread of misinformation and other harmful content on its Groups feature. From now on, if a group is removed for violating Facebook’s policies, its members and administrators will be temporarily unable to create any new groups. If a group has no administrators, it will be archived. And Facebook won’t include any health-related groups in its recommendations.

Facebook groups have been blamed for spreading misinformation and conspiracies, particularly when Facebook’s recommendation algorithm promotes them. And the company’s new rules expand on existing efforts to police them. Admins were already barred from creating a new group similar to a banned one, for instance.

Some of the new policies encourage more active administration of groups. If administrators step down, they can invite members to take their place; if nobody does, Facebook will apparently “suggest” admin roles to members, then archive the group if that fails. Also, if group members accrue a community standards violation, moderators will have to approve all their posts for 30 days. If the moderators repeatedly approve posts that violate Facebook’s guidelines, the group could be removed.

Facebook appears to be taking a broad approach when it comes to health guidelines, stating that although groups can “be a positive space for giving and receiving support during difficult life circumstances … it’s crucial that people get their health information from authoritative sources.”

Facebook has focused on limiting the spread of anti-vaccination content in the past and, more recently, coronavirus misinformation. The company has done this by adding contextual information to posts that discuss the coronavirus and placing banners on vaccination-related groups and pages.

Read more at the Verge here.

Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship. Follow him on Twitter @LucasNolan or contact via secure email at the address lucasnolan@protonmail.com