Facebook beefs up tools to keep group discussions civil

New features will make it easier for moderators to keep online conversations healthy.

Facebook is aiming to reduce disharmony as people with differing viewpoints come together.

Facebook has beefed up automated tools to assist group moderators in keeping exchanges civil.

The social network, hammered by critics over vitriolic content in news feeds, has played up groups as enclaves where people with differing views can bond based on shared interests.

“Some of these groups are millions of people,” Facebook engineering vice-president Tom Alison said.

More than 1.8 billion people use groups every month and there are more than 70 million administrators working to “keep conversations healthy” in the forums, according to Alison.

While much group content is not public, Facebook has been looking at ways to control hateful and abusive elements in these forums.

Automated systems at Facebook already check for posts in groups that violate the social network’s rules about what content is acceptable.

A new “Admin Assist” feature lets moderators set criteria for what is considered acceptable in the group, and then have posts or comments automatically checked for violations.

Moderators can also use software to eliminate comments with links to unwanted content, slow down heated conversations, or require that people be members of the group for a certain period before being able to join conversations.
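As a purely hypothetical illustration of how such moderator-set criteria could work, the sketch below checks a comment against three rules of the kind described above: blocked link domains, a minimum membership period, and a word blocklist. This is not Facebook's actual Admin Assist implementation or API; all names here are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Comment:
    author_tenure_days: int   # how long the author has been a group member
    links: list               # URLs contained in the comment
    flagged_words: list       # words matching the group's blocklist

def violates_rules(comment, blocked_domains, min_tenure_days):
    """Return a list describing which moderator-set rules a comment breaks."""
    violations = []
    # Rule 1: reject links pointing to domains the moderators have blocked.
    if any(any(d in url for d in blocked_domains) for url in comment.links):
        violations.append("blocked link")
    # Rule 2: require a minimum membership period before joining conversations.
    if comment.author_tenure_days < min_tenure_days:
        violations.append("member too new")
    # Rule 3: reject comments containing blocklisted words.
    if comment.flagged_words:
        violations.append("blocked words")
    return violations

c = Comment(author_tenure_days=2,
            links=["http://spam.example/offer"],
            flagged_words=[])
print(violates_rules(c, blocked_domains={"spam.example"}, min_tenure_days=7))
# → ['blocked link', 'member too new']
```

A real system would of course also route flagged items to human moderators rather than act on rules alone, which matches Alison's point that the tools automate what admins already did manually.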

“What these tools do is automate things admins did manually, but not expose anything they didn’t have access to before,” Alison said.

Facebook is also testing artificial intelligence (AI) that watches for indications of conversations getting nasty before alerting moderators.

“The AI looks at things associated with threads that have conflicts,” Alison said.

“Some admins welcome people having debates; others don’t want contentious conversations.”

A group called Dads With Daughters is among those given early access to test the tools. The online community for fathers to share advice and resources on raising girls has more than 127,000 members.

The new tools have reduced the number of moderators needed in the group, according to moderator Brian Anderson.

A firm stand by moderators against “toxic masculine tropes”, such as toting shotguns to protect daughters from suitors, was credited with helping to establish a healthier tone within the community.

“You can tell the groups that are really putting in the effort to keep it a civil space, versus posts that just get nasty right away,” Anderson added.
