Conversations can quickly spiral out of control online, so Facebook is launching a new experimental Conflict Alerts feature to help with that.

Basically, it's an AI moderator designed to keep an eye on group fighting in its forums.

Here is how it works...

If the AI detects contentious or unhealthy conversations in the group, it will alert a group administrator, who will then take action.

The administrator could limit how frequently certain group members post comments and determine how quickly comments can be made on individual posts.

Of course, there is concern about AI doing this work because it sometimes doesn't understand context in posts. But Facebook, which has nearly 3 billion monthly users, says the tool will use several signals from conversations before alerting administrators. The goal is to create a civil place where users can share with their friends without fear of being judged or bullied.

Along with the AI, Facebook rolled out new software tools, including Admin Home, a one-stop shop for managing a community. With the new experience, admins can see what needs attention across posts, members and reported comments, and find key tools through a reorganized layout that shows what's available under each category.

The announcements were made on a Facebook blog post Wednesday.


All content © 2021, WALA; Mobile, AL. (A Meredith Corporation Station). All Rights Reserved.
