Content and community moderation on social media and online platforms can be quite a challenge. To address the matter, Discord has unveiled a new content moderation tool. In a blog post, Discord said, “Starting today, we’re introducing AutoMod, a new moderation tool built directly into Discord to help protect Communities so everyone can find belonging within a safeguarded environment.”


How will AutoMod work, and what impact will it have on users?

Discord says it believes moderating a Community should feel rewarding and fulfilling, and it wants to make it easier for moderators to maintain a clean space for members by empowering them with better tools. “AutoMod is one of the things we have built alongside other safety initiatives to help keep Discord safe for all Communities,” the blog post noted. As a server grows, mods and admins might find themselves dedicating an increasing amount of time to manually keeping their server clear of unwanted behaviour and content, said Discord.
AutoMod comes equipped with keyword filters that automatically detect and block messages containing harmful words or phrases before they’re posted, and alert moderators when that happens. It can also automatically Time Out users who trigger a filter, preventing them from posting further until the timeout ends, so the moderation team can handle the situation when they’re ready.
AutoMod is available now on Windows, macOS, Linux, iOS, Android, and the web app. If you manage a Community server, you’ll find AutoMod within Server Settings. If your server isn’t a Community server yet, you can enable the features within Server Settings > Community, then AutoMod will be available to you.


