It's still "early access", but it's stable and should be fit for everyday use.
I'd be really happy to get some feedback on what kind of features mods would like to see.
If you want to try it in action, go to [email protected]. That's the testing community where it currently filters posts with duplicate URLs, as well as mentions of Reddit, Lego, and other beings-who-must-not-be-named. Feel free to post stuff there and see it get automatically moderated.
The GitHub link has a detailed description of what it can do.
Basically, there are triggers and actions.
Currently there are these triggers:
Find posts with duplicate URLs
Find posts/comments that match a regex pattern. Posts can match on URL, title, and content; comments only on their text. This trigger can also be used to find posts/comments that don't match the regex.
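To give a rough idea of what the regex trigger does, here's a minimal sketch in plain Python (not the bot's actual code; the field names are just assumptions for illustration), using a pattern like the Reddit/Lego filter running in the testing community:

```python
import re

# Hypothetical pattern, similar to the filter in the testing community:
# case-insensitive match on the words-which-must-not-be-named.
PATTERN = re.compile(r"(?i)\b(reddit|lego)\b")

def post_matches(post: dict) -> bool:
    """Posts are checked on URL, title and body."""
    fields = (post.get("url"), post.get("title"), post.get("body"))
    return any(PATTERN.search(f) for f in fields if f)

def comment_matches(comment: dict) -> bool:
    """Comments can only be checked on their text."""
    text = comment.get("content")
    return bool(text and PATTERN.search(text))
```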
Actions:
Post a comment under the post/comment
Lock a post
Remove a post/comment. For this action you can also provide a removal reason.
The actions that create text (posting a comment and the removal reason) let you fill in any information that the REST API returns for the target post/comment. This allows you to e.g. address the user by name or link to the already existing post with the duplicate URL.
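As a rough illustration of what that filling-in could look like (a sketch only; the placeholder syntax and field names here are made up, not necessarily what the bot actually uses):

```python
# Hypothetical removal-reason template; the {placeholders} stand for
# fields taken from the API response for the post being acted on.
TEMPLATE = (
    "Hi {creator[name]}, your post was removed because this URL "
    "was already posted here: {original[url]}"
)

def render_reason(creator: dict, original: dict) -> str:
    # str.format pulls the named fields out of the API data.
    return TEMPLATE.format(creator=creator, original=original)

# Example with made-up data:
print(render_reason(
    creator={"name": "alice"},
    original={"url": "https://feddit.example/post/123"},
))
```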
I just had the stupid idea that we could try to create something like r/place. Basically people could comment coordinates and a color as x,y,#HEX, and the bot would need to validate that syntax and delete comments that fail validation, as well as comments from posters who have already commented.
The magic of turning this into an image would have to happen elsewhere though.
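For what it's worth, the syntax check itself would be simple; a quick sketch of the validation (made-up comment format, just to show the idea):

```python
import re

# A comment would have to look like "x,y,#HEX", e.g. "12,34,#ff00aa".
PIXEL_RE = re.compile(r"^\s*(\d+)\s*,\s*(\d+)\s*,\s*#([0-9a-fA-F]{6})\s*$")

seen_authors: set[str] = set()  # people who already placed a pixel

def keep_comment(author: str, text: str) -> bool:
    """Return True to keep the comment, False if the bot should remove it."""
    if author in seen_authors:
        return False            # only one pixel per person
    if not PIXEL_RE.match(text):
        return False            # malformed coordinates/color
    seen_authors.add(author)
    return True
```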
Hah, that wasn't meant as a mission. Just an idea I had watching the fun over on the other side. And I'm wondering if Feddit as a host might be too low-bandwidth to do it anyway.