My approach would be to match the report message against certain keywords ("porn", "CSAM", "child" in various languages, or a textbox where mods can enter their own words) and hide the image by default when the message contains any of them, with a click-to-expand control so moderators can actually verify reports.
I don't think relying on the NSFW flag alone is very useful against abuse, though it could be a good feature for moderators of instances that have NSFW communities.
I think this is valid, as sometimes we don't want to see the NSFW content, or have it downloaded, when we are just viewing a list of reports.
There are a couple of things that could help with this:

- Setting: "Display content preview for NSFW posts"
  - Default this option to the user settings "blur_nsfw" or "hide_nsfw".

Stretch goal:

- Setting: "Hide content preview with reports containing "
  - This would allow hiding the content preview on reports that contain a matching word in the body of the user's report submission.
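A minimal sketch of how the two settings above could combine into a single hide/show decision. All names here (`ReportView`, `UserSettings`, `shouldHidePreview`, `report_hide_keywords`) are assumptions for illustration, not the actual lemmy-ui API:

```typescript
// Hypothetical types; the real lemmy-ui structures differ.
interface UserSettings {
  blur_nsfw: boolean;             // existing NSFW user setting (assumed shape)
  hide_nsfw: boolean;             // existing NSFW user setting (assumed shape)
  report_hide_keywords: string[]; // proposed: moderator-entered words
}

interface ReportView {
  postNsfw: boolean; // NSFW flag on the reported post
  reason: string;    // body of the user's report submission
}

// Decide whether to collapse the content preview behind click-to-expand.
function shouldHidePreview(report: ReportView, settings: UserSettings): boolean {
  // Default from the user's existing NSFW preferences.
  if (report.postNsfw && (settings.blur_nsfw || settings.hide_nsfw)) {
    return true;
  }
  // Stretch goal: hide when the report reason contains any configured keyword,
  // case-insensitively.
  const reason = report.reason.toLowerCase();
  return settings.report_hide_keywords.some((kw) =>
    reason.includes(kw.toLowerCase())
  );
}
```

The keyword check is deliberately case-insensitive, since reporters will not use consistent casing; a real implementation would also want the click-to-expand state handled per report in the UI.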
https://lemmy.tgxn.net/comment/305035