this post was submitted on 04 Feb 2024
70 points (98.6% liked)

Meta (slrpnk.net)

763 readers

Here we can discuss anything about this Lemmy instance/server itself.

Our XMPP support chat: Movim or XMPP client.

Please also refer to our Wiki

founded 3 years ago

I don't know if you need this info, but I was pretty disturbed to see unexpected child pornography on a casual community. Thankfully it didn't take place on SLRPNK.net directly, but if anyone has any advice besides leaving the community in question, let me know. And I wanted to sound an alarm to make sure we have measures in place to guard against this.

[–] [email protected] 18 points 2 years ago (10 children)

Unfortunately I saw it as well while scrolling, and reported it. What's the motivation behind posting fucked up shit like that?

[–] [email protected] 17 points 2 years ago (9 children)

I don't know the specifics, but trolling is trolling. It's experimenting with ways of breaking things. Not only do they probably find it funny, but if this isn't handled it can kill the platform. If they saw that Lemmy.World was defederated and shut down, that would make their day.

The point is that we need basic security measures to keep Lemmy functioning. I don't think this is just an issue of moderator response times. We need posts like that to get deleted after 10 people downvote them, and we need limits on how easily new accounts can get into everyone's front page feed.
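
To make the second idea concrete, here is a rough sketch in Python of capping how far a brand-new account's post can climb in the feed. Everything here is invented for illustration; these are not real Lemmy settings, and Lemmy itself is written in Rust.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

# Hypothetical thresholds -- not actual Lemmy settings.
MIN_ACCOUNT_AGE = timedelta(days=7)
NEW_ACCOUNT_RANK_CAP = 50

@dataclass
class Post:
    author_created: datetime
    score: int

def feed_rank(post: Post, now: datetime) -> int:
    """Cap the feed rank of posts from brand-new accounts.

    Established accounts rank by score as usual; accounts younger than
    MIN_ACCOUNT_AGE are capped so a fresh troll account cannot shoot
    straight onto everyone's front page.
    """
    if now - post.author_created >= MIN_ACCOUNT_AGE:
        return post.score
    return min(post.score, NEW_ACCOUNT_RANK_CAP)

if __name__ == "__main__":
    now = datetime.now(timezone.utc)
    fresh = Post(author_created=now - timedelta(hours=2), score=400)
    old = Post(author_created=now - timedelta(days=90), score=400)
    print(feed_rank(fresh, now))  # 50 -- capped
    print(feed_rank(old, now))    # 400
```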

[–] [email protected] 11 points 2 years ago (3 children)

It should be based on reports, and limited to users with some form of track record on the platform: account age, posts made some time earlier, X likes received, and similar measures to make sure the reporter is not problematic.

Downvotes are a bad measure. They are often just cast by somebody who disagrees with a post, which is not in itself a problem. Also, a threshold of 10 is really low once something takes off: on c/meme half the posts have more than 10 downvotes, and nothing there is really all that bad.
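
As a sketch of what weighting reports by track record could look like (all names and thresholds invented for illustration, not existing Lemmy behaviour):

```python
from dataclasses import dataclass
from datetime import timedelta

# Invented threshold -- tune to taste, this is not real Lemmy config.
HIDE_THRESHOLD = 5.0

@dataclass
class Reporter:
    account_age: timedelta
    post_count: int
    karma: int

def report_weight(r: Reporter) -> float:
    """Weight a report by the reporter's track record.

    Brand-new or inactive accounts contribute nothing, so a troll with
    throwaways cannot trigger removals; an established account counts
    as roughly one full report.
    """
    if r.account_age < timedelta(days=1):
        return 0.0
    weight = 0.0
    if r.account_age >= timedelta(days=30):
        weight += 0.5
    if r.post_count >= 10:
        weight += 0.25
    if r.karma >= 50:
        weight += 0.25
    return weight

def should_hide(reporters: list[Reporter]) -> bool:
    """Hide a post once the trust-weighted report mass crosses the bar."""
    return sum(report_weight(r) for r in reporters) >= HIDE_THRESHOLD

if __name__ == "__main__":
    throwaways = [Reporter(timedelta(hours=1), 0, 0)] * 20
    regulars = [Reporter(timedelta(days=400), 50, 300)] * 5
    print(should_hide(throwaways))  # False: 20 fresh accounts count for nothing
    print(should_hide(regulars))    # True: 5 established accounts are enough
```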

[–] [email protected] 6 points 2 years ago (1 children)

The best suggestion I have seen is to have a specific report category for CSAM. If a post is reported for CSAM x number of times, the post is hidden for moderator review. If it is a false report, the mod bans the reporting accounts.
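
A sketch of that flow (hypothetical threshold and names; Lemmy's actual report handling may differ):

```python
from dataclasses import dataclass, field
from enum import Enum, auto
from typing import Callable

CSAM_HIDE_THRESHOLD = 3  # the hypothetical "x number of times"

class ReportCategory(Enum):
    SPAM = auto()
    HARASSMENT = auto()
    CSAM = auto()

@dataclass
class Post:
    id: int
    hidden_pending_review: bool = False
    csam_reporters: set[str] = field(default_factory=set)

def file_report(post: Post, reporter: str, category: ReportCategory) -> None:
    """Pull a post from feeds once it collects enough CSAM reports.

    The post is only hidden, not deleted: a moderator makes the final
    call, and deliberate false reports can be punished.
    """
    if category is not ReportCategory.CSAM:
        return  # other categories stay in the normal report queue
    post.csam_reporters.add(reporter)
    if len(post.csam_reporters) >= CSAM_HIDE_THRESHOLD:
        post.hidden_pending_review = True

def resolve_review(post: Post, false_report: bool, ban: Callable[[str], None]) -> None:
    """Moderator decision: on a false report, restore the post and ban
    the accounts that reported it."""
    if false_report:
        post.hidden_pending_review = False
        for account in post.csam_reporters:
            ban(account)
```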

Another issue is that post links can be edited. Trolls will definitely use this feature for abuse.
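
One possible guard, again just a sketch of an idea rather than anything Lemmy does today: treat a URL edit as a brand-new submission that has to clear review again.

```python
from dataclasses import dataclass

@dataclass
class LinkPost:
    url: str
    approved: bool = True  # assume it already passed whatever review exists

def edit_url(post: LinkPost, new_url: str) -> None:
    """Treat a changed link like a brand-new submission.

    A troll can post an innocuous link, collect upvotes, then swap in
    something abusive, so an edited URL loses its approved status and
    re-enters the review queue instead of inheriting the post's standing.
    """
    if new_url != post.url:
        post.url = new_url
        post.approved = False
```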

[–] [email protected] 1 points 2 years ago

Ranking algorithms need to be adjusted so that if a post is removed like this and then restored, it gets the same number of views it otherwise would have. Without that, any user-interaction-driven automatic removal will get abused at scale.
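
For example, Lemmy-style hot ranks decay with post age, so one simple compensation is to shift the restored post's effective publish time forward by however long it was hidden. A sketch with illustrative constants (the real ranking formula may differ):

```python
import math
from datetime import datetime, timedelta

def hot_rank(score: int, published: datetime, now: datetime) -> float:
    """Score boosted logarithmically, decayed by a power of post age.

    This mimics the general shape of Lemmy's hot rank; the constants
    are illustrative, not the real ones.
    """
    hours = (now - published).total_seconds() / 3600
    return math.log10(max(1, score + 3)) / (hours + 2) ** 1.8

def restored_publish_time(published: datetime, hidden_for: timedelta) -> datetime:
    """Shift a restored post's effective publish time forward by the
    time it spent hidden, so it re-enters the feed at the rank it would
    have had instead of having silently aged out."""
    return published + hidden_for
```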
