This post was submitted on 02 Jun 2025
113 points (100.0% liked)

Technology

all 12 comments
[–] db0 40 points 1 month ago (1 children)

I wonder what kind of CSAM detection they have. If they're only relying on hash matching, they're gonna get fucked by novel genAI CSAM. This is why stuff like fedi-safety exists, which they could use as well.
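Hash matching, in a nutshell: a file is only flagged if its fingerprint is already in a database of known material. A minimal sketch of the exact-hash variant (the hash set here is a made-up placeholder), showing why freshly generated content never matches:

```python
import hashlib

# Placeholder: fingerprints of previously catalogued material.
KNOWN_BAD_HASHES = {"<hash of a known image>"}

def is_known_bad(image_bytes: bytes) -> bool:
    # Exact hashing: a novel or even slightly re-encoded image yields a
    # digest that has never been seen, so it sails straight through.
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_BAD_HASHES

print(is_known_bad(b"freshly generated image bytes"))  # False: never catalogued
```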

[–] [email protected] 19 points 1 month ago (1 children)

It seems to be unspecified "automated and manual" systems plus reports from the NCMEC (https://lemm.ee/post/65739566/20890503), which they process quite fast (https://lemm.ee/post/65739566/20890630).

[–] db0 10 points 1 month ago (1 children)

Sorry, your links don't seem to work. Maybe those posts were deleted.

In any case, if their "automated" system is just hash matching, it's not going to cut it.

[–] [email protected] 10 points 1 month ago (1 children)

Just guessing what the links may have been...

Possibly my post on lemmy.world, removed due to breaking rule 2, "Only tech related news or articles"

I'll copy-paste my comment from there:

In the reply to Patreon they mentioned having some automated and manual ways of removing CSAM, plus "closely working with NCMEC", but I have no idea what that means.
And these statistics of resolved reports: https://www.missingkids.org/content/dam/missingkids/pdfs/cybertiplinedata2024/2024-notifications-by-ncmec-resulting-content-removal.pdf

A total of 128 reports, resolved in 1.91 days on average. That's less than half the time taken by Amazon, Google, and Microsoft (for Bing).

The other link might have been to this comment:

[–] db0 10 points 1 month ago* (last edited 1 month ago) (1 children)

"Having manual ways to remove csam" means almost nothing. All of lemmy has a "manual way to remove csam". "closely working with NCMEC" can mean they just use the cloudflare mechanism which is just hash matching. Point is, it's very easy for a malicious actor to upload csam and then report them to patreon for it, without ever reporting it to them.

[–] Redjard 4 points 1 month ago (1 children)

Can't you always attempt uploads until one bypasses an arbitrary filter, and then report-snipe on that?
How would a content-based filter prevent this, if the malicious actor simply needs to upload correspondingly more images?

I think the sad reality is that the only escape here is scale. Once you have been hit by this attack and been cleared by the third parties, you'd have precedent for when it happens again, and should hopefully be placed in a special bin for better treatment.
Scale means you will be fire-tested, and are more likely to receive sane treatment instead of the AI-support special.

[–] db0 4 points 1 month ago

There could be a warning when someone is caught making multiple failed attempts.
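A minimal sketch of that kind of warning, assuming a hypothetical threshold, look-back window, and notify_admins hook:

```python
import time
from collections import defaultdict, deque

FAIL_THRESHOLD = 3     # rejected uploads before alerting (placeholder value)
WINDOW_SECONDS = 3600  # look-back window (placeholder value)

# uploader id -> timestamps of their rejected uploads
failed_uploads: dict[str, deque] = defaultdict(deque)

def notify_admins(uploader: str, count: int) -> None:
    # Stand-in for whatever alerting an instance actually uses.
    print(f"warning: {uploader} had {count} rejected uploads in the last hour")

def record_rejection(uploader: str) -> None:
    now = time.time()
    q = failed_uploads[uploader]
    q.append(now)
    # Drop rejections that have aged out of the window.
    while q and now - q[0] > WINDOW_SECONDS:
        q.popleft()
    if len(q) >= FAIL_THRESHOLD:
        notify_admins(uploader, len(q))
```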

[–] [email protected] 34 points 1 month ago

Screwing with payments is a go-to scummy tactic these days. It's enough to make you go full commie.

[–] [email protected] 22 points 1 month ago

This sucks. I use Catbox quite heavily. I'll probably buy a sub to help support them, but I hope they're able to recoup enough to stay afloat. It's been really great having access to a non-enshittified, login-free host for video files, and I'd hate to lose it.