This post was submitted on 24 May 2024
9 points (80.0% liked)

Pulse of Truth

1331 readers

Cyber Security news and links to cyber security stories that could make you go hmmm. The content is exactly as it is consumed through RSS feeds and won't be edited (except for occasional encoding errors).

This community is automagically fed by an instance of Dittybopper.

founded 2 years ago

Security researchers reverse-engineered Apple's recent iOS 17.5.1 update and found that the recent bug that resurfaced images deleted months or even years ago was caused by iOS itself, not by an issue with iCloud. [...]

top 8 comments
[email protected] 15 points 1 year ago

This has nothing to do with the Files app, nor does it have anything to do with re-indexing of the Photos library. This has to do with fighting CSAM. Apple has started (in this or a previous update) to scan your device (including deleted files) for anything containing nudity (search for "brassiere") and to add it to your photo library in a way that keeps it hidden. That way, anything the models detect as nudity is stored in your iCloud database permanently. Apple is doing this because it allows them to screen for unknown CSAM material. Currently the system can only recognize known fingerprints, but doing this allows them (and the other parties that have access to your iCloud data) to analyze unknown media.
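For context, the "known fingerprints" matching the commenter refers to is typically perceptual hashing: each image is reduced to a compact hash, and a photo counts as a hit only if its hash lies within a small bit-distance of an entry in a blocklist of known material. A minimal sketch of that idea, with hypothetical names, and a simple Hamming-distance threshold standing in for real systems such as PhotoDNA or Apple's shelved NeuralHash:

```swift
import Foundation

// Sketch of fingerprint-based matching, i.e. the "known hashes only"
// scanning described above. All names here are hypothetical; real
// systems use perceptual hashes that tolerate resizing and
// re-encoding, not exact byte hashes.

typealias PerceptualHash = UInt64  // e.g. a 64-bit dHash/pHash

// Hamming distance: number of differing bits between two hashes.
func hammingDistance(_ a: PerceptualHash, _ b: PerceptualHash) -> Int {
    (a ^ b).nonzeroBitCount
}

struct FingerprintDatabase {
    let knownHashes: [PerceptualHash]  // blocklist of known material
    let threshold: Int                 // max bit-distance counted as a match

    // A photo matches only if it is near a *known* fingerprint.
    // Novel imagery produces no hit, which is exactly the limitation
    // the comment points out.
    func matches(_ photoHash: PerceptualHash) -> Bool {
        knownHashes.contains { hammingDistance($0, photoHash) <= threshold }
    }
}
```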

The bug mentioned here accidentally made those assets visible to the user. The update apparently changed the assets in the library in a way that removed the invisibility flag, hence people noticing old nudes in their library that they cannot delete.
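PhotoKit does expose a per-asset hidden flag, and fetches skip hidden assets unless the caller opts in, so the mechanism the commenter describes is at least expressible in the public API. A minimal sketch (assuming photo-library read authorization; whether the 17.5.1 bug actually toggled this flag is the commenter's theory, not something the article confirms):

```swift
import Photos

// Sketch: PhotoKit carries a per-asset hidden flag, and normal
// fetches exclude hidden assets unless you opt in via fetch options.
func fetchHiddenAssets() -> [PHAsset] {
    let options = PHFetchOptions()
    options.includeHiddenAssets = true  // hidden assets are skipped by default

    var hidden: [PHAsset] = []
    let result = PHAsset.fetchAssets(with: .image, options: options)
    result.enumerateObjects { asset, _, _ in
        if asset.isHidden { hidden.append(asset) }
    }
    return hidden
}
```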

...

And speaking of deleting things: things are never really deleted. The iPhone keeps a record of deleted messages and media inside the KnowledgeC database, which is often used for forensic purposes. Apple is migrating this to the Biome database, which has the benefit of being synchronized to iCloud. Among other things, it is used to feed Siri with information. Anything you type on your devices, and fingerprints of anything you view, are sent to Apple's servers and saved. Spooky, if you ask me. But the only way we can have useful digital assistants is if they have access to everything; that's just how it works.
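The KnowledgeC claim does match public forensic write-ups: knowledgeC.db is a Core Data SQLite store that tools read offline. A minimal sketch of such a query, with the table and column names (ZOBJECT, ZSTREAMNAME, ZVALUESTRING, ZSTARTDATE) and the "/app/usage" stream taken from those write-ups rather than from the article:

```swift
import Foundation
import SQLite3

// Sketch of how forensic tools typically read knowledgeC.db offline.
// Core Data stores dates as seconds since 2001-01-01 (Apple epoch),
// hence the 978307200 offset to convert to the Unix epoch.
func dumpAppUsage(dbPath: String) {
    var db: OpaquePointer?
    guard sqlite3_open_v2(dbPath, &db, SQLITE_OPEN_READONLY, nil) == SQLITE_OK else { return }
    defer { sqlite3_close(db) }

    let sql = """
    SELECT ZVALUESTRING,
           DATETIME(ZSTARTDATE + 978307200, 'unixepoch') AS start
    FROM ZOBJECT
    WHERE ZSTREAMNAME = '/app/usage'
    ORDER BY ZSTARTDATE DESC
    LIMIT 20;
    """

    var stmt: OpaquePointer?
    guard sqlite3_prepare_v2(db, sql, -1, &stmt, nil) == SQLITE_OK else { return }
    defer { sqlite3_finalize(stmt) }

    // Each row: app bundle identifier plus when the usage interval began.
    while sqlite3_step(stmt) == SQLITE_ROW {
        let bundleID = sqlite3_column_text(stmt, 0).map { String(cString: $0) } ?? "?"
        let start    = sqlite3_column_text(stmt, 1).map { String(cString: $0) } ?? "?"
        print("\(start)  \(bundleID)")
    }
}
```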

Nudes are meant to persist on iPhone. You're just not meant to notice.

[email protected] 4 points 1 year ago

I don’t see these quotes in the article

[email protected] 1 point 1 year ago

See comment section

[email protected] 1 point 1 year ago

It's not something the user was supposed to see, and it is not something to be concerned about, because it is done for protection.

[email protected] 9 points 1 year ago

This goes way beyond fingerprinting for CSAM detection. A hidden stash of nudes on the local device is now a target for hackers.

[email protected] 4 points 1 year ago

I don’t see these quotes in the article

[email protected] 1 point 1 year ago

And remember: this is the phone that screams about privacy.

[email protected] 2 points 1 year ago

Why is it so hard to just implement a simple delayed Delete function?
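For what it's worth, iOS already implements this pattern as the "Recently Deleted" album: a soft delete stamps the asset, and a periodic purge removes it after 30 days. A minimal sketch of the pattern, with hypothetical types:

```swift
import Foundation

// Sketch of a "delayed delete": soft delete marks the item, and a
// scheduled purge removes anything past the retention window
// (Photos' Recently Deleted uses 30 days). All types are hypothetical.
struct Item {
    let id: UUID
    var deletedAt: Date?   // nil = live; non-nil = pending purge
}

struct Library {
    var items: [Item] = []
    let retention: TimeInterval = 30 * 24 * 60 * 60  // 30 days

    // Step 1: mark, don't erase, so the user can still recover it.
    mutating func softDelete(_ id: UUID, now: Date = Date()) {
        guard let i = items.firstIndex(where: { $0.id == id }) else { return }
        items[i].deletedAt = now
    }

    // Step 2: on a schedule, irreversibly drop anything past the window.
    mutating func purgeExpired(now: Date = Date()) {
        items.removeAll { item in
            guard let d = item.deletedAt else { return false }
            return now.timeIntervalSince(d) >= retention
        }
    }
}
```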