The 1:1 matching and the porn detection were separate capabilities.
Porn detection is called Communication Safety, and it only warns the user. If the device is set up in Screen Time as a child's device, someone has to enter the parent's Screen Time passcode to bypass the warning. That's it. It's entirely local to the device: the parent isn't notified or shown the image, and Apple never receives the image. Because it uses an ML model, it can produce false positives.
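For a concrete sense of how local this is, here's a minimal sketch using the SensitiveContentAnalysis framework Apple later shipped in iOS 17, which exposes the same on-device classifier. Checking an image is a purely local call; nothing about it involves a server, a parent notification, or Apple. (The framework requires the com.apple.developer.sensitivecontentanalysis.client entitlement, and the analyzer only runs when the user, or a parent via Screen Time, has enabled the feature.)

```swift
import Foundation
import SensitiveContentAnalysis

// Entirely on-device: the image never leaves the process, and the
// result is just a local Bool. Because it's an ML classifier, it can
// be wrong in either direction — hence a warning, not a hard block.
func imageLooksSensitive(at url: URL) async throws -> Bool {
    let analyzer = SCSensitivityAnalyzer()
    // analysisPolicy is .disabled unless the user (or a parent, via
    // Screen Time) has turned the feature on.
    guard analyzer.analysisPolicy != .disabled else { return false }
    let analysis = try await analyzer.analyzeImage(at: url)
    return analysis.isSensitive
}
```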
CSAM detection was exact 1:1 matching against known images, using a privacy-preserving hashing system (NeuralHash plus a private set intersection protocol). It prevented users from uploading known CSAM to iCloud, and that's it. The device never learned whether an image matched, and Apple couldn't tell whether any individual image matched, or learn the hashes of the images being evaluated, until an account crossed a preset threshold of matches (Apple said around 30).
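Stripped of the cryptography, the matching concept is just membership in a fixed set of hashes. This sketch uses SHA-256 as a stand-in; the real design used NeuralHash (a perceptual hash, so re-encoded copies of an image still match) wrapped in a private set intersection protocol, so no lookup like this ever happened in the clear on either side. The function and parameter names here are illustrative.

```swift
import Foundation
import CryptoKit

// Illustrative only: the actual system never exposed a plain set of
// hashes to the device or plain match results to Apple.
func matchesKnownImage(_ imageData: Data, knownHashes: Set<Data>) -> Bool {
    // SHA-256 stands in for a perceptual hash; an exact cryptographic
    // hash would miss recompressed or resized copies of an image.
    let digest = Data(SHA256.hash(data: imageData))
    return knownHashes.contains(digest)
}
```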
Many people misunderstood and conflated the two capabilities, and often claimed, without evidence, that they did things they were designed never to do. Apple eventually abandoned the CSAM detection capability.
Passkeys are a replacement for passwords. Passwords don't solve the problem of a lost password, and passkeys don't solve the problem of a lost passkey. How a site deals with lost credentials is up to the site; account recovery doesn't have to mean falling back to password + 2FA.
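As a sketch of what "replacement for passwords" means in practice on Apple platforms, registration goes through AuthenticationServices: the server sends a challenge, the device mints a key pair, and only the public key goes back. The relying-party identifier and the delegate wiring below are assumptions for illustration.

```swift
import AuthenticationServices

// A minimal sketch, assuming "example.com" as the relying party and a
// challenge already fetched from its server. A real app implements
// ASAuthorizationControllerDelegate to receive the new credential and
// posts its public key to the server; the private key stays in the
// user's keychain and is what a "lost passkey" actually loses.
func beginPasskeyRegistration(
    challenge: Data,   // must be generated server-side per attempt
    userName: String,
    userID: Data,
    delegate: ASAuthorizationControllerDelegate & ASAuthorizationControllerPresentationContextProviding
) {
    let provider = ASAuthorizationPlatformPublicKeyCredentialProvider(
        relyingPartyIdentifier: "example.com")
    let request = provider.createCredentialRegistrationRequest(
        challenge: challenge, name: userName, userID: userID)
    let controller = ASAuthorizationController(authorizationRequests: [request])
    controller.delegate = delegate
    controller.presentationContextProvider = delegate
    controller.performRequests()
}
```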