Images posted within the last 48 hours will appear as broken. This is expected and intended.
Yesterday, on 2023-08-27, a community on the lemmy.world instance received multiple posts containing CSAM (child sexual abuse material, more commonly known as CP), which spread throughout the federation. Through federation, we also ended up involuntarily hosting copies of said content.
Due to the severely limited nature of the Lemmy moderation tools, removing or purging the offending posts from the admin UI wasn't sufficient: it did not actually remove the images from our server. Because of this, a nuclear option was required. I have deleted every image saved by our server during the last 48 hours.
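For transparency, the purge amounts to a time-based deletion over the media store. A minimal sketch of the approach, shown here against a scratch directory standing in for the pict-rs media directory (the actual path on our server is not shown; pict-rs stores uploads as flat files, so a modification-time filter is enough):

```shell
# Scratch directory standing in for the pict-rs media store (assumption):
PICTRS_DIR=$(mktemp -d)
touch -d '3 days ago' "$PICTRS_DIR/old.webp"   # outside the 48 h window, kept
touch "$PICTRS_DIR/new.webp"                   # within 48 h, purged

# List, then delete, every file modified in the last 48 hours (-mtime -2).
# Run the -print form first as a dry run before committing to -delete.
find "$PICTRS_DIR" -type f -mtime -2 -print
find "$PICTRS_DIR" -type f -mtime -2 -delete
```

Because pict-rs only serves files it still has on disk, anything removed this way simply renders as a broken image until re-uploaded.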
Unfortunately this also includes a post on !pcm@lemmy.basedcount.com, as well as multiple posts on !returntomonke@lemmy.basedcount.com. Authors of the affected posts can fix them by re-uploading their images, without needing to recreate the posts.
We are sorry for the inconvenience, but hosting CSAM is highly illegal, and we simply can't take any risks on this front.
I am currently discussing with the other admins whether further measures are necessary to prevent this from happening in the future. We'll keep you posted if we have any updates.
EDIT [2023-08-28 10:00 UTC]:
The attack is still ongoing. I have now blocked the community and additionally deleted the images uploaded in the last 15 minutes.