While welcoming voluntary CSAM scanning, scientists warn that some aspects of the revised bill “still bring high risks to society without clear benefits for children.”
Maybe you run a messenger within your company. Only employees get access, so there is no risk of minors being involved. Even if chats are private, messages are still associated with accounts, so if one of the two communicating parties reports an issue or a crime, you can act on it. The threat model is low here, and beyond being able to handle reported situations, you don’t need to do anything.
Lemmy is free for anyone to sign up to, and it has a messaging system. So you will have to think about sign-up guardrails, requirements, and verification, and assess the risks that follow from them. Then you have to think about reporting and moderation, and how you will handle them. What kinds of situations and risks are involved? What does due diligence look like in terms of preparation? In terms of monitoring and responding?
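To make “being prepared for reports” concrete, here is a minimal sketch of a report record and a triage queue. Everything in it is hypothetical: the field names, the severity levels, and the rule that illegal-content reports jump the line are assumptions for illustration, not Lemmy’s actual moderation model.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum

class Severity(Enum):
    """Hypothetical severity levels for user reports."""
    SPAM = 1
    HARASSMENT = 2
    ILLEGAL_CONTENT = 3  # escalate immediately, preserve evidence

@dataclass
class Report:
    """One user report against a message; identifiers are placeholders."""
    reporter_id: str
    reported_message_id: str
    reason: Severity
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    handled: bool = False

def triage(reports: list[Report]) -> list[Report]:
    """Unhandled reports, most severe first, oldest first within a severity."""
    return sorted(
        (r for r in reports if not r.handled),
        key=lambda r: (-r.reason.value, r.created_at),
    )
```

The point of the sketch is only that “due diligence” has a shape: reports are recorded, nothing gets silently dropped, and the worst category is seen first.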
When weighing privacy against risk, you may conclude that client-side scanning of images is not warranted. Or you may deem it worthwhile or necessary. Or you may not think much about it at all and do it anyway, not because you care, but to cover your bases legally or publicly. That’s the “voluntary” part.
You can use it, or you can decide not to, as long as you have assessed the risks and reasonably prepared for them.
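For a rough idea of what such scanning involves mechanically, here is a toy sketch that checks uploads against a list of hashes of known material. The hash list, its format, and the use of SHA-256 are all assumptions for the sketch; real deployments receive lists from clearinghouses and typically use perceptual hashing (PhotoDNA, PDQ) rather than the exact matching shown here.

```python
import hashlib
from pathlib import Path

# Hypothetical list of hashes of known material, as a provider might
# receive from a clearinghouse. Entries and format are placeholders.
KNOWN_HASHES: set[str] = set()

def sha256_of(path: Path) -> str:
    """Hash a file in chunks so large uploads aren't loaded fully into memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def scan_upload(path: Path) -> bool:
    """Return True if the upload matches the known-material hash list."""
    return sha256_of(path) in KNOWN_HASHES
```

Note the limitation: exact hashing is defeated by changing a single pixel, which is why production systems use perceptual hashes that also match near-duplicates, and why the privacy trade-off is harder than this toy suggests.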