While welcoming voluntary CSAM scanning, scientists warn that some aspects of the revised bill “still bring high risks to society without clear benefits for children.”

  • Kissaki@feddit.org
    I wonder how this will work. If they use third-party services, under the GDPR they’ll have to disclose that data may or will be shared. So at least in that case it should be obvious that end-to-end encryption is not a thing, only one encrypted channel of transfer.

    If they scan themselves, I would assume disclosure is also necessary under the GDPR, since they’d be processing personal data.

    The Council also wants to make permanent a currently temporary measure that allows companies to – voluntarily – scan their services for child sexual abuse. At present, providers of messaging services, for instance, may voluntarily check content shared on their platforms for online child sexual abuse material, and report and remove it. This is allowed thanks to an exemption from certain rules specific to the electronic communications sector. Although this exemption is due to expire on 3 April 2026, according to the Council position, it will continue to apply. [src]

    So the exemption was already active, apparently. I just hope it doesn’t invalidate the transparency obligations around systematic handling of data?

    On the basis of today’s agreement, the Council can start negotiations with the European Parliament with a view to agreeing on the final regulation. The European Parliament reached its position in November 2023.