EU backs down on CSAM scanning but Apple isn’t off the hook yet

Apple has had quite the rollercoaster ride over plans to scan devices for the presence of child sexual abuse materials (CSAM).

After announcing and then withdrawing its own plans for CSAM scanning, it appeared that it might be legally required to do it anyway. The EU has now backed down on this, but that doesn’t necessarily let Apple off the hook …

The CSAM rollercoaster

Way back in 2021, Apple announced plans to carry out CSAM scanning on devices in what it intended to be a privacy-respecting way. However, experts quickly pointed out four potential flaws in Apple’s proposed approach.

The company responded by saying that it was going to take some time to rethink its plans, and then things went very quiet.

In 2022, Apple continued to reject the arguments against its plans, but said that it had decided to abandon them anyway. By 2023, the company had acknowledged that the program had been dropped. Then in 2024, it completed its U-turn by putting forward the very arguments it had originally rejected.

Proposed EU law

In parallel to Apple’s voluntary plans, the EU was progressing legislation that would require tech giants to scan for CSAM. This law could have required the iPhone maker to implement either its original plans or the type of cloud storage scanning carried out by most other providers.

At one point, both Europe and Australia were threatening to require tech companies to break end-to-end encryption in order to scan messages. That threat was removed last year, but the EU was continuing to work on legislation that would have required scanning of data stored in the cloud and in apps.

EU backs down, but not totally

In the latest development, Euractiv reports that the EU has accepted its plans went too far and has diluted the proposed legislation.

After years of back and forth between subsequent Council presidencies, EU countries on Wednesday finally settled on a legal text that removes mandatory detection orders, opting instead to strengthen requirements on platforms to adopt mitigation measures.

However, concerns remain that a vaguely worded legal requirement to mitigate risks could still result in companies being required to scan messages in order to comply. While such measures have so far been defeated each time they were proposed, it does seem possible that a compromise would require Apple to scan iCloud data for the presence of CSAM.

A compromise will need to be found between the EU co-legislators in order for the law to be adopted, a process which could still take many months.

Individual European countries will also be free to enact their own tougher legislation if desired. This one isn’t over yet.

Photo: Dan Gold/Unsplash