Apple is reviving its legal battle with cybersecurity startup Corellium. The Cupertino tech giant has appealed a copyright case it lost to Corellium over the firm's virtual iOS devices in December 2020. These virtual iOS devices allow security researchers to detect vulnerabilities and report them to Apple.
The surprising appeal comes after the company settled the copyright lawsuit with the security firm earlier this month for an undisclosed amount, and shortly after Corellium announced its new Open Security Initiative inviting independent researchers to test Apple's security and privacy claims around the upcoming CSAM detection system. The move casts doubt on Apple's stated willingness to let researchers audit those claims.
Corellium accuses Apple of running away from accountability by barring researchers from testing its security claims
Reuters reports that the Cupertino tech giant filed the appeal on Tuesday, August 17, 2021, a move that has led Corellium Chief Executive Amanda Gorton to question the company's claims of transparency and accountability. She said:
“Enough is enough. Apple can’t pretend to hold itself accountable to the security research community while simultaneously trying to make that research illegal.”
As a self-proclaimed champion of users' privacy, Apple faces scorching criticism for its upcoming CSAM detection and Communication Safety in Messages features, which would introduce new systems to scan users' iCloud Photos for known CSAM and analyze images in children's messages. To reassure critics, the company's software engineering chief Craig Federighi said that the new systems are not a backdoor and that security researchers can audit and verify the systems' privacy protections.
"I really don't understand that characterization. Well, who knows what's being scanned for? In our case, the database is shipped on device. People can see, and it's a single image across all countries. We ship the same software in China with the same database as we ship in America, as we ship in Europe.

If someone were to come to Apple, Apple would say no, but let's say you aren't confident. You don't want to just rely on Apple saying no. You want to be sure that Apple couldn't get away with it if we said yes. Well, that was the bar we set for ourselves in releasing this kind of system.

There are multiple levels of auditability, the most privacy-protecting way we can imagine, and the most verifiable way possible. So we're making sure that you don't have to trust any one entity or even any one country, as far as how these images are and what images are part of the process."
However, by moving to bar a security firm from doing exactly that, the company is sending the wrong message. According to the report, experts are surprised by Apple's decision:
Experts said they were also surprised that Apple revived a fight against a major research tool provider just after arguing that researchers would provide a check on its controversial plan to scan customer devices.
"'We'll prevent abuse of these child safety mechanisms by relying on people bypassing our copy protection mechanisms' is a pretty internally incoherent argument," tweeted David Thiel of the Stanford Internet Observatory.