Aug 17 - Apple Inc. on Tuesday appealed the copyright case it lost against security startup Corellium, which helps researchers examine programs like Apple's new method for detecting child abuse images (https://www.reuters.com/technology/after-criticism-apple-only-seek-abuse-images-flagged-multiple-nations-2021-08-13).
A federal judge last year rejected Apple's copyright claims against Corellium (https://www.reuters.com/business/apple-loses-copyright-claims-lawsuit-against-us-security-bug-startup-2020-12-29), which makes a simulated iPhone that researchers use to examine how the tightly restricted devices function.
Security experts are among Corellium's core customers, and the flaws they have uncovered have been reported to Apple for cash bounties and used elsewhere, including by the FBI in cracking the phone of a mass shooter in San Bernardino, California.
Apple makes its software hard to examine, and the research iPhones it offers to pre-selected experts come with a host of restrictions. The company declined to comment.
The appeal came as a surprise because Apple and Corellium had just settled other claims in the lawsuit, related to the Digital Millennium Copyright Act, avoiding a trial.
Experts said they were also surprised that Apple revived a fight against a major research tool provider just after arguing that researchers would provide a check on its controversial plan to scan customer devices.
"Enough is enough," said Corellium Chief Executive Amanda Gorton. "Apple can't pretend to hold itself accountable to the security research community while simultaneously trying to make that research illegal."
Under Apple's new plan, announced earlier this month, software will automatically check photos slated for upload from phones or computers to iCloud online storage to see whether they match digital identifiers of known child abuse images. If enough matches are found, Apple employees will review the images to make sure they are illegal, then cancel the account and refer the user to law enforcement.
"'How can I prevent abuse of my copy protection mechanisms by relying on people bypassing my copy protection mechanisms' is a pretty internally incoherent argument," tweeted David Thiel of the Stanford Internet Observatory.
Digital rights groups have objected to the plan because Apple has marketed itself as devoted to user privacy, while other companies only scan content after it is stored online or shared.
One of their main arguments has been that governments theoretically could force Apple to scan for restricted material as well, or to target a single user.
In defending the program, Apple executives said researchers could examine the list of banned images and verify what data was sent to the company, in order to keep it honest about what it sought and from whom.
One executive said that such reviews made the plan better for privacy overall than would have been possible if the scanning occurred in Apple's own storage, where the company keeps its code secret.