On Monday, Apple responded to privacy concerns regarding child abuse detection features.
Apple rolled out new safety features in iOS aimed at better protecting young users. One of those tools, however, drew immediate criticism because it scans every photo a user uploads to iCloud for evidence of child abuse.
If evidence of child abuse is found, Apple automatically reports the user to the police. The feature has raised concerns that it amounts to a breach of privacy.
The Electronic Frontier Foundation (EFF) published a post slamming Apple, claiming that the company now has a backdoor into the private lives of its users.
“Child exploitation is a serious problem, and Apple isn’t the first tech company to bend its privacy-protective stance in an attempt to combat it,” the EFF said in the post. “But that choice will come at a high price for overall user privacy.”
The EFF went further, saying that to call Apple’s plans disappointing would be an understatement.
"It’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children," it noted.
"As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger’s encryption itself and open the door to broader abuses.”
In response, Apple sent an internal letter defending the new tools:
"Keeping children safe is such an important mission. In true Apple fashion, pursuing this goal has required deep cross-functional commitment, spanning Engineering, GA, HI, Legal, Product Marketing and PR," the statement read. "What we announced today is the product of this incredible collaboration, one that delivers tools to protect children, but also maintain Apple’s deep commitment to user privacy."