Apple unveils plans to scan phones for child pornography, sexual messages to minors

Apple unveiled a sweeping new set of software tools Thursday that will scan iPhones and other devices for child pornography and sexually explicit text messages, and report to authorities users suspected of storing illegal images on their phones.

The aggressive plan to thwart child predators and pedophiles and prevent them from using Apple's services for illegal activity pitted the tech giant against civil liberties activists and appeared to contradict some of its own long-held assertions about privacy and the way the company interacts with law enforcement.

The move also raises new questions about the nature of smartphones and who really owns the computers in their pockets. The new software will scan users' devices without their knowledge or explicit consent, and could put innocent users in legal jeopardy.

In a blog post on its website Thursday, Apple said there is a one-in-a-trillion chance per year of an account being incorrectly flagged, and that each flagged instance will be manually reviewed by the company before an account is shut down and authorities are alerted. Users can appeal the decision to Apple, the blog post said.

The software uses a matching technique, in which photos stored on iPhones are scanned and compared against a database of known child pornography. Before a photo can be uploaded to iCloud, Apple's online storage service, it will be given a cryptographic "voucher" recording whether it matched that database.
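To make the matching idea concrete, here is a deliberately simplified sketch in Python. Apple's actual system is reported to use a perceptual hash, which matches visually similar images, wrapped in cryptographic protocols; the exact-byte SHA-256 comparison, the sample hash, and the function names below are all hypothetical stand-ins.

```python
import hashlib
from pathlib import Path

# Hypothetical fingerprint database of known illegal images. A real system
# would hold perceptual hashes supplied by a clearinghouse such as NCMEC,
# which match visually similar images; SHA-256 matches identical bytes only.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(photo_path: Path) -> str:
    """Stand-in for a perceptual hash: fingerprint the photo's raw bytes."""
    return hashlib.sha256(photo_path.read_bytes()).hexdigest()

def make_voucher(photo_path: Path) -> dict:
    """Produce the 'voucher' attached to a photo before upload, recording
    whether the photo matched the known-material database."""
    digest = fingerprint(photo_path)
    return {
        "photo": photo_path.name,
        "matched_known_material": digest in KNOWN_HASHES,
    }

if __name__ == "__main__":
    sample = Path("example.jpg")  # hypothetical local file
    if sample.exists():
        print(make_voucher(sample))
```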

This kind of matching system is similar to what companies like Facebook have used for years. But in those systems, photos are scanned only after they are uploaded to servers owned by the companies. In Apple's new system, photos and messages will be scanned on the user's own device, known as "client-side" scanning, a new level of surveillance that raised eyebrows among civil libertarians and privacy advocates.

In a response to the announcement, the online advocacy group the Electronic Frontier Foundation said it was concerned that the system could be abused for broader surveillance in the future.

"It's impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children," the foundation said. "As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger's encryption itself and open the door to broader abuses."

The debate about Apple's new effort began on Twitter Wednesday evening, when security experts began tweeting about the rumored announcement.

"Regardless of what Apple's long term plans are, they've sent a very clear signal," wrote Matthew Green, an associate professor of computer science at the Johns Hopkins Information Security Institute, on Twitter. "In their (very influential) opinion, it is safe to build systems that scan users' phones for prohibited content. That's the message they're sending to governments, competing services, China, you."

Apple clashed with law enforcement when the FBI obtained a court order forcing Apple to help unlock an iPhone belonging to one of the two shooters in a December 2015 attack at the San Bernardino Inland Regional Center that left more than a dozen people dead. The FBI wanted to unlock the phone so it could pursue possible leads on accomplices to the attack. But Apple refused, taking a moral stand.

"The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone's microphone or camera without your knowledge," Apple CEO Tim Cook wrote in a statement at the time.

Now, privacy experts are accusing Apple of creating another type of potential backdoor for the kind of abuse Cook outlined in the company's stand against the FBI.

The new initiative won't be limited to photos. It will also scan messages sent using Apple's iMessage service for text and photos that are inappropriate for minors. If a minor receives, for instance, a photo identified as sexually explicit, it will appear blurred, and the minor may be warned that their parents will be notified if they open it.
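As described, the flow on the minor's device is roughly: classify the incoming image, blur it, warn the child, and notify parents only if the child chooses to view it. A minimal sketch of that decision logic, assuming a hypothetical on-device classifier and with every name invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class IncomingPhoto:
    sender: str
    flagged_explicit: bool  # assumed output of an on-device image classifier

def handle_photo_for_minor(photo: IncomingPhoto,
                           parental_alerts_enabled: bool) -> list[str]:
    """Return the UI steps for a photo arriving in a minor's iMessage thread."""
    if not photo.flagged_explicit:
        return ["display_normally"]
    steps = ["display_blurred", "warn_before_viewing"]
    if parental_alerts_enabled:
        # Parents are notified only if the child chooses to view the photo.
        steps.append("notify_parents_on_view")
    return steps

print(handle_photo_for_minor(IncomingPhoto("unknown_sender", True), True))
# ['display_blurred', 'warn_before_viewing', 'notify_parents_on_view']
```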

The pushback from civil libertarians shows how privacy and security often have a complicated relationship. Apple's decision to scan photos on the user's device, rather than on its own servers, is a way of protecting user privacy: even Apple won't be able to see what is being scanned until something is flagged as illegal. The worry raised by security experts, on the other hand, is that the scanning code, which will live on every iPhone, could be exploited by malicious actors to siphon personal data from users. In that scenario, users would have lost their privacy anyway.

Even if Apple's new scanning software stops sexual abuse of minors on its own services, predators have plenty of other apps to choose from on Apple's App Store. In 2019, The Washington Post used a machine-learning algorithm to scan publicly available App Store reviews for reports of unwanted sexual behavior on chat apps used primarily by minors. It found 1,500 such reports across six apps.

Child predators often groom victims on apps like these, and then try to move the conversations to other platforms, such as Snapchat or Instagram.

In its response, the Electronic Frontier Foundation said Apple's iMessage is less secure because of the new changes. "A secure messaging system is a system where no one but the user and their intended recipients can read the messages or otherwise analyze their contents to infer what they are talking about," it wrote.
