APPLE’S FACEID COULD BE A POWERFUL TOOL FOR MASS SPYING


THIS TUESDAY APPLE unveiled a new line of phones to much fanfare, but one feature immediately fell under scrutiny: FaceID, a tool that would use facial recognition to identify individuals and unlock their phones.

Unsurprisingly, this raised major anxiety about consumer privacy: Consumers are already questioning whether FaceID could be spoofed. And police could also more easily unlock phones without consent simply by holding an individual’s phone up to his or her face.
But FaceID should create fear about another form of government surveillance: mass scans to identify individuals based on face profiles. Law enforcement is rapidly increasing its use of facial recognition; one in two American adults is already enrolled in a law enforcement facial recognition network, and at least one in four police departments has the capacity to run face recognition searches. Still, until now, co-opting consumer platforms hasn’t been an option. While Facebook has a powerful facial recognition system, it doesn’t maintain the operating systems that control the cameras on phones, tablets, and laptops that stare at us every day. Apple’s new system changes that. For the first time, a company will have a single, unified facial recognition system built into the world's most popular devices—the hardware necessary to scan and identify faces throughout the world.
Apple doesn't currently have access to the faceprint data that it stores on iPhones. But if the government forced Apple to change its operating system—a tactic the FBI tried once already in the case of the locked phone of San Bernardino killer Syed Rizwan Farook—it could gain that access. And that could theoretically make Apple an irresistible target for a new type of mass surveillance order. The government could issue an order to Apple with a set of targets and instructions to scan iPhones, iPads, and Macs to search for specific targets based on FaceID, and then provide the government with those targets’ locations based on the GPS data of devices that register a match. Apple has a good record of fighting for user privacy, but there's only so much the company could do if its objections to an order were turned down by the courts. (On Wednesday Sen. Al Franken (D-Minnesota) released a letter to Apple CEO Tim Cook, asking how the company will handle the technology's security and privacy implications.)1
Over the last decade the government has increasingly embraced this type of mass scan method. Edward Snowden's disclosures revealed the existence of Upstream, a program under FISA Section 702 (set to expire in just a few months). With Upstream, the NSA scans all internet communications going into and out of the United States for surveillance targets' emails, as well as IP addresses and what the agency has called cybersignatures. And last year Reuters revealed that Yahoo, in compliance with a government order, built custom software to scan hundreds of millions of email accounts for content that contained a digital signature used by surveillance targets.
To many, these mass scans are unconstitutional and unlawful, but that has not stopped the government from pursuing them. Nor have those concerns prevented the secretive FISA Court from approving the government’s requests, all too often with the public totally unaware that mass scans continue to sift through millions of Americans’ private communications.
Until now text has been the focus of mass scan surveillance, but Apple and FaceID could change that. By generating millions of face prints while simultaneously controlling the cameras that can scan and identify them, Apple might soon face a government order to turn its new unlocking system into the killer app for mass surveillance.
What should Apple—and the rest of us—do to respond to this risk? First, Apple should take every step possible to insulate itself from an overly broad government order to conduct mass scans for faces. It's important that Apple hold to its commitment that face prints developed through FaceID are stored only locally on devices and are fully encrypted with a key that even Apple doesn't possess.
However, the unresolved fight between Apple and the FBI over encryption makes this an unreliable remedy. As in that case, the government might, in theory, order Apple to surreptitiously modify its operating system, this time so that it generates duplicates of users' FaceID face prints and routes them to the company and the government. Another concern: If iPhone users become accustomed to holding their phone up for face scans to unlock their phone, those consumers could be more vulnerable to other facial-recognition systems with fewer security and privacy protections.
Therefore, Apple should also update its Transparency Reports to include data on whether it receives orders to turn over facial recognition profiles, or to conduct facial recognition scans, leaving a so-called warrant canary to serve as an alarm bell if it receives a troubling order related to FaceID in the future.
Finally, and more broadly, the public should demand that Congress rein in the government’s ever-growing affinity for mass scan surveillance. Limiting or outlawing the controversial Upstream program when the authority it’s based on expires this December would be an excellent start, but facial recognition scans may soon be as big a component of mass surveillance, and the public needs to be ready.
1 Clarification appended 9/15/17, 2:30 pm EDT: This piece has been edited to clarify Apple’s plan to store FaceID information locally and how a surveillance order might work.
WIRED Opinion publishes pieces written by outside contributors and represents a wide range of viewpoints. Read more opinions here.