Fighting AI Surveillance with Scarves and Face Paint

China’s Test Bed for Surveillance Technology

Lu isn’t developing these “adversarial examples” for protesters, hackers, or spies, however. He says that the basic idea behind this research is to push the people who design neural networks to improve the algorithms in their systems.

“People in the deep learning area [are becoming] more and more concerned with the security of the networks, as the networks achieve more and more success. We cannot afford the security threat. There is no doubt that more and more people will try to attack these systems.”
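The adversarial examples Lu describes — scarves and face paint included — are physical-world analogues of a well-known digital technique: perturbing an input slightly in the direction that most increases a classifier's loss. A minimal sketch of that idea, using the fast gradient sign method on a hypothetical toy logistic-regression classifier (the weights and inputs here are illustrative, not from Lu's research):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical toy classifier: logistic regression with fixed weights.
w = np.array([2.0, -1.0])
b = 0.0

def predict(x):
    return int(sigmoid(w @ x + b) > 0.5)

def fgsm(x, y, eps):
    """Fast Gradient Sign Method: nudge x in the direction that
    increases the cross-entropy loss for the true label y."""
    p = sigmoid(w @ x + b)
    grad = (p - y) * w              # dL/dx for this model
    return x + eps * np.sign(grad)

x = np.array([0.5, 0.2])            # correctly classified as class 1
x_adv = fgsm(x, y=1, eps=0.3)       # small, targeted perturbation

print(predict(x), predict(x_adv))   # → 1 0: the tiny nudge flips the label
```

A perturbation of 0.3 per feature is enough to flip this toy model's decision; real attacks on face-recognition networks work the same way, just in a much higher-dimensional input space, which is what makes a patterned scarf viable as an attack surface.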

The idea is that by wrecking the ship, flaws will be exposed and improvements can be made. It’s a reminder that, while trials like those recently undertaken by London’s Metropolitan Police are laughably inaccurate, the technology is evolving.

If you want a hint of where it could be in the next decade, look east. While UK and U.S. police face a gamut of technical and political barriers to developing facial-recognition systems, China is another matter.

SenseTime surveillance in action.

SenseTime, a Chinese company at the heart of the country’s AI boom, was recently valued at $3 billion, a figure fueled by the firm’s image-recognition capabilities. Among SenseTime’s customers are a platter of government-related agencies, which are able to feed the company datasets of a magnitude that would make many Western AI firms drool. Speaking to Quartz, SenseTime CEO Xu Li gestured to a training database of more than 2 billion images: “If you have access to the government data, you have all the data from all Chinese people.”

In return, SenseTime is able to provide its Viper surveillance system, which the company says can handle 100,000 live video feeds simultaneously, pulling footage from CCTV, ATM cameras, and office face-scanners. Identity politics are again at the heart of these technologies. The test bed for much of this top-of-the-line surveillance over the past few years has been the western region of Xinjiang, home of the Uighur Muslim ethnic minority, which the Chinese state has blamed for a string of terrorist activities. There have been reports that China has been using its advanced surveillance to impose greater central authority and clamp down on the rights of the Uighur population.

SenseTime has emphasized that AI image recognition can be used for good—that it can be used to help find missing children. That’s a sentiment echoed in recent comments by Amazon about its Rekognition system. After the ACLU released public records detailing the technology giant’s relationship with law enforcement agencies, Amazon issued a statement arguing that Rekognition has “many useful applications in the real world,” including finding lost children at amusement parks.

No doubt there is truth in this. No doubt, also, that this isn’t the whole story. The scale and scope of image recognition is expanding, and whether the camera is in our smartphones, on our playgrounds, or on a police officer’s chest, our identities are the target. There may be improvements to stop clever scarves and glasses from spoofing the system, but once you’ve invented the ship, you can’t uninvent the shipwreck. You can’t uninvent the pirate.

“We are aware of updates,” Hyphen-Lab says. “We get them on our phones, computers, cars, and as facial recognition technology develops, we will still postulate on potential ways to subvert its intended use.”