How Tech Companies Create Your Digital Twin

Two decades ago, Kevin D. Haggerty and Richard V. Ericson introduced a concept they called “the surveillant assemblage.” Before Facebook and Twitter, before the iPhone or YouTube, the Apple Watch or the Fitbit, Haggerty and Ericson saw the potential for a new frontier in surveillance: data.

“We are witnessing a convergence of what were once discrete surveillance systems to the point that we can now speak of an emerging ‘surveillant assemblage’,” they wrote. “This assemblage operates by abstracting human bodies from their territorial settings and separating them into a series of discrete flows. These flows are then reassembled into distinct ‘data doubles’ which can be scrutinized and targeted for intervention.”

There are now, as you read this, two of you. There is the real-life you, sitting at your computer or scrolling on a mobile device, and then there is the other you — your data double — an amalgam assembled entirely by algorithms and computer programs. These programs analyze the trail of information left behind as the real you shops online, chats with friends, or posts fitness information on social media.

This information is what feeds the persuasion architectures — the myriad surveillance technology apparatuses — that surround us, whether we’re actively engaged online or just walking around with our phones in our pockets. Endless and unseen, these connections are constantly being created: guesses as to what we might like to do or see next, based on the digital persona we are steadily building, our data double.
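To make that reassembly concrete, here is a minimal sketch of how discrete flows of events might be folded into one scrutinizable profile. Everything in it (the event sources, the signals, the inferred tag) is invented for illustration and not taken from any real company’s system.

```python
from collections import defaultdict

# Hypothetical event stream: discrete "flows" abstracted from one person's
# day online. Every source name and signal here is made up for illustration.
events = [
    ("shopping", "running shoes"),
    ("social", "posted a 5K race time"),
    ("search", "knee pain stretches"),
    ("location", "park visit, 7 a.m."),
]

def build_data_double(events):
    """Reassemble discrete flows into a single profile: the data double."""
    double = defaultdict(list)
    for source, signal in events:
        double[source].append(signal)
    # A real system would score and categorize at scale; this single rule
    # just shows how a guess gets attached to the double as if it were fact.
    all_signals = [s for signals in double.values() for s in signals]
    if any("running" in s or "race" in s for s in all_signals):
        double["inferred"].append("runner: show fitness ads")
    return dict(double)

print(build_data_double(events))
```

The point of the sketch is the one-way flow: the double is built from fragments of behavior, and the inferences attached to it never pass back through the person they describe.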

In the most benign sense, this might mean we’ll see more ads for running shoes or fitness apps. But the algorithms’ connections don’t operate only within the parameters of a single type of interest. They go on and on, searching for ever more data points to link with our own.
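A toy example of that drift, under the same invented assumptions as above: a co-occurrence table, of the kind recommendation systems often learn from other people’s profiles, lets one observed interest pull in categories the person never signaled. The table below is fabricated for the example.

```python
# Fabricated co-occurrence links: interests that other profiles have
# paired with each key. No real dataset is implied.
co_occurrence = {
    "running shoes": ["fitness apps", "heart-rate monitors", "insurance offers"],
    "fitness apps": ["sleep trackers", "diet plans"],
}

def expand_interests(seed, hops=2):
    """Follow co-occurrence links outward from a single observed interest."""
    frontier, seen = {seed}, {seed}
    for _ in range(hops):
        frontier = {n for i in frontier for n in co_occurrence.get(i, [])} - seen
        seen |= frontier
    return sorted(seen - {seed})

print(expand_interests("running shoes"))
# One purchase, two hops: the double now "wants" sleep trackers, diet
# plans, and insurance offers its human never asked about.
```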

As Zeynep Tufekci put it in a TED talk last year, the problem is not so much that people might, as a function of the algorithms, see advertising they don’t want. The problem, Tufekci said, is that “we no longer really understand how these complex algorithms work.”

“We don’t understand how [the algorithms] are doing this categorization,” she said. “It’s like we’re not programming anymore, we’re growing intelligence that we don’t truly understand.”

We may yet come to a point where we are unable to understand something else: life beyond what the algorithm decides we want.

The more faith we put in the devices that, like little pocket-sized, wrist-worn oracles, reveal such mesmerizingly accurate details about our bodies and our lives, the more we will come to trust them as sources of ultimate truth. We will live, and in some cases already do, inside the reality they create for us. Along the way, experiences outside that device-driven reality will, to paraphrase Jaron Lanier, become as opaque as the algorithms that drive those inside it.

When the information we see — the people and news we are exposed to — is decided by our data double, we begin to lose a part of ourselves. Something else happens, too. We begin to lose track of which self is which. Data doubles seem accurate — as surveillance critic David Lyon once put it, they can appear as “more real to the surveillance system than the bodies and daily lives from which the data have been drawn” — but they’re ultimately out of our control.

Instead, power resides with the algorithms that build the connections that create our networked shadow. What happens when they’re wrong? The results can be frustrating, or even devastating. Algorithmic faults “have seen voters expunged from electoral rolls without notice, small businesses labeled as ineligible for government contracts, and individuals mistakenly identified as ‘deadbeat’ parents,” Wired summarized back in 2014. And even when the algorithms are accurate, recognizing us perfectly, we might wish they weren’t.

But there’s no turning back. Each of us is now a split person — one human, one digital. As we become more and more reliant on the devices in our pockets or on our wrists, we — not to mention the companies and governments that rely on our accumulated data to determine our identity — may become less and less certain of which version of us is really us.