We know where you are and we know what you are doing

Carnegie Mellon University researchers are using laser vibrometry — a method similar to one once used by the KGB for eavesdropping — to monitor the vibrations and movements of objects, enabling smart devices to be aware of human activity. Credit: Carnegie Mellon University

KGB technology lets AIs know exactly what is happening in your home.

In the 1950s, state-of-the-art spying technology allowed the Russians to eavesdrop on conversations from outside the building in which they were taking place.

The idea was simple: sounds inside a room caused the glass in the windows to vibrate, so any conversation could be overheard by detecting and amplifying these tiny vibrations.

What they did was direct a beam of light at the window and then measure the tiny perturbations in the frequency of the reflected light. The changes, when amplified, reproduced the sound from inside the room, so you could hear the conversation.
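To get a feel for how this works, here is a rough sketch in Python. It simulates a window pane vibrating in sympathy with a tone, models the phase shift the vibration imposes on a reflected laser beam, and recovers the audio by demodulating that phase. The laser wavelength, displacement amplitude and signal are all illustrative assumptions, not details of the KGB's actual apparatus:

```python
# Minimal sketch of laser-vibrometry demodulation (all values illustrative).
import numpy as np

fs = 48_000                      # sample rate, Hz
t = np.arange(0, 0.1, 1 / fs)    # 100 ms of signal
wavelength = 633e-9              # HeNe laser wavelength, m (assumption)

# "Speech": a 440 Hz tone driving ~100 nm of window-pane displacement.
audio = np.sin(2 * np.pi * 440 * t)
displacement = 100e-9 * audio    # metres

# A round-trip path change of 2*d shifts the optical phase by 4*pi*d/lambda.
phase = 4 * np.pi * displacement / wavelength

# Interferometric detector output: I/Q components of the returning beam.
i_sig = np.cos(phase)
q_sig = np.sin(phase)

# Demodulate: the unwrapped angle of the I/Q pair is the phase, which is
# proportional to the displacement, which follows the room audio.
recovered = np.unwrap(np.arctan2(q_sig, i_sig))
recovered -= recovered.mean()
recovered /= np.abs(recovered).max()

# The recovered waveform tracks the original audio almost exactly.
print("correlation:", np.corrcoef(recovered, audio)[0, 1])  # ~1.0
```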

And now, Vibrosight

The same sort of idea is being used by researchers in the Future Interfaces Group at Carnegie Mellon University for slightly less nefarious purposes. They call it Vibrosight.

By attaching reflective patches to various surfaces in a room they can measure the vibrations from those surfaces.

A movable laser beam scans the room, illuminating each reflective patch in turn. The reflected light is picked up by a sensor and, in just the same way as with the old KGB technique, vibrations from the surface produce small changes in the returning light. In this case the vibrations are not the sound of voices but the sounds of equipment being used or things moving around.

And it turns out that the vibrations measured from, say, a kitchen worktop, a fridge door, or a bench in a workshop give you a pretty good indication of what is happening at that particular location.
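As a rough illustration of how such vibration signatures could be told apart, the following sketch fingerprints simulated traces by their spectra and matches a new trace to the nearest appliance. The dominant frequencies and the nearest-centroid matcher are made-up stand-ins, not Vibrosight's actual pipeline:

```python
# Illustrative spectral-fingerprint classifier (not the real Vibrosight).
import numpy as np

rng = np.random.default_rng(0)
fs = 2_000  # vibration sample rate, Hz (assumption)

def vibration(dominant_hz, n=2_000):
    """Fake a vibration trace: a dominant tone buried in broadband noise."""
    t = np.arange(n) / fs
    return np.sin(2 * np.pi * dominant_hz * t) + 0.5 * rng.standard_normal(n)

def fingerprint(signal):
    """Normalised log-magnitude spectrum as a feature vector."""
    spectrum = np.log1p(np.abs(np.fft.rfft(signal)))
    return spectrum / np.linalg.norm(spectrum)

# Hypothetical training data: a few traces per appliance.
appliances = {"fridge": 120, "blender": 350, "coffee grinder": 700}
centroids = {
    name: np.mean([fingerprint(vibration(hz)) for _ in range(10)], axis=0)
    for name, hz in appliances.items()
}

# Classify a new trace from the "blender" tag by nearest centroid.
unknown = fingerprint(vibration(350))
best = max(centroids, key=lambda name: unknown @ centroids[name])
print("detected:", best)  # -> blender
```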

Carnegie Mellon University researchers have developed a method that enables smart devices to figure out where they are and what people are doing around them by analyzing sounds from their microphones. Credit: Carnegie Mellon University

Ubicoustics uses microphones

Another team in the same group is working on Ubicoustics. This is work in the same area but using a more traditional method of capturing sound — microphones.

Many of our devices come equipped with microphones: our smartphones, laptops, watches and, of course, our new digital assistants like Alexa.

This second team has built a library of sounds from movie sound effects. From these they can simulate phones ringing, someone knocking on the door, a cough — lots of different sounds — and mix them with various background noises to produce a corpus of sounds that can be used to train an AI.
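The mixing step might look something like the sketch below, which combines a clean sound effect with background noise at random signal-to-noise ratios to multiply one clip into many labelled training examples. The clip lengths, sample rate and SNR range are illustrative assumptions, not details of the Ubicoustics corpus:

```python
# Sketch of corpus building by noise mixing (parameters are assumptions).
import numpy as np

rng = np.random.default_rng(1)

def mix_at_snr(effect, background, snr_db):
    """Scale the background so the mix has the requested SNR, then sum."""
    sig_power = np.mean(effect**2)
    noise_power = np.mean(background**2)
    scale = np.sqrt(sig_power / (noise_power * 10 ** (snr_db / 10)))
    return effect + scale * background

# Stand-ins for a "door knock" effect and kitchen background noise;
# a real pipeline would load these from a sound-effect library.
effect = rng.standard_normal(16_000)       # 1 s at 16 kHz (assumption)
background = rng.standard_normal(16_000)

# Generate augmented training examples across a range of noise levels.
corpus = [
    (mix_at_snr(effect, background, snr_db), "door_knock")
    for snr_db in rng.uniform(0, 20, size=100)
]
print(len(corpus), "labelled training clips")
```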

Then the sounds picked up by our various smart devices can be recognised by the trained AI.

The AI knows what you are up to

When you train an AI system on data from each of these sources, you can get a pretty accurate picture of what is happening in a room, or indeed a whole house.

In a kitchen they can tell exactly what you are up to — chopping vegetables, using the blender, grinding coffee and so on.

This all gives context to the commands you give a home assistant, enabling its responses to be more accurate.

And it’s not much of a stretch to imagine it tracking all your movements around the house. When the coffee machine finishes brewing in the kitchen and, a minute later, someone is detected sitting on the sofa in the living room, it’s not tricky to work out what’s going on.
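A toy version of that inference could be as simple as a temporal rule over a stream of timestamped detections. The event names and the two-minute window below are purely illustrative:

```python
# Speculative sketch: link activity detections across rooms in time.
from datetime import datetime, timedelta

events = [
    (datetime(2024, 1, 1, 8, 0, 0), "kitchen", "coffee machine finished"),
    (datetime(2024, 1, 1, 8, 1, 10), "living room", "someone sat on sofa"),
]

# Scan consecutive event pairs for a kitchen-to-sofa pattern.
for (t1, _, e1), (t2, _, e2) in zip(events, events[1:]):
    if (e1 == "coffee machine finished"
            and e2 == "someone sat on sofa"
            and t2 - t1 < timedelta(minutes=2)):
        print("inferred: drinking fresh coffee on the sofa")
```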

So, Big Brother might not be watching you but he could be listening.