In 2008, Australian show A Current Affair broadcast an episode that included a brief hypnotherapy session. The segment was called Think Slim and the idea was that it would help viewers lose weight. This was found to be in breach of the Commercial Television Industry Code of Practice which specifically forbids broadcasting shows “designed to induce a hypnotic state in viewers”.
Take a look at this video of an augmented-reality physics app that Bruce Sterling linked to on Beyond the Beyond. You sketch objects in 2d and then the app figures out what they should look like and renders them in 3d. This results in simulated 3d objects that interact with one another and gravity. It’s all very cool, but pay special attention to how it’s controlled.
Did you see that? They use a scrap of paper marked with play or pause, which they slide into the frame when they want the simulation to start or stop. This strikes me as a big deal. It transforms the electronic eye from a data gathering device into something that can receive instructions. A control device.
When this thing is running, images become executable. You can send the computer commands visually. You can hack it with pictures.
Computers you can hypnotize.
The thing that I find so appealing about retinal scanners is that it’s a technological re-imagining of the salt-of-the-earth gut-check folk wisdom of the need to look someone in the eyes. The machine peers into the depths of your soul and decides if you are who you say you are and whether you should be allowed in.
Unless, of course, you are a guard rendered unconscious by the super-agent and dragged up to the scanner. Or you are a super-agent in possession of a scan of someone’s eye.
One way or another, the door gets opened.
In 1997, 685 Japanese children were taken to hospitals because of seizures induced by an episode of Pokémon. The seizures were caused by a screen-filling pattern of flashing red and blue lights. When the news media picked up the story, some stations showed clips, which in turn caused more seizures.
The full episode was never aired again.
The G60 grenade doesn’t kill, but it does produce a flash bright enough to blind targets and a bang loud enough to deafen, stun, and even dizzy them. They are also known as flashbangs. Some versions of the grenade use multiple detonations to further disorient opponents.
The advent of 2d bar codes promises an increasingly machine-readable world. (It was meant to be RFID, but for my money, in an environment where every mobile device already has a camera, visual codes seem likely to win out.) Mix that with various augmented reality applications and the inconvenience of hitting “accept” all the time, and you have a powerful new vector for viruses.
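The danger in skipping the “accept” dialog can be sketched in a few lines. This is a toy illustration with hypothetical handler names and an invented allowlist, not any real app’s code; the point is only that a decoded image is untrusted input like any other.

```python
from urllib.parse import urlparse

TRUSTED_HOSTS = {"example.com"}  # hypothetical allowlist for the sketch

def naive_handler(payload: str) -> str:
    # What "never hit accept" amounts to: whatever the camera decodes
    # gets acted on immediately, no questions asked.
    return f"opening {payload}"

def guarded_handler(payload: str) -> str:
    # Treat the decoded payload as untrusted input and gate it.
    url = urlparse(payload)
    if url.scheme not in ("http", "https"):
        return "rejected: unexpected scheme"
    if url.hostname not in TRUSTED_HOSTS:
        return "prompt user: unrecognized host"
    return f"opening {payload}"

# A malicious sticker pasted over the real code decodes just as cleanly:
print(naive_handler("javascript:steal_cookies()"))    # opening javascript:steal_cookies()
print(guarded_handler("javascript:steal_cookies()"))  # rejected: unexpected scheme
```

The naive version is the convenient one, which is exactly why it’s the one people will ship.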
Did you think that, in a world where people happily install any old .exe that promises free porn or cat pictures and hand over banking details to sites that more or less look like their bank’s, mobile devices would somehow be free of this kind of problem?
I was sitting in on a virtual meeting using Second Life. The sim was dressed up to look as much like a regular meeting as possible. It featured a stage, a presenter, and a projection screen with PowerPoint slides.
“Please stop clicking on the projection screen,” they begged us, “the permissions haven’t been set right and anyone can advance the presentation by accident.”
Picture someone wearing a Sixth Sense rig or some other always-on smart-camera gear with overly trusting settings wandering into the wrong alley with the wrong images on the wall. Picture some jerk with a laser pulsing at a rate known to cause buffer overflows in unpatched rigs, carefully aiming at rich-looking patrons from across the dance floor.
Visual input is messy and low-fi. It has to be, because of the wide variety of lighting and orientation situations with which the viewer is likely to be presented. Our brains automatically reorient and abstract the world. It’s a skill that lets us get by but it’s also a hurdle that artists need to unlearn in order to draw precise, verisimilar pictures of the world.
Putting control systems in optic inputs means teaching computers to reorient and abstract the world and in that translation lies vulnerability. Passwords work because there is only one way to enter your password. You either type in the correct string of characters or you don’t. Retinal scanners try to control the input by insisting on a narrow range of acceptable positions from the eyeball being scanned, but there is a lot of image processing required to turn that into a yes or no.
Facial recognition remains a nightmare, easily defeated by tilted heads, sunglasses, and bright smiles.
So when visual controls and codes are made they have to be highly error tolerant. Which means easily forged inputs. How do you protect against that? When we automate inputs, we aim for convenience, but we set ourselves up for a whole host of problems. Ambiguity breeds vulnerability.
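The gap between the two kinds of input can be made concrete. Here is a toy comparison, assuming an invented four-number “feature vector” standing in for a scanned eye; it is not any real scanner’s algorithm, just an illustration of how tolerance admits forgeries:

```python
def password_check(stored: str, entered: str) -> bool:
    # One way in: the strings are identical or they aren't.
    return stored == entered

def fuzzy_check(template: list[int], reading: list[int], tolerance: int = 30) -> bool:
    # Visual input arrives with lighting and orientation noise, so the
    # comparison has to accept anything within some distance of the template.
    distance = sum(abs(a - b) for a, b in zip(template, reading))
    return distance <= tolerance

template = [120, 45, 200, 88]      # hypothetical enrolled scan
honest_noise = [118, 47, 199, 90]  # the same eye, different lighting
forgery = [115, 50, 195, 95]       # a good-enough fake

print(password_check("hunter2", "hunter3"))  # False: one character off, rejected
print(fuzzy_check(template, honest_noise))   # True: tolerance lets the real eye in
print(fuzzy_check(template, forgery))        # True: the forgery rides in on the same slack
```

The tolerance that keeps the system usable is the same tolerance the forger exploits.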
Imagine QR code phishing, where a malicious user pastes a new sticker over a trusted one in an official location. Imagine your brat sister dropping her own play and pause scraps into the scene when you’re trying to put together your 3d drafting assignment. Imagine your jerk co-worker painting his nails your signature sixth sense colours. Imagine elite teams of special forces tossing spaz grenades, flashing their unique patterns that disable or reprogram the other guy’s auto-turrets.
Imagine a digital sleight-of-hand artist.
“The hand is quicker than the camera…”