Part of a series: Buildings That Protest
In adventure stories, one recurring trope is the ranger/guide. Wizened, possessed by wanderlust, removed from the main branch of civilization, they take the heroes through a patch of inhospitable land. Their years of living at one with nature allow them to see pathways invisible to the rest of the party. They consider broken twigs, scraped bark, and other obscure signs. Through these, the forest teaches them to track their prey. Sherlock Holmes displays similar abilities in and around London, thanks to years of tireless, idiosyncratic research.
Smart buildings are the means to make Holmeses and rangers of anyone with login credentials. The invisible becomes visible, charted and graphed with hourly breakdowns, demographic subdivisions, regression analyses, and an easy touch-and-swipe interface. This is what it means to make an environment that talks to us. With the right network and the right surveillance subroutines, the city becomes an enormous Baker Street Irregular. Or an enormous Ministry of Love, depending.
In 1998, when Kevin Warwick implanted a primitive RFID transmitter in his arm, he gained a different superpower — the ability to control the Department of Cybernetics at the University of Reading with his presence. The front entrance said “hello” when he arrived, doors opened and lights illuminated automatically. Control is too strong a word here — the building reacted to him because it knew he was there.
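The logic of that reaction is almost trivially simple, which is part of the point. A minimal sketch (every name here is hypothetical, not drawn from Warwick's actual system): a tag read maps a known person to a list of building responses.

```python
# Hypothetical mapping from RFID tag IDs to building reactions.
GREETINGS = {
    "warwick": ["say hello", "unlock door", "lights on"],
}

def on_tag_read(tag_id: str) -> list[str]:
    """React to an RFID read: look up the person, return their actions.

    The building isn't being commanded; it merely notices presence.
    Unknown tags get no reaction at all.
    """
    return GREETINGS.get(tag_id, [])

print(on_tag_read("warwick"))   # ['say hello', 'unlock door', 'lights on']
print(on_tag_read("stranger"))  # []
```

A dumb lookup table, in other words: the "superpower" is entirely in the sensing, not in the logic.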
That’s pre-pre-alpha stuff. Just a dumb transmitter and a dumb receiver. Make it two-way. Connect it to the nerves (there’s been lots of work in that area already). Let a paraplegic homeowner open and close doors with only willpower. Let a blind person FEEL what doors are open without having to fumble around. Connect it to something other than doors. Why should air conditioning be based on objective temperatures? Why can’t the building feel that you feel cold and adjust accordingly?
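What would a subjective thermostat even look like? A hedged sketch, assuming some future nerve interface can report a single felt-cold signal (the function and its parameters are illustrative, not any real API):

```python
def adjust_setpoint(current_setpoint: float,
                    felt_cold: bool,
                    step: float = 0.5) -> float:
    """Nudge the heating setpoint toward the occupant's comfort.

    Ignores the room's objective temperature entirely: the only
    input is whether the occupant feels cold right now.
    """
    if felt_cold:
        return current_setpoint + step  # occupant feels cold: warm up
    return current_setpoint - step      # comfortable: ease back off

setpoint = 20.0
setpoint = adjust_setpoint(setpoint, felt_cold=True)   # -> 20.5
setpoint = adjust_setpoint(setpoint, felt_cold=False)  # -> 20.0
```

The control loop is ordinary; what changes is the sensor. The room stops asking "what temperature is it?" and starts asking "how do you feel?"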
It won’t work properly. This is to be expected. There will be misconfigured firmware and competing disability-control standards. The genius of the Jetsons was that they lived in a future full of scientific marvels and technical wonders that routinely broke down.
It’s an old joke: if cars were more like computers, they’d have fantastical performance specs but they’d crash all the time. It’s all true, and we’re in the process of actually turning houses into computers. A set of houses in the Midwest will be plagued by heating routines that mysteriously spin up and then turn off the furnace. There will be user forums for trying to undo a preference setting that routes all your calls through the television. Homes will crash, they will lock up, they will need to be rebooted. But we’ll put up with it.
Here’s the part — more than anything else — that gets me. In Understanding Comics, Scott McCloud discusses the way that we extend our identity into the objects that we feel we control. Game designers exploit this routinely.
It’s the same thing that happens when you drive a car. As you drive, you have a sense of the position of the car in space and how far it extends around you. This enables you to parallel park, drive in a lane next to other cars and pull into your garage without crashing. Your senses extend outward, encompassing the car and receiving feedback. As this happens, the car becomes part of you, an extension of both body and self. This is why people who’ve crashed say “You hit me!” rather than “His car hit me!” or “His car hit my car!”
Steve Swink — Game Feel: A Designer’s Guide to Virtual Sensation
What happens when we extend our senses into our houses? Our cities? When a house cries out in pain? When we feel our neighbourhood?