Why Ambient Computing Is One Of The Few Truly Disruptive Technologies

A History Of Mimicry
Every new technology begins by mimicking its predecessor. When radio started, there were no special sound effects, just people talking and playing music. When TV started, it used the same techniques as radio, only now with shots of the people talking. Digital advertising, at its genesis, mimicked the ads that had run in newspapers 30 years earlier. No new platform's capabilities fully emerge until it sheds the weight of its origins.
This is why many companies go under when a new technology emerges: they try to mold it to the logic of the old one. For instance, when customer service moved from desktop IM to mobile chat, the products that had ruled desktop IM fell short because of a very simple technological distinction between the two platforms: mobile has push notifications. Desktop IM had to happen in real time; when there was a wait, the customer would go do something else (on or off the computer), and there was no system for re-engagement. Mobile allows asynchronous communication because users always have their phones with them and can leave a conversation until a push notification ropes them back in.
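To make the contrast concrete, here is a minimal sketch of that asynchronous loop in Python. The push gateway and every name in it are hypothetical stand-ins rather than any particular vendor's API; the point is simply that the notification, not an open chat window, is what brings the customer back.

```python
import time
from dataclasses import dataclass, field


@dataclass
class Conversation:
    user_id: str
    last_seen: float = field(default_factory=time.time)
    open: bool = True


def send_push(user_id: str, message: str) -> None:
    """Stand-in for a real push gateway (APNs, FCM, etc.); here it just prints."""
    print(f"[push -> {user_id}] {message}")


def re_engage(conversations: list[Conversation], idle_seconds: float = 300) -> None:
    """The mobile-era loop: an idle customer gets pulled back by a notification.

    Desktop IM had no equivalent. Once the customer walked away,
    the conversation was simply lost.
    """
    now = time.time()
    for convo in conversations:
        if convo.open and now - convo.last_seen > idle_seconds:
            send_push(convo.user_id, "An agent has replied to your question.")


# A customer who stepped away ten minutes ago still gets roped back in.
re_engage([Conversation(user_id="customer-42", last_seen=time.time() - 600)])
```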
Tiny distinctions between technologies, like push notifications, are hugely important, both in how companies use them and in how the mediums themselves progress.
What Happens When There’s Nothing To Mimic?
The iPhone was disruptive for one simple reason: its touchscreen. It eliminated the mouse, stylus, pen, and keyboard, and instead used a bodily appendage as the go-between for intention and outcome. Eliminating a pen in favor of a finger is not the same as moving a show from radio to TV. You are not adding new capabilities; you are eliminating entire tools.
The touchscreen never had the baggage of a predecessor— it was a truly disruptive technology. Ambient computing will be the same, if we do it right.
Computers As Extensions Of Physicality
Ambient computing eliminates the need for a physical touchpoint. It makes computing an extension of physicality: one has only to talk, to move past, to gesture. Amazon Go, Amazon's still-in-beta grocery store, is a good example. It eliminates credit cards, physical receipts, and checkout lines, substituting them all with a body moving past a sensor and a hand picking up a product.
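Amazon has not said exactly how Go works under the hood, so the sketch below only models the interaction described above, with invented names and prices: sensors emit entry, pickup, and exit events, and the charge happens when the shopper walks out.

```python
from collections import defaultdict

# Invented catalogue for the example.
PRICES = {"milk": 2.49, "coffee": 7.99}


class AmbientStore:
    """Sensors stand in for the checkout line: entry, pickup, exit."""

    def __init__(self) -> None:
        self.carts: dict[str, defaultdict] = {}

    def on_entry(self, shopper_id: str) -> None:
        # A body moving past a sensor opens a session; no card is swiped.
        self.carts[shopper_id] = defaultdict(int)

    def on_pickup(self, shopper_id: str, product: str) -> None:
        # A hand picking up a product is the only "checkout" action.
        self.carts[shopper_id][product] += 1

    def on_exit(self, shopper_id: str) -> float:
        # Walking out closes the session and charges the account on file.
        cart = self.carts.pop(shopper_id, {})
        return sum(PRICES[item] * count for item, count in cart.items())


store = AmbientStore()
store.on_entry("shopper-1")
store.on_pickup("shopper-1", "coffee")
print(store.on_exit("shopper-1"))  # 7.99
```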
Taken to its extreme, ambient computing would literally become part of one's physicality: a chip in the body. And this isn't some far-fetched dystopian idea, either. Just a few months ago, the Swedish technology firm Epicenter implanted 150 of its staff with rice-sized microchips, inserted into the hand via syringe. The chips, all implanted voluntarily, simply replace other technologies like key cards and credit cards. They allow employees to pay at the office cafe, swipe in and out of the building, and operate photocopiers and other machines. The chip eliminates external appendages and centralizes those tools within the body.
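Epicenter has not published its system, so the following is only a hedged sketch of the general idea, with a made-up chip ID and service names: one implanted identifier maps to everything the employee used to carry separately.

```python
# One implanted identifier standing in for badge, payment card, and copier code.
CHIP_REGISTRY = {
    "04:A2:19:7C": {  # hypothetical chip ID
        "employee": "example-employee",
        "authorized": {"door", "cafe", "photocopier"},
    },
}


def handle_tap(chip_id: str, service: str) -> bool:
    """A reader sees the same chip whether it lives in a key card or a hand."""
    record = CHIP_REGISTRY.get(chip_id)
    if record is None or service not in record["authorized"]:
        return False
    print(f'{record["employee"]} granted access to {service}')
    return True


handle_tap("04:A2:19:7C", "cafe")         # True: pays at the office cafe
handle_tap("04:A2:19:7C", "server-room")  # False: not an authorized service
```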
Naturally, implanted chips come with a host of privacy concerns. Employers could conceivably gather immense amounts of data on their chipped employees, data that has nothing to do with the workplace: health information, how often employees use the bathroom, where an employee is during and outside of work hours. This level of tracking would obviously be highly unethical and a major breach of privacy.
This fear, though, is not so different from the fear that originally accompanied smartphones. While ambient computing has the potential for an Orwellian future, it also has the potential to be just a useful tool: a new user interface.
Ambient computing will be truly disruptive because it will eliminate other technologies completely. The barriers between computers and bodies will break down, and a new precedent will be set.