The Invisible Hand

Contrary to what this recent Visine commercial suggests, the thing that really saves us all from burning our eyes out on all the screens in our lives is that we still have to interact with the physical world. As much as we might not want to, we have to pull our eyes away from our TVs and tablets and phones and laptops to take out the trash or fix dinner or change the baby. The physical world has to be acted upon directly.

Maybe not anymore.

Ex-Touch, a new joint project by the Tangible Media Group at MIT and Sony, is a technology that lets a person act on a physical object through augmented reality, in this case via a tablet interface. Basically, you drag the object's image around the live video feed on your iPad with multi-touch gestures, and the object moves in real life as if it were being manipulated by Professor Charles Xavier.
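To make the gesture-to-motion idea concrete, here is a minimal sketch of how a drag on the tablet's screen might be turned into a movement command for the tracked object. The function names, the flat pixels-to-meters scaling, and the command format are all my own assumptions for illustration, not Ex-Touch's actual pipeline:

```python
# Hypothetical sketch: translate a multi-touch drag in the camera view
# into a small, clamped motion command for the remote object.
# (Names and the simple linear scaling are assumptions, not the
# project's real implementation.)

def drag_to_world_delta(start_px, end_px, meters_per_pixel=0.002):
    """Convert a screen-space drag (pixels) into an approximate
    world-space translation (meters) for the object."""
    dx = (end_px[0] - start_px[0]) * meters_per_pixel
    dy = (end_px[1] - start_px[1]) * meters_per_pixel
    return (dx, dy)

def command_for(delta, max_step=0.5):
    """Clamp the requested translation so each gesture moves the
    object in small, safe increments."""
    clamp = lambda v: max(-max_step, min(max_step, v))
    return {"move_x": clamp(delta[0]), "move_y": clamp(delta[1])}

# A 300-pixel drag to the right gets clamped to a 0.5 m step.
delta = drag_to_world_delta((100, 200), (400, 200))
print(command_for(delta))
```

A real system would also have to solve the harder parts this sketch waves away: tracking the object's pose in the video and correcting the command for the camera's own position and angle.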

It’s kind of hard to describe (but only because there are a ton of ways it could be used), so I suggest watching the above video. Especially since this is the language the original post uses:

The team demonstrates the system used for applications such as an omnidirectional vehicle, a drone, and moving furniture for reconfigurable room. They envision that proposed spatially-aware interaction provide further enhancement of human physical ability through spatial extension of user interaction.

Obviously, if such an advance could be made practical, especially if incorporated into wearable technologies, it would impact a wide range of important activities, from clean-room experiments to infectious-disease procedures to unmanned probes. But I prefer to think we’re all just going to use it to pretend our houses are haunted.