By David Ponce
Much as the industry is constantly evolving, with faster computers and ever more ingenious and polished operating systems, the basic interaction between man and machine has remained pretty constant over the last few decades. You still have a flat screen projecting flat images at a user sitting in front of it. Sure, some of those images may depict a three-dimensional object, but the images themselves are still 2D. A project by Jinha Lee and Cati Boulanger, a former intern and a researcher, respectively, at Microsoft Applied Sciences, could change all that. They're using a special transparent OLED screen from Samsung and a series of sensors, along with custom software that reshuffles the keyboard to the back of the screen. So in a way, you're now working with your hands inside the virtual desktop, free to manipulate what you see. Sensors detect your motions and even where your head is in relation to the screen, so as to maintain proper perspective at all times (think of that scene in the latest Mission: Impossible).
There are no concrete plans to put this into production, but as a proof of concept it lets us play with and discover potential new interfaces for the systems of tomorrow. Watch it in action below.