Today I interviewed Ron Cobb (Alien, Aliens, Back to the Future, The Last Starfighter, Star Wars, etc.) and Syd Mead (2010, Tron, Blade Runner, Aliens, etc.) for the GreenlightSummit.com Web Conference. After the session, Ron wanted to continue the discussion with his thoughts on how the future is perceived from the vantage point of the present or the past, and on how the actual future would probably be far less recognisable if we could really time travel from now to some point ahead.
Ron and I would love to continue this discussion with Syd and many others.
My NUI research focuses on the near future, as you have seen from my blog; looking further out, prediction becomes far more fragmented and random, as you will hear from Ron.
Leap is a cool new Natural User Interface for interacting with computers. It’s more accurate than a mouse, as reliable as a keyboard and more sensitive than a touchscreen. It will allow control of computers in three dimensions using natural hand and finger movements.
Ever wondered what the future of writing and drawing might look like? Three guys from Sydney (Rob, Nav and Sumo) have been working on a startup in stealth mode for the last year called Collusion (collusionapp.com). On Wednesday May 23rd Collusion will be unveiled to the public for the first time.
Collusion could change the nature of how people collaborate through writing and drawing… and in the process transform the iPad from a media consumption device into an intuitive creative tool. Collusion incorporates a dedicated high-precision digital pen (not a stylus), an iPad app and what the team describes as the fastest cloud collaboration service in the world.
This is a demonstration of the fully immersive KeckCAVES (http://keckcaves.org) VR lab. Two Microsoft Kinects capture the people interacting and provide 3D feedback in the VR glasses.
Artist Aaron Koblin takes vast amounts of data — and at times vast numbers of people — and weaves them into stunning visualisations. Although this is about a year old now it still resonates with developments in crowd-sourcing and augmented reality during the past year.
A sandbox equipped with a Kinect 3D camera and a projector that projects a real-time coloured topographic map with contour lines onto the sand surface. The sandbox lets virtual water flow over the surface, using a GPU-based simulation of the Saint-Venant shallow water equations. This is based on the original idea shown in this video: http://www.youtube.com/watch?v=8p7YVqyudiE
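To give a flavour of what the sandbox is computing, here is a minimal 1D sketch of the Saint-Venant shallow water equations, stepped with a simple Lax-Friedrichs scheme. The sandbox runs a far more sophisticated 2D solver on the GPU; all names, grid sizes and parameters below are illustrative assumptions, not taken from the AR Sandbox code.

```python
# Minimal 1D shallow-water (Saint-Venant) "dam break" sketch.
# Conserved variables: water depth h and momentum hu = h * u.
g = 9.81           # gravitational acceleration (m/s^2)
dx, dt = 1.0, 0.05 # grid spacing (m) and time step (s)
n = 20             # number of grid cells

# Initial state: deeper water on the left half of the domain.
h  = [2.0 if i < n // 2 else 1.0 for i in range(n)]  # depth per cell
hu = [0.0] * n                                       # momentum per cell

def flux(h_i, hu_i):
    """Physical flux of the 1D Saint-Venant equations."""
    u = hu_i / h_i
    return hu_i, hu_i * u + 0.5 * g * h_i * h_i

def step(h, hu):
    """One Lax-Friedrichs update; boundary cells held fixed for brevity."""
    new_h, new_hu = h[:], hu[:]
    for i in range(1, n - 1):
        fL_h, fL_hu = flux(h[i - 1], hu[i - 1])
        fR_h, fR_hu = flux(h[i + 1], hu[i + 1])
        new_h[i]  = 0.5 * (h[i - 1]  + h[i + 1])  - dt / (2 * dx) * (fR_h  - fL_h)
        new_hu[i] = 0.5 * (hu[i - 1] + hu[i + 1]) - dt / (2 * dx) * (fR_hu - fL_hu)
    return new_h, new_hu

for _ in range(10):
    h, hu = step(h, hu)
# After a few steps, a wave travels rightward from the initial depth jump.
```

The real sandbox parallelises the equivalent 2D update across the GPU and feeds the depth field straight into the projected colour map.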
A demo of the CS6 brush features using a Wacom Intuos5 pen tablet.
MirageTable uses a 3D stereoscopic projector to project content directly onto a curved screen. The scene is captured by a Kinect camera, which also tracks the user’s gaze. This enables presentation of a correct-perspective view to a single user on top of the dynamically changing geometry of the real world.
You can see where this stuff is going and it won’t be long either!
Microsoft Research, working with the University of Washington, has developed a Kinect-like system that uses your computer’s built-in microphone and speakers to provide object detection and gesture recognition, much in the same way that a submarine uses sonar. Called SoundWave, the new technology uses the Doppler effect to detect any movements and gestures in the proximity of a computer.
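The physics behind this is the familiar Doppler relation: a tone reflected off a hand moving towards the speaker/microphone pair comes back slightly higher in frequency. A tiny sketch of that shift, with purely illustrative values (the function name and tone frequency are my assumptions, not details from the SoundWave work):

```python
# Doppler shift of a tone reflected off a moving hand, as a
# SoundWave-style system would measure it.
def doppler_shift(f_emit, v, c=343.0):
    """Shift in Hz for a reflection off a target moving at speed v
    (m/s, positive = towards the sound source); c is the speed of
    sound in air. For v << c this is approximately 2 * v * f / c."""
    return f_emit * (c + v) / (c - v) - f_emit

# A 20 kHz (near-inaudible) pilot tone, hand moving at 0.5 m/s:
shift = doppler_shift(20_000.0, 0.5)  # roughly 58 Hz
```

A shift of a few tens of hertz on a 20 kHz carrier is easily visible in an FFT of the microphone signal, which is why ordinary laptop speakers and microphones are enough for this kind of gesture sensing.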