Bristol University scientists have presented a method for creating three-dimensional haptic shapes in mid-air using focused ultrasound. The approach applies the principle of acoustic radiation force, whereby the non-linear effects of sound produce forces on the skin strong enough to generate tactile sensations. It’s a technology we hope will soon be available in conjunction with Microsoft HoloLens for a truly holographic tactile experience.
This mid-air haptic feedback eliminates the need for any attachment of actuators or contact with physical devices. The user perceives a discernible haptic shape when the corresponding acoustic interference pattern is generated above a precisely controlled two-dimensional phased array of ultrasound transducers.
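The core of such a phased array is simple in principle: each transducer is driven with a phase offset that compensates for its path length to a focal point, so all the waves arrive in phase there and the acoustic pressure peaks at that spot. Here is a minimal sketch of that phase calculation — the array geometry, frequency and focal point below are illustrative assumptions, not parameters from the Bristol work:

```python
import math

SPEED_OF_SOUND = 343.0        # m/s in air
FREQ = 40_000.0               # Hz, a common airborne ultrasound frequency
WAVELENGTH = SPEED_OF_SOUND / FREQ

def focus_phases(transducer_positions, focal_point):
    """Phase offset (radians) per transducer so that all waves arrive
    in phase at focal_point. Elements with shorter paths are delayed
    relative to the farthest element."""
    dists = [math.dist(p, focal_point) for p in transducer_positions]
    ref = max(dists)  # farthest element fires with zero offset
    # A wave travelling distance d accumulates phase 2*pi*d/wavelength;
    # delay each element by the phase its shorter path would save.
    return [2 * math.pi * (((ref - d) / WAVELENGTH) % 1.0) for d in dists]

# 4x4 array on a 10 mm pitch in the z=0 plane, focused 100 mm above centre
array = [(x * 0.01, y * 0.01, 0.0) for x in range(4) for y in range(4)]
phases = focus_phases(array, (0.015, 0.015, 0.1))
```

Sweeping the focal point over time traces out the interference pattern the user feels as a shape; a real system also has to modulate the carrier (e.g. at a few hundred hertz) because the skin cannot sense a steady 40 kHz force.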
New features of the Google Glass project have been unveiled, with the company encouraging “creative individuals” to pitch in ideas. Features shown in a video posted to YouTube include voice-activated commands such as “take a picture”, and people having Skype-like video chats on the displays, which sit just in front of the eyes. It also has sat-nav, plus the ability to record film, translate words and pull up pictures when prompted.
Big Data and Machine Learning
All of the context derived from the Internet of Things will generate huge amounts of data (so-called Big Data), and using techniques such as Machine Learning to sift and learn from that data will enable technology to do much more on our behalf and actually begin to anticipate our needs. I’m sure that’ll be fun; whether it ends up genuinely useful or just bloody frustrating remains to be seen.
Ray discusses his new role at Google, how his research interests connect with his latest book How To Create A Mind, and how technology will advance to produce a “cybernetic friend”.
“The project we plan to do is focused on natural language understanding,” said Kurzweil. “We want to give computers the ability to understand the language that they’re reading.”
A Natural User Interface (NUI) is meant to be intuitive: it removes the GUI’s requirement that users learn how to interact with the interface, and instead eases experience and discovery through suggestions and recommendations. The term NUI was coined by Steve Mann, who over the last three decades of the twentieth century personally devised several strategies in opposition to the older Command-Line Interface and the now-typical Graphical User Interface. Some researchers argue that NUI is not the opposite of GUI, because in most cases the former includes visual elements that are a constant component of the latter; in any case, I’m talking about the whole concept, not just the technology aspect – and it’s hard to deny that NUI is a true successor to WIMP.
Still struggling to imagine what a Natural User Interface is? As an example, here’s the hydraulophone – a musical instrument played through direct physical contact with fluid, usually water. The hydraulophone was invented by the aforementioned NUI pioneer Steve Mann. To understand how the instrument works, just watch the video demonstrating a hydraulophone musical performance.
Leap operates in three dimensions rather than two. Forget pinch-to-zoom; imagine “push to scroll,” rotating your flattened hand to control the orientation of an object with a full six degrees of freedom, or using both hands at once to control either end of a Bézier surface you’re casually sculpting as part of an object you’ll be sending to your 3D printer.
The fact that the Leap can see almost any combination of objects – a pen, your fingers, all 10 fingers at once – should make every interface designer on the planet giddy with anticipation. If you thought that the touchscreen interface on the iPhone and subsequent tablets opened up a whole new way to interact with your device, imagine something that combines the intuitiveness of that experience with the possibility of such fine-grained control that you could do away with the trackpad or mouse entirely.
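That “full six degrees of freedom” claim boils down to reading a position plus an orientation from the tracked hand. A sketch of the idea, using hypothetical palm-position, palm-direction and palm-normal vectors rather than the actual Leap SDK API: two non-parallel vectors from the hand are enough to build an orthonormal frame, which together with the palm position gives a rigid 6-DOF transform for placing an object.

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def hand_pose_to_transform(palm_pos, palm_dir, palm_normal):
    """Build a rigid transform (orthonormal basis + translation) from
    a tracked hand. palm_dir and palm_normal are assumed tracker
    outputs, roughly 'where the fingers point' and 'out of the palm'."""
    z = normalize(palm_dir)               # forward, along the fingers
    x = normalize(cross(palm_normal, z))  # sideways across the palm
    y = cross(z, x)                       # re-orthogonalized palm normal
    return (x, y, z), palm_pos            # 3 basis vectors + position

def place(basis, pos, local_point):
    """Map a point from the object's local frame into the world:
    world = local_x * X + local_y * Y + local_z * Z + palm position."""
    return tuple(pos[k] + sum(local_point[i] * basis[i][k] for i in range(3))
                 for k in range(3))
```

Tilting or rolling the hand changes the basis vectors, so any object whose vertices are pushed through `place` rotates and translates with the hand — exactly the flattened-hand orientation control described above.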
Organized Wonder is a way to share and discover the best talks, documentaries, interviews, short films and various other videos scattered across the web. www.organizedwonder.com
In this video from PSFK CONFERENCE NYC, Steve Clayton talks about the drive at Microsoft to embrace the ‘natural user interface’. Clayton, who says his job is to find out what amazing projects the tech firm is working on and share them with the world, takes us through a future-forward vision where gesture, sound and artificial intuition create a world that extends the possibilities of our creativity.
A Natural User Interface (NUI) for professional sound design had to happen sooner or later. The 28-year-old Chris Vik began tinkering with the technology a little more than a year ago after being inspired by YouTube videos of art projects based on the Kinect.
“I actually traded in my Xbox to get the Kinect. The guy at the store was very confused until I explained I had seen a bunch of videos on the internet and I planned to spend a lot of time working out how to tinker with it.”
By the end of last year he had completed his Kinectar software and made it available as a free download on his website.
He says the technology lets musicians engage with audiences in new ways.