A Natural User Interface (NUI) aims to be genuinely intuitive: it does away with the GUI's requirement that users first learn how to interact with the interface, and it increasingly goes beyond easing the user's experience and discovery to offering suggestions and recommendations. The term NUI is credited to Steve Mann, who over the last three decades of the twentieth century devised a number of interaction strategies as alternatives to the venerable Command-Line Interface and the now-typical Graphical User Interface. Some researchers nevertheless argue that NUI is not the opposite of GUI, since NUIs in most cases include visual elements, which are a constant component of GUIs; still, I'm talking about the concept as a whole, not just the technology, and it is hard to deny that NUI is a true successor to WIMP.
Still finding it hard to imagine what a Natural User Interface is? Consider the hydraulophone: a musical instrument played through direct physical interaction with a fluid, usually water. The hydraulophone was invented by Steve Mann, the NUI pioneer mentioned above. For a clearer sense of how the instrument works, just watch the video of a hydraulophone musical performance.
Programming plays a huge role in the world around us, and though its uses are often purely functional, there is a growing community of artists who use the language of code as their medium. Their work ranges from computer-generated art to elaborate interactive installations, all aimed at expanding our sense of what is possible with digital tools. To simplify the coding process, several platforms and libraries have been built to let coders cut through the nitty-gritty of programming and focus on the creative aspects of a project. These platforms all share a strong open-source philosophy that encourages growth and experimentation, creating a rich community of artists who share their strategies and work with unprecedented openness.
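Toolkits like Processing and Cinder exist precisely so that a few lines of code can produce an image. As a rough illustration of the idea in plain Python (the pattern, function names, and file format here are my own choices, not taken from any of the toolkits discussed), here is a tiny generative sketch:

```python
import math

def generative_pattern(width=64, height=64):
    """Render a simple interference pattern, the kind of one-liner
    exercise creative-coding toolkits make trivial."""
    pixels = []
    for y in range(height):
        row = []
        for x in range(width):
            # Overlapping sine waves produce a moire-like texture.
            v = math.sin(x * 0.3) + math.sin(y * 0.3) + math.sin((x + y) * 0.15)
            shade = int((v + 3) / 6 * 255)  # normalize [-3, 3] -> [0, 255]
            row.append(shade)
        pixels.append(row)
    return pixels

def write_pgm(path, pixels):
    """Save as a plain grayscale PGM, readable by most image viewers."""
    height, width = len(pixels), len(pixels[0])
    with open(path, "w") as f:
        f.write(f"P2\n{width} {height}\n255\n")
        for row in pixels:
            f.write(" ".join(map(str, row)) + "\n")

if __name__ == "__main__":
    write_pgm("pattern.pgm", generative_pattern())
```

Swap the expression for `v` and you get an entirely different image, which is the whole appeal: the code is the brushstroke.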
Daniel Shiffman – Interactive Telecommunications Program at NYU http://www.shiffman.net/
Keith Butters – Barbarian Group http://barbariangroup.com/software/cinder_0_8_0
James George & Jonathan Minard – RGBDToolkit http://www.rgbdtoolkit.com/
Here’s a list of all the projects displayed:
Phyrid – https://soundcloud.com/phyrid
Jinx – http://www.youtube.com/watch?v=dCsVziasq-g
Hopeku – http://www.newgrounds.com/audio/listen/338635
Dexter Britain – https://soundcloud.com/dexterbritain
Milhaven – http://milhaven.bandcamp.com/track/look-victory
Binarpilot – http://www.jamendo.com/en/artist/1125/binaerpilot
The iOptik glasses and lenses aren’t a self-contained platform like Glass or Vuzix glasses, even leaving aside the bifurcated form factor. Instead, the company plans to partner with more consumer-facing brands that can help bring it to market. When that happens, it’ll target both the nascent heads-up display sector created by Project Glass and immersive gaming like what we’re seeing with the bulkier Oculus Rift. For now, prototypes are shipping for military testing — the Pentagon placed an order earlier this year.
Skin Games represents an original interaction paradigm for computer games, building on the concept of “kinetic interfaces” and Spatial Augmented Reality: in ‘Skin Games’ the body acts simultaneously as the controller and as the (wildly deformable) projection surface on which to display the game’s output. This is a proof-of-principle demonstration rendered possible thanks to the Laser Sensing Display technology.
Innovega’s approach requires FDA approval because it involves wearing a specialized contact lens. And that process, co-founder and CEO Steve Willey told me, won’t be in the cards until 2014. Yet Willey, whose company is based in San Diego and Seattle, is making impressive progress and, unlike at last year’s CES, Willey now has a way to show how it all works. The goal here is to get away from the Google model, which uses what’s known as a glanceable display. When you look through those types of specialized glasses, you see a postage-stamp type image off to the side that shows media — your text messages, say, along with your Twitter feed or, potentially, ads. (It’s Google, after all.)
While this is great, Willey said it falls short of what people will eventually want — a full-media overlay that either becomes the only thing the user can see (as would be necessary for a video game), or a mix of media and reality. The problem with creating the full, panoramic view is that human eyes can’t focus on objects that are right up against them. That’s where the specialized contact lenses come in; Innovega’s lenses enable the wearer to focus on objects that are superclose while also focusing on whatever’s in the distance.
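The focusing problem can be put in rough numbers with the standard diopter relation (the 2 cm eye-to-display distance below is my own illustrative figure; Innovega has not published its exact optics):

```latex
P = \frac{1}{d} \quad \text{(required focusing power in diopters, } d \text{ in meters)}
```

```latex
P_{25\,\mathrm{cm}} = \frac{1}{0.25} = 4\ \mathrm{D}, \qquad
P_{2\,\mathrm{cm}} = \frac{1}{0.02} = 50\ \mathrm{D}
```

A typical adult eye can accommodate only a few diopters, nowhere near the roughly 50 D needed for a display sitting a couple of centimeters away, which is why an extra optical path built into the contact lens is required at all.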
The other part of the setup is fairly simple: a small projector attaches to a pair of lightweight glasses (in theory, they could be any sports glasses) and projects the media onto the lenses. Because of the contact lenses, you, or, for now, the mannequin, can focus on an overlay displayed across the lens of the eyeglasses.
“People want a big image,” said Willey, who formed the company in 2008. “Natural vision is full HD and panoramic, and what we’re delivering starts to rival natural vision — a blend of virtual and real world for cool entertainment depending on where you’re standing, where you’re looking. The main thing is that you see both in perfect focus.”
His key customer so far is the military. Last spring, Innovega won a contract to supply the Defense Advanced Research Projects Agency (DARPA) with a prototype of its iOptik spectacles and accompanying contact lenses. The goal is to offer soldiers in harsh conditions a way to get information about battles without having to look at a handheld device or interfering with their normal view. “If you’re in the middle of a desert in sunshine, a handheld doesn’t work,” Willey pointed out. “The military wants a rich display that gets a device out of their hands.”
Now Willey is hoping some big consumer companies also want in. He said he talked to a few last year, hoping to find strategic partners, but people didn’t believe him. “They all said it sounded like science fiction,” he said.
So Willey rigged up the mannequin to demonstrate that they can make a person see the real and virtual world around them. And should he win FDA approval, the market could be huge: These contact lenses could replace regular contacts for many people, he said, because they work fine even when you’re not looking at a virtual world.
It’s easy to imagine plenty of use cases — a surgeon, say, who is watching images within a body while operating — but Willey says he wants to tackle the consumer market, hoping to partner with, say, Microsoft, Sony, or Qualcomm, to bring this to market in ways to be determined. I could see use cases for athletes, and plenty more. Willey said he feels certain of the appeal for gamers. “3D gaming is still on a flat screen,” he said. “What you really want is to believe you’re inside the game.”
For a sense of the whole thing, check out this video:
Leap has unveiled their first in-house app, a Jenga-inspired game called Block 54 that was engineered by one of their interns.
The game mechanic is all about pushing and grabbing, another sign of Leap’s preference for physically intuitive motions. Buckwald is vocal about maintaining direct physical-to-virtual interactions instead of a more abstract sign language. In Block 54’s case, that means your actions are all familiar — poking, pinching — but they operate on virtual blocks instead of real ones. It also means the skills of the game are roughly the same ones that let you win a real game of Jenga.
The new SDK commits to that UI philosophy even further, providing a codebase for basic actions like gripping, pushing and molding objects in virtual space. With this update, developers will have a library of approved gestures to draw from, with simple code they can drop in whenever they need a gripping mechanic. That makes coding easier and cheaper for developers, as long as they’re working in Leap’s preferred style of interaction design.
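Leap hasn't published the internals of that gesture library, but the kind of classification it bundles can be sketched from first principles. Assuming a simplified frame format (palm and fingertip positions in millimeters, my own stand-in for the SDK's hand data, not Leap's actual API), a grab or pinch detector might look like:

```python
import math

# Hypothetical frame format: a hand is a palm position plus a list of
# 3-D fingertip positions, all in millimeters. This mirrors the kind of
# data a hand tracker exposes, but the classifiers below are an
# illustration, not Leap's actual code.

def distance(a, b):
    """Euclidean distance between two (x, y, z) points."""
    return math.dist(a, b)

def is_grab(palm, fingertips, threshold_mm=40.0):
    """Classify a 'grab': every fingertip curled in close to the palm."""
    return all(distance(tip, palm) < threshold_mm for tip in fingertips)

def is_pinch(thumb_tip, index_tip, threshold_mm=20.0):
    """Classify a 'pinch': thumb and index fingertips nearly touching."""
    return distance(thumb_tip, index_tip) < threshold_mm
```

Packaging checks like these behind a stable API is exactly the convenience the SDK offers: a developer drops in the gripping mechanic instead of re-deriving thresholds for every app, at the cost of adopting Leap's interaction vocabulary.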
The initial applications for a liquid crystal-based contact lens display might be to help control light transmission for people with damaged irises, or to replace colored contacts, allowing wearers to change the color or pattern on the go. The researchers also imagine these contacts working as adaptable sunglasses. However, since the lenses can display images sent to them wirelessly, the potential is there for these displays to show directions or texts from a smartphone, or ultimately to provide the kind of augmented reality experience predicted in this video.
The Centre of Microsystems Technology’s Ghent University-based team recently announced that they have developed a spherically curved LCD display which can be embedded into contact lenses, the implication being that in the future we could all be streaming films directly onto our eyeballs.
Whilst contact lens displays aren’t an entirely new technology (researchers at the University of Washington tested LED-based lenses on rabbits back in 2011), the real breakthrough made here is the use of LCD displays. Previous LED-based displays limited the content that could be shown to only a few small pixels located in the middle of the lens; the groundbreaking LCD-based technology, however, allows for pixels across the whole surface. This is possible thanks to some very clever development work using thin conductive polymer films integrated into a smooth spherical cell.
Along with the announcement, the researchers also showed off a prototype demonstrating a dollar sign displayed on the curved lens. So far the display is limited to fairly simple patterns, and unfortunately it seems that the image cannot yet be seen by the wearer.