
UX Alert: Is This the Year Natural Computer Interaction Takes Off?

User experience (UX) discussions commonly assume keyboard, mouse or, more recently, touch screen interaction. Arguably, none of these are natural actions. But a raft of new gestural and other "in-the-air" interaction modes being presented at the Consumer Electronics Show (CES) and elsewhere suggests that more natural interaction is rapidly progressing beyond touch, possibly leading to a dramatic reinvention of UX.

One big-picture vision of this possible near-term future comes from Intel, which has been calling the trend "Perceptual Computing" since 2010; similar concepts have been voiced by Advanced Micro Devices (AMD), IBM and others. In September, Intel executive vice president Dadi Perlmutter told a developer group that the next big wave of "natural and intuitive" interaction will rely on modes such as gestures, facial and voice recognition and other sensory controls, along with a level of user awareness beyond what personal computing devices currently possess.

Redefining Platform

Last August, AMD’s Chief Technology Officer, Mark Papermaster, presented a similar view at a personal computing conference, adding that his vision redefines the "platform" to include computational sensors and other devices that invisibly inhabit an entire room. The interaction he describes is similarly fluid, naturally human and intelligent enough to anticipate human needs and desires through sensors, location awareness and interpretation of data such as calendars and behavior.

Both Intel and AMD contend that the pieces are coming together in what Papermaster calls a “tsunami” of new interactive modes that will create new forms of “contextual value” even while computing disappears into the background.

In December, IBM released its seventh annual "5-in-5" predictions of five innovations that will reach the market within five years. This year's crop described ways to use the human senses in what the company called the next big wave in computing, which it named "the era of cognitive systems." The predictions include tactile, texture-like sensations delivered through a smartphone screen; far more capable and widely available machine vision systems; sound detection that can determine why a baby is crying or predict when a landslide will happen; and systems that use tiny sensors to understand smells and tastes.

Leap Motion, Nuance

Two devices from other technology giants — Google’s emerging augmented-reality glasses and Microsoft’s hit gestural controller, Kinect — are also pointing toward this new era. In addition, a number of small companies are releasing powerful, inexpensive and potentially transformational technology that could dramatically change or augment computer interaction.

Image: Leap Motion's highly accurate gestural interaction technology

Gestural control, for instance, is bursting out all over CES. San Francisco-based start-up Leap Motion is demonstrating gestural control that can individually track ten fingers interacting in the air, resolving movements as small as 1/100th of a millimeter, and the company has seeded 12,000 developers with SDKs. The Minority Report-like software/controller product is expected to ship later this year for about US$ 70, and Asus has already announced its intention to integrate Leap Motion technology into selected all-in-ones and notebooks.
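To make the gestural trend concrete: a controller that reports fingertip positions at sub-millimeter precision still leaves it to application developers to turn raw position streams into gestures. The minimal Python sketch below is purely illustrative and is not based on the Leap Motion SDK; every name and threshold in it is hypothetical. It shows only how a simple left-or-right swipe might be inferred from timed fingertip samples.

# Hypothetical sketch (not the Leap Motion SDK): inferring a horizontal
# "swipe" from a stream of fingertip positions, the kind of data a
# sub-millimeter gestural sensor would report many times per second.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class FingertipSample:
    t: float  # time in seconds
    x: float  # millimeters, left/right
    y: float  # millimeters, up/down
    z: float  # millimeters, toward/away from the screen

def detect_swipe(samples: List[FingertipSample],
                 min_distance_mm: float = 80.0,
                 max_duration_s: float = 0.5) -> Optional[str]:
    """Return 'left' or 'right' if the fingertip moved far enough,
    fast enough, along the x axis; otherwise None."""
    if len(samples) < 2:
        return None
    dx = samples[-1].x - samples[0].x
    dt = samples[-1].t - samples[0].t
    if dt <= 0 or dt > max_duration_s:
        return None
    if abs(dx) >= min_distance_mm:
        return "right" if dx > 0 else "left"
    return None

# Example: a fingertip sweeping 100 mm to the right over 0.3 seconds.
stream = [FingertipSample(t=i * 0.03, x=i * 10.0, y=120.0, z=50.0) for i in range(11)]
print(detect_swipe(stream))  # prints "right"

Real gesture pipelines do considerably more (smoothing, per-finger tracking, rejecting accidental motion), but the basic shape of the problem, turning dense position data into discrete user intent, is the same.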

Elliptic Labs, Tobii

Other companies are similarly showing new generations of gestural tech at CES, including Palo Alto-based Elliptic Labs, which has an ultrasound-based system. Samsung is showing TVs that replace remote controls with speech commands and in-the-air gestural motions, and, in a different kind of body-sensing interaction, Tobii Technology from Sweden is showing its consumer-level eye-tracking technology.

Intel's Kirk Skaugen has said that face recognition and voice recognition technologies for much better computer security are on the way. At its development center in Haifa, Israel, Intel is spending heavily on Perceptual Computing technologies that can, as AMD also envisions, become more "context aware." Intel is also offering a million dollars in prizes for developers who come up with intuitively easy-to-use Perceptual Computing software built on its Perceptual Computing SDK.

Interacting with computing devices the way we interact with other humans is a long-held dream, but, as the demonstrations at CES and the enthusiasm of Intel, AMD, IBM and others show, the technology is taking huge new steps this year with powerful and often inexpensive devices. If the momentum continues, we may have to rewrite the textbooks on user experience once again.

 
 
 