The media is saturated with articles detailing new products that purport to change how humans will interact with computers. 

The latest buzz is bots, which may supplant or augment apps as a way of interfacing with computers. With bots, you will be able to speak or chat with an intelligent piece of code rather than enter data into a form or app. Companies including Microsoft, IBM and Facebook have introduced bot frameworks, with more surely to come. 

We know bots won't be the last word. So what comes next? How will we interact with technology in the world of tomorrow?  

To answer this question, we turn to university laboratories.  

Some of the academic world’s cutting-edge research will be rolled out at next month’s CHI 2016 Conference in San Jose, Calif. (‘CHI’ stands for Computer-Human Interaction). According to the conference website, at the event you might “experience a new gesture interface for tablets, learn how developing countries use mobile phones for maternal health, play soccer against someone 3000 miles away, or debate the future of online education.” More than 500 academic papers will be presented during the conference.

Exciting areas of research include new ways of interacting with computing devices through eye tracking, mobile interactions and wearable technology. What follows are just a few of the many studies and papers that will be presented at the conference. Who knows? Someday soon, one of these could be the "latest buzz."

Eye Tracking (or Gaze Paths)

Eye tracking is a process that measures either where a person is looking or the motion of an eye relative to the head. So far, eye tracking has mostly been used in the laboratory to gauge how people interact with computer screens, but researchers are now investigating ways to use it to control technology directly. Here are a few examples:

Gaze-Based Note-Taking for Learning from Lecture Videos

This paper presents a gaze-based system that assists users taking notes while watching lecture videos. The system automatically slows or pauses playback while the user is writing, minimizing the need to control the video by hand.
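
In outline, such a system is a simple control loop. The sketch below is our own illustration, not the authors' implementation; the tracker and player interfaces are assumed:

```python
# A minimal sketch of a gaze-aware lecture player, assuming a hypothetical
# eye tracker that reports which screen region the viewer is looking at,
# and a video player with pause/play/speed controls.

import time

NORMAL_SPEED = 1.0
SLOW_SPEED = 0.5    # slow down rather than stop when gaze is lost

def run_gaze_aware_player(tracker, player, poll_interval=0.1):
    while player.is_open():
        region = tracker.gaze_region()   # e.g. "video", "notes", or None
        if region == "notes":
            player.pause()               # viewer is writing; hold the video
        elif region == "video":
            player.set_speed(NORMAL_SPEED)
            player.play()
        else:
            player.set_speed(SLOW_SPEED) # blink or a glance away
            player.play()
        time.sleep(poll_interval)
```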

Gazed and Confused: Understanding and Designing Shared Gaze for Remote Collaboration

Look into my eye and what do you see? This study investigates how geographically remote collaborators make use of a graphical representation of their partner’s eye gaze during collaborative tasks. The research direction may have far-reaching implications, since people use eye gaze as an important cue for monitoring attention and coordinating awareness.
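
To make that concrete, sharing gaze can be as simple as streaming each partner's gaze coordinates to the other machine and drawing them as a cursor. Here is a minimal sketch, with an assumed tracker interface and a placeholder rendering callback, not APIs from the study:

```python
# Minimal sketch of shared gaze over UDP. The eye-tracker interface and
# the cursor-drawing function are placeholders.

import json
import socket
import time

def broadcast_gaze(tracker, peer_host, peer_port=9999, rate_hz=30):
    """Send this user's normalized (0..1) gaze coordinates to the partner."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    while True:
        x, y = tracker.gaze_point()
        sock.sendto(json.dumps({"x": x, "y": y}).encode(), (peer_host, peer_port))
        time.sleep(1.0 / rate_hz)

def receive_gaze(draw_partner_cursor, port=9999):
    """Receive the partner's gaze and hand it to a rendering callback."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", port))
    while True:
        data, _ = sock.recvfrom(1024)
        point = json.loads(data)
        draw_partner_cursor(point["x"], point["y"])
```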

Mobile Interactions

Pre-Touch Sensing for Mobile Interaction

This study explores a pre-touch modality for interacting with computing devices, using a self-capacitance touchscreen that can sense multiple fingers above a mobile device as well as the grip around the screen’s edges. The resulting interaction techniques illustrate how pre-touch sensing offers an intriguing new channel for mobile interaction.
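
The abstract doesn't spell those techniques out, but one way to picture pre-touch is as a stream of hover events. In the hypothetical handler below, event.height_mm (the finger's distance above the glass) is an assumed field, not an API from the paper:

```python
# Hypothetical pre-touch handler: fade interface controls in as a finger
# approaches the screen, before any contact is made.

APPROACH_THRESHOLD_MM = 30   # begin reacting within ~3 cm of the glass

def on_pre_touch(event, ui):
    if event.height_mm <= APPROACH_THRESHOLD_MM:
        # The closer the finger, the more visible the controls.
        opacity = 1.0 - (event.height_mm / APPROACH_THRESHOLD_MM)
        ui.show_controls(opacity=opacity)
    else:
        ui.hide_controls()
```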

Emergeables: Deformable Displays for Continuous Eyes-Free Mobile Interaction

Emergeables are mobile surfaces that can deform, or ‘morph,’ to provide fully actuated, tangible controls. This study focuses on continuous controls, such as dials and sliders, realized as versatile widgets that can change shape and location depending on the user’s needs.

Wearables

GauntLev: A Wearable to Manipulate Free-Floating Objects

Can you give me a hand? This study introduces a levitation glove that employs acoustic levitation to capture, move, transfer and combine materials. This technology may have important implications for handling dangerous or fragile materials without physical contact.

Serendipity: Finger Gesture Recognition Using an Off-the-Shelf Smartwatch

This study explores the feasibility of using only the motion sensors on everyday wearable devices to detect fine-grained finger gestures. The technique can be deployed today on current smartwatches, and it could be applied to cross-device interactions or used as a research tool in fields involving finger and hand motion.
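
In outline, such a recognizer windows the watch's accelerometer and gyroscope stream, extracts simple features, and feeds them to a classifier. The sketch below uses scikit-learn as a stand-in; the features and model are our assumptions, not the paper's pipeline:

```python
# Sketch of fine-grained gesture recognition from smartwatch motion sensors.
# Each window is an (n, 6) array of [ax, ay, az, gx, gy, gz] samples.

import numpy as np
from sklearn.ensemble import RandomForestClassifier

def window_features(samples):
    return np.concatenate([
        samples.mean(axis=0),                            # average motion
        samples.std(axis=0),                             # jitter per axis
        np.abs(np.diff(samples, axis=0)).mean(axis=0),   # mean jerk
    ])

def train(windows, labels):
    """windows: labeled sensor windows per gesture ("pinch", "tap", ...)."""
    X = np.array([window_features(w) for w in windows])
    clf = RandomForestClassifier(n_estimators=100)
    clf.fit(X, labels)
    return clf

def recognize(clf, live_window):
    return clf.predict([window_features(live_window)])[0]
```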

The potential for many of these technologies in our daily lives and in the workplace is clear, though whether we'll see them in the office any time soon is another matter. But research such as this pushes the boundaries of what we can expect from a hands-on, multi-sensory future with technology. 

Check out the rest of the conference papers here.

Title image "Eye Tracking" (CC BY 2.0) by Dolbs 
