Preface

So, what are “interactive displays”? We define them as displays that not only present visual information on their screens but also sense and understand human actions and accept direct user input. Interactive displays that can “feel” the touch of our fingers are already ubiquitous, especially on mobile devices and all-in-one computers. Now, the addition of human-like perceptual sensing and recognition technologies is enabling a new class of interactive displays and systems that can also “see”, “hear”, and “understand” our actions in the three-dimensional space in front of and around them.

We use multisensory and multimodal interface schemes to comprehend the physical world around us and to communicate with each other in our daily lives, seamlessly combining interaction modalities such as touch, voice, gestures, facial expressions, and eye gaze. If we want human-device interactions to approach the richness of human-human interactions, then we must endow devices with technologies that sense and understand such natural user inputs and activities. The addition of natural human interfaces can thus bring lifelike experiences to human-device interactions.

The ways in which we interact with computers have already gone through a transformation in recent decades, with graphical user interfaces that use a mouse and keyboard as input devices replacing the old command-line interfaces that used text-based inputs. We are now witnessing the next ...
