Interactive Displays: Natural Human-Interface Technologies

Book Description

How we interface and interact with computing, communications, and entertainment devices is undergoing revolutionary change, with natural user inputs based on touch, voice, and vision replacing or augmenting traditional interfaces such as the keyboard, mouse, and joystick. As a result, displays are morphing from one-way interface devices that merely show visual content into two-way interaction devices that provide more engaging and immersive experiences. This book provides in-depth coverage of the technologies, applications, and trends in the rapidly emerging field of interactive displays enabled by natural human interfaces.

Key features:

  • Provides a definitive reference on the touch technologies used in interactive displays, including their advantages, limitations, and future trends.

  • Covers the fundamentals and applications of speech input, processing, and recognition techniques that enable voice-based interactions.

  • Offers a detailed review of emerging vision-based sensing technologies and of user interactions using gestures of the hands, body, face, and eyes.

  • Discusses multimodal natural user interface schemes that intuitively combine touch, voice, and vision for life-like interactions.

  • Examines the requirements and current technology status for realizing "true" 3D immersive and interactive displays.

Table of Contents

    1. Wiley-SID Series in Display Technology
    2. Title Page
    3. Copyright
    4. About the Author
    5. List of Contributors
    6. Series Editor's Foreword
    7. Preface
    8. List of Acronyms
    9. Chapter 1: Senses, Perception, and Natural Human-Interfaces for Interactive Displays
      1. 1.1 Introduction
      2. 1.2 Human Senses and Perception
      3. 1.3 Human Interface Technologies
      4. 1.4 Towards “True” 3D Interactive Displays
      5. 1.5 Summary
      6. References
    10. Chapter 2: Touch Sensing
      1. 2.1 Introduction
      2. 2.2 Introduction to Touch Technologies
      3. 2.3 History of Touch Technologies
      4. 2.4 Capacitive Touch Technologies
      5. 2.5 Resistive Touch Technologies
      6. 2.6 Acoustic Touch Technologies
      7. 2.7 Optical Touch Technologies
      8. 2.8 Embedded Touch Technologies
      9. 2.9 Other Touch Technologies
      10. 2.10 Summary
      11. 2.11 Appendix
      12. References
    11. Chapter 3: Voice in the User Interface
      1. 3.1 Introduction
      2. 3.2 Voice Recognition
      3. 3.3 Deep Neural Networks for Voice Recognition
      4. 3.4 Hardware Optimization
      5. 3.5 Signal Enhancement Techniques for Robust Voice Recognition
      6. 3.6 Voice Biometrics
      7. 3.7 Speech Synthesis
      8. 3.8 Natural Language Understanding
      9. 3.9 Multi-turn Dialog Management
      10. 3.10 Planning and Reasoning
      11. 3.11 Question Answering
      12. 3.12 Distributed Voice Interface Architecture
      13. 3.13 Conclusion
      14. Acknowledgements
      15. References
    12. Chapter 4: Visual Sensing and Gesture Interactions
      1. 4.1 Introduction
      2. 4.2 Imaging Technologies: 2D and 3D
      3. 4.3 Interacting with Gestures
      4. 4.4 Summary
      5. References
    13. Chapter 5: Real-Time 3D Sensing with Structured Light Techniques
      1. 5.1 Introduction
      2. 5.2 Structured Pattern Codifications
      3. 5.3 Structured Light System Calibration
      4. 5.4 Examples of 3D Sensing with DFP Techniques
      5. 5.5 Real-Time 3D Sensing Techniques
      6. 5.6 Real-Time 3D Sensing for Human Computer Interaction Applications
      7. 5.7 Some Recent Advancements
      8. 5.8 Summary
      9. Acknowledgements
      10. References
    14. Chapter 6: Real-Time Stereo 3D Imaging Techniques
      1. 6.1 Introduction
      2. 6.2 Background
      3. 6.3 Structure of Stereo Correspondence Algorithms
      4. 6.4 Categorization of Characteristics
      5. 6.5 Categorization of Implementation Platform
      6. 6.6 Conclusion
      7. References
    15. Chapter 7: Time-of-Flight 3D-Imaging Techniques
      1. 7.1 Introduction
      2. 7.2 Time-of-Flight 3D Sensing
      3. 7.3 Pulsed Time-of-Flight Method
      4. 7.4 Continuous Time-of-Flight Method
      5. 7.5 Calculations
      6. 7.6 Accuracy
      7. 7.7 Limitations and Improvements
      8. 7.8 Time-of-Flight Camera Components
      9. 7.9 Typical Values
      10. 7.10 Current State of the Art
      11. 7.11 Conclusion
      12. References
    16. Chapter 8: Eye Gaze Tracking
      1. 8.1 Introduction and Motivation
      2. 8.2 The Eyes
      3. 8.3 Eye Trackers
      4. 8.4 Objections and Obstacles
      5. 8.5 Eye Gaze Interaction Research
      6. 8.6 Gaze Pointing
      7. 8.7 Gaze Gestures
      8. 8.8 Gaze as Context
      9. 8.9 Outlook
      10. References
    17. Chapter 9: Multimodal Input for Perceptual User Interfaces
      1. 9.1 Introduction
      2. 9.2 Multimodal Interaction Types
      3. 9.3 Multimodal Interfaces
      4. 9.4 Multimodal Integration Strategies
      5. 9.5 Usability Issues with Multimodal Interaction
      6. 9.6 Conclusion
      7. References
    18. Chapter 10: Multimodal Interaction in Biometrics: Technological and Usability Challenges
      1. 10.1 Introduction
      2. 10.2 Anatomy of the Mobile Biometry Platform
      3. 10.3 Case Study: Usability Study for the Visually Impaired
      4. 10.4 Discussions and Conclusions
      5. Acknowledgements
      6. References
    19. Chapter 11: Towards “True” 3D Interactive Displays
      1. 11.1 Introduction
      2. 11.2 The Origins of Biological Vision
      3. 11.3 Light Field Imaging
      4. 11.4 Towards “True” 3D Visual Displays
      5. 11.5 Interacting with Visual Content on a 3D Display
      6. 11.6 Summary
      7. References
    20. Index
    21. End User License Agreement