Preface

What Is This Book About?

From the keyboard, mouse, and touchscreen to voice-enabled assistants and virtual reality, we have never had more ways to interact with technology. Called modes, these channels allow people to provide input to and receive output from their devices. These inputs and outputs are often designed together in sets to create cohesive user interfaces (UIs). These modes reflect the way our senses, cognitive functions, and motor skills also work together in sets called modalities. Human modalities have existed for far longer than our interface modes, and they enable us to interact with the physical world. Our devices are only beginning to catch up to us. We can now jump and move around in our living rooms to play a game using Microsoft’s motion-tracking peripheral, Kinect. We can ask Domino’s to deliver a pizza using the Amazon Echo.

We often use several modalities together in our daily activities, and when our devices can do the same, they are considered multimodal UIs. Most UIs are already multimodal, but because they’re so familiar we don’t tend to think of them that way. In fact, almost all designed products and environments are multimodal. We see a door and knock on it, waiting for it to open or to hear someone inside ask who it is. We use our fingers to type on a keyboard and see characters appear on the screen in front of our eyes. We ask Siri a question and see an oscilloscope-like waveform that lets us know we are being heard. We receive a phone call and feel the vibration, hear the ringtone, and see the caller’s name on the screen in front of us. We play a video game and are immersed in sensory information from the screen, the speakers, and the rumbling controller in our hands.

Multimodal products blend different interface modes together cohesively. They allow us to experience technology the same way we experience our everyday lives: across our senses. Good multimodal design helps us stay focused on what we are doing. Bad multimodal design distracts us with clumsy or disjointed interactions and irrelevant information. It pulls us out of our personal experience in ways that are at best irritating and at worst dangerous.

As technology is incorporated into more contexts and activities in our lives, new types of interfaces are rapidly emerging. Product designers and their teams are challenged to blend modalities in new combinations for new products in emerging categories. They are being asked to add new modalities to the growing number of devices we use every day. This book provides these teams with an approach to designing multimodal interactions. It describes the human factors of multimodal experiences, starting with the senses and how we use them to interact with both physical and digital information. The book then explores the opportunities represented by different kinds of multimodal products and the methodologies for developing and designing them. Following this approach will help you develop multimodal experiences for your users and deliver successful products that earn trust, fulfill needs, and inspire delight.

Who Should Read This Book

This book is for designers who are developing or transforming products with new interface modes, and for those who want to. It will extend your knowledge, skills, and process beyond screen-based approaches and into the next wave of devices and technologies. The book also helps teams that are integrating products across broader physical environments and behaviors. The senses and cognition are the foundation of all human experience, and understanding them will help you blend physical and digital contexts and activities successfully. Ultimately, this book is for anyone who wants to create better products and services. Our senses are the gateway to the richness, variety, delight, and meaning in our lives. Knowing how they work is key to delivering great experiences.

How This Book Is Organized

This book is organized into two parts. Part I covers the human sensory abilities, how they function, and how we use them to interact with both the physical world and with technology. It also describes the ways technology fits with human senses in new interface modes. Part II sets out the flexible process and methodology of multimodal design. Starting with product definition, it explains how to identify and assess possibilities for innovation. From there, it describes the considerations, activities, and deliverables that take a team from concept to launch. Sprinkled throughout the book are short sections about relevant products and technologies.

Part I: New Human Factors

  • Chapter 1 describes how sensing, whether by humans or devices, turns physical events into useful information. It introduces modalities and multimodalities and how they shape human experience, and explains the difference between human modalities and device modes and how together they become interfaces. Finally, it looks at the new human factors of sensing, understanding, deciding, and acting: the experiential building blocks for designing any kind of product or service.

  • Chapter 2 delves further into the building blocks of experience and how they relate to more familiar design concepts like affordances and mental models. The chapter looks at how they are useful for understanding human experience and how they are applicable to multimodal design.

  • Chapter 3 looks at how our senses evolved to perceive the diverse types of matter and energy in the world around us. Designing interfaces requires an understanding of the user’s senses and the powers, limitations, characteristics, and expectations that each sense carries.

  • Chapter 4 is about how cognition is organized by schemas, which let us parse and analyze sensory information and then understand and learn from it.

  • Chapter 5 is about how our physical form and abilities shape our interactions, and the considerations they raise in product design.

  • Chapter 6 digs into modalities and multimodalities: specific patterns of perception, cognition, and action that enable our behaviors. It introduces a few rules of thumb for designers creating multimodal interactions, such as respecting cognitive load, supporting focus, maintaining flow, and allowing feedback and validation.

Part II: Multimodal Design

  • Chapter 7 explains how to identify opportunities for innovation by assessing user needs and contexts and reframing current products and technologies.

  • Chapter 8 looks at cues, affordances, feedback, feedforward, and prompts: the palette of multimodal interactions. These elements of experience describe how people use physical information within an interaction.

  • Chapter 9 explores ways to use maps and models to frame opportunities, contextualize insights, and align project efforts to create effective multimodal experiences. It builds on existing deliverables like customer journeys and ecosystem maps, and introduces new ones like focus models.

  • Chapter 10 describes the interplay of physical design and technology capabilities during product development. It emphasizes the need to map interface modes to the required modalities within user behaviors, and compare different mappings across products.

  • Chapter 11 distinguishes layers of context as ecosystems and looks at how they affect product usage. It includes four types of ecosystems: information, physical, social, and device.

  • Chapter 12 encourages designers to think of the entire design process as prototyping. It describes deliverables that can be used to specify product characteristics and usage behaviors.

  • Chapter 13 describes different ways that products can be released and identifies how teams can minimize risk, maximize learning, and increase the chances of a successful product.

Why Write a Book About Multimodal Design?

In the last few years, the development of interface technologies has accelerated, causing an explosion of new products and fueling the Internet of Things. Many new subdisciplines in design have emerged as a result: gestural interfaces like Kinect and Leap Motion, voice user interfaces (VUIs) like Siri and Amazon Echo, and virtual reality (VR) and augmented reality (AR) products like the Oculus Rift and Apple’s ARKit. The rising use of sensors has also enabled automated interactions: wearables can detect whether you are biking or walking, doors can unlock based on proximity, and the Nest thermostat can set your home to a comfortable temperature, not to mention driverless cars and consumer robotics. Designers and product teams are challenged to create all kinds of new experiences with these technologies. So far, this work has largely followed a technology-centric approach to creating experiences, driven by whatever the new technology can do.

Design, however, works best as a user-centered discipline, focused on what people can do. And the one facet of user experience that connects all these new types of interactions is the senses. It turns out that sensory experience is key to human experience. For all the bits in the world, we can only understand the information they hold when it is translated into physical information: a vibration we can feel with our skin, a sound we hear with our ears, or a pixel we see with our eyes. Our lives are rooted in our senses. We enjoy a dazzling sunset, beautiful music, and delicious food. We see the smile on the face of a loved one. We smile back to share our happiness. We experience joy and empathy through our senses. We understand the way the world works through them as well. If we are called user experience designers, then we should understand how people experience things in the first place. And human experience starts with the senses.

Acknowledgments

Apparently it takes a village to write a book. It’s impossible to name everyone who offered insight, ideas, and encouragement along the way. We’d like to especially thank Wes Yun, Cecil Odom, Mark Blanchard, and Kelly Goto for riffing with us—and helping us tell the difference between crazy good ideas and crazy bad ideas.

We’d like to thank our reviewers, Frances Close and Christy Ennis-Kloote, for their patience and clarity.

Christine would like to thank her grandmother, Chung Ja Kim, and aunt, Sanghee Hui, whose strength and intelligence have provided a lifetime of inspiration, and who make her laugh so hard she cries.

John would like to thank his mother, Barbara, for her overall gumption and joie de vivre, and Paul Cuneo for being so supportive.

We’d both like to thank the extended O’Reilly team including our editors, Angela Rufino, Melanie Yarbrough, Jasmine Kwityn, and Mary Treseler; and our agent Peter McGuigan.

Very special thanks to our dog Barnaby who kept our feet warm while we were writing and reminded us to go outside.
