Tuesday, February 26, 2008

Applied Empathy: A Design Framework for Human Needs and Desires

Building empathy into UX ...

"Part Three: Real-World Applications

Part One of this series, Applied Empathy, introduced a design framework for meeting human needs and desires and defined five States of Being that represent the different degrees to which products and experiences affect and motivate people in their lives. Part Two explained the three Dimensions of Human Behavior and outlined a variety of specific needs and desires for which we can intentionally design products. This third and final part of the series shows how this design framework maps to a variety of well-known products and experiences and illustrates how this framework can be put to practical use.
Mapping the Framework to Digital Products

It is no accident that user experience and experience design originated with and matured from software development: it is only through digital products and experiences that we can satisfy all three Dimensions of Human Behavior, both deeply and simultaneously. Software has a unique ability to incorporate both analytical and emotional hooks into virtually any physical activity, in a way that is typically difficult—and often impossible—in the analog world. This ability helps account for both the tremendous financial success and the cultural growth of computing lifestyles since the mainstreaming of the personal computer, which was greatly accelerated by the invention and subsequent ubiquity of the Internet. Digital technology has unlocked the potential of this intriguing triangulation of the Analytical, Emotional, and Physical—never before satisfied so fully in the human condition—which explains why the most celebrated and successful products of recent years tend to skew toward the digital realm. For this reason, I will use two popular digital products as mapping examples.

Let’s start with the Nintendo Wii. A surprise hit when Nintendo released it in the fall of 2006, the Wii introduced new human/computer interactions in the context of a video game system. Previously, most video games required people to move a joystick to create movement and push buttons to initiate action. These modes of input mapped to in-game actions—such as running and jumping—that have no real-world connection to a joystick and buttons. The Wii, using motion-sensing technology, transformed the joystick-and-buttons paradigm into an interaction model in which players can behave in approximately the same ways when playing games as they would if they were engaging in the same activity in real life.

The best-known examples are from the Wii Sports game, which comes bundled with the game system. When bowling, the game allows—and even implicitly encourages—players to walk up and approach the lane, then swing their arms as they would if they were actually bowling. These behaviors are then reflected on screen. While children and hard-core gamers quickly figured out that all players actually need to do is flick their wrists in the correct way—a literal mapping of exact physical movements to their on-screen performance is not necessary—the game nonetheless removes the inhuman interface from the experience. Bowling on the Wii can feel remarkably like bowling in a real-life bowling alley. The result of this innovation is that the Wii has transformed Nintendo from a moribund kiddie video game platform, deep in the shadow of Sony and Microsoft, into the runaway hit of 2007. The Wii is such a popular product that—more than a year since its release—retailers still have difficulty keeping it in stock."    (Continued via UXmatters)    [Usability Resources]
