Monday, April 28, 2008

Computers for the people

Designing that ideal mobile interface ...

"Designing a user interface for a mobile computer isn't hard; all you have to do is think like a person.

Sounds simple, but it's taken a long time for that realization to set in, said Stu Card, manager of the user interface group at the famed Palo Alto Research Center. Card joined fellow researcher Ted Selker of MIT's Media Lab at Sofcon 2008 to discuss human interfaces for mobile computers, and just how differently engineers have to treat these devices compared with their older PC siblings.

PCs weren't necessarily designed for end users in the early days. They were designed for developers to create applications, or corporations to make their workers more productive. But mobile computers, whether they are smartphones, mobile Internet devices, or whatever, are fundamentally different; they're with us at all times and are used on the go, not as stationary, sedentary terminals. And they are used as social devices, whether that's planning a get-together with friends, taking pictures at the party, or as the ultimate arbiter of extremely important barroom arguments such as who had the most home runs for the 1993 New York Mets (Bobby Bonilla).

Card focused on the look and feel of the software that accompanies smartphones. He used Apple's iPhone as his example, and examined how the iPhone was designed according to four different human factors: social, rational, cognitive, and biological. The different factors represent the amount of time one spends on a task or problem; you might take a second to page through a library of pictures, but spend months or years developing a network of friends.

"Mobile computing is much more intimately tied to a user's life. You need to design simultaneously on at least four levels, and functional design is not the only requirement," Card said.

Apple made the breakthrough it did with the iPhone because it came up with ways of interacting with the device that make sense on biological and cognitive levels, Card said. Translated, that means the iPhone plays well to natural perceptual and motor skills, as well as our desire for immediacy.

For example, the notion of finger gestures as the primary control is much more intuitive than navigating through a series of menus, and makes the device more intimate. And Apple's groundbreaking decision to put the browser first and the keypad second makes browsing much easier and more compelling than on other mobile devices.

As you move to the higher levels of mobile computing--the rational (problem-solving) and social (in short, event planning)--the computer itself takes on the role of a sensor, Selker said. "(It's about) using sensors and virtual sensors to understand and respect human intention."

Selker created the ThinkPad's TrackPoint for IBM, and has been working on human-facing design for years. Right now, he's working on adding sensors and computing capabilities to all kinds of commonly used devices, from bike helmets to toy pets.

The idea is that anything can be a sensor, and anything can take input from the world and provide feedback to the user. This sounds like a key part of the future development of mobile phones, where phones change from two-way voice and data communication devices into ones that capture and analyze all kinds of data, such as location, weather, and even mood."

(Continued via CNET)

An interface that can recognize the user's mood based on eye movement, and react accordingly. - Usability, User Interface Design

