Monday, April 06, 2009

History of Human-Computer Interaction (HCI)

Jack Carroll on the history of HCI ...

"Human-computer interaction (HCI) is an area of research and practice that emerged in the early 1980s, initially as a specialty area in computer science. HCI has expanded rapidly and steadily for three decades, attracting professionals from many other disciplines and incorporating diverse concepts and approaches. To a considerable extent, HCI now aggregates a collection of semi-distinct fields of research and practice in human-centered informatics. However, the continuing synthesis of disparate conceptions and approaches to science and practice in HCI has produced a dramatic example of how different epistemologies and paradigms can be reconciled and integrated.

Where HCI came from

Until the late 1970s, the only humans who interacted with computers were information technology professionals and dedicated hobbyists. This changed disruptively with the emergence of personal computing around 1980. Personal computing, including both personal software (productivity applications, such as text editors and spreadsheets, and interactive computer games) and personal computer platforms (operating systems, programming languages, and hardware), made everyone in the developed world a potential computer user, and vividly highlighted the deficiencies of computers with respect to usability for those who wanted to use computers as tools.

The challenge of personal computing became manifest at an opportune time. The broad project of cognitive science, which incorporated cognitive psychology, artificial intelligence, linguistics, cognitive anthropology, and the philosophy of mind, had formed at the end of the 1970s. Part of the programme of cognitive science was to articulate systematic and scientifically-informed applications to be known as "cognitive engineering". Thus, at just the point when personal computing presented the practical need for HCI, cognitive science presented people, concepts, skills, and a vision for addressing such needs. HCI was one of the first examples of cognitive engineering.

Other historically fortuitous developments contributed to the establishment of HCI. Software engineering, mired in unmanageable software complexity in the 1970s, was starting to focus on nonfunctional requirements, including usability and maintainability, and on non-linear software development processes that relied heavily on testing. Computer graphics and information retrieval had emerged in the 1970s, and rapidly came to recognize that interactive systems were the key to progressing beyond early achievements. All these threads of development in computer science pointed to the same conclusion: The way forward for computing entailed understanding and better empowering users.

Finally, human factors engineering, which had developed many techniques for empirical analysis of human-system interactions in so-called control domains such as aviation and manufacturing, came to see HCI as a valuable and challenging domain in which human operators regularly exerted greater problem-solving discretion. These forces of need and opportunity converged around 1980, focusing a huge burst of human energy, and creating a highly visible interdisciplinary project.

From cabal to community

The original and abiding technical focus of HCI is on the concept of usability. This concept was originally articulated naively in the slogan "easy to learn, easy to use". The blunt simplicity of this conceptualization gave HCI an edgy and prominent identity in computing. It served to hold the field together, and to help it influence computer science and technology development more broadly and effectively. However, inside HCI the concept of usability has been reconstructed continually, and has become increasingly rich and intriguingly problematic. Usability now often subsumes qualities like fun, well-being, collective efficacy, aesthetic tension, enhanced creativity, support for human development, and many others. A more dynamic view of usability is that of a programmatic objective that should continue to develop as our ability to reach further toward it improves.

Although the original academic home for HCI was computer science, and its original focus was on personal productivity applications, mainly text editing and spreadsheets, the field has constantly diversified and outgrown all boundaries. It quickly expanded to encompass visualization, information systems, collaborative systems, the system development process, and many areas of design. HCI is taught now in many departments/faculties that address information technology, including psychology, design, communication studies, cognitive science, information science, science and technology studies, geographical sciences, management information systems, and industrial, manufacturing, and systems engineering. HCI research and practice draws upon and integrates all of these perspectives.

A result of this growth is that HCI is now less singularly focused with respect to core concepts and methods, problem areas, and assumptions about infrastructures, applications, and types of users. Indeed, it no longer makes sense to regard HCI as a specialty of computer science; HCI has grown to be broader, larger and much more diverse than computer science. It expanded from individual and generic user behavior to include social and organizational computing, creativity, and accessibility for the elderly, the cognitively impaired, and for all people. It expanded from desktop office applications to include games, e-learning, e-commerce, military systems, and process control. It expanded from early graphical user interfaces to include myriad interaction techniques and devices, multi-modal interactions, and a host of emerging ubiquitous, handheld and context-aware interactions.

There is no unified concept of an HCI professional. In the 1980s, people often contrasted the cognitive science side of HCI with the software tools and user interface side of HCI. The HCI landscape is far more differentiated and complex now. HCI academic programs train many different types of professionals: user experience designers, interaction designers, user interface designers, application designers, usability engineers, user interface developers, application developers, technical communicators/online information designers, and more. And indeed, many of the sub-communities of HCI are themselves quite diverse. For example, ubiquitous computing (aka ubicomp) is a subarea of HCI, but it is also a superordinate area integrating several mutually diverse subareas (e.g., mobile computing, geo-spatial information systems, in-vehicle systems, community informatics, distributed systems, handhelds, wearable devices, ambient intelligence, sensor networks, and specialized views of usability evaluation, programming tools and techniques, application infrastructures, etc.). The relationship between ubiquitous computing and HCI is becoming paradigmatic: HCI is the name for a community of communities.

In the early 1980s, HCI was a small and focused specialty area. It was a cabal trying to establish what was then a heretical view of computing. Today, largely due to the success of that endeavor, HCI is a vast and multifaceted community, loosely bound by the evolving concept of usability, and the integrating commitment to value human concerns as the primary consideration in creating interactive systems."    (Continued via Interaction-design.org, Jack Carroll, InfoDesign, User Experience Network)    [Usability Resources]
