Although its individual features weren't new, the Mac offered integration, the expectation of a GUI, and interface consistency.
The Macintosh was introduced on January 24, 1984. In fact, the Mac was originally manufactured in the Fremont, California building that now houses Nielsen Norman Group.
The Mac didn't pioneer any individual user-interface innovation. Its most prominent feature, the mouse, had been invented by Doug Engelbart in 1968. That the mouse took 16 years to move from the lab to popular use is a striking example of how slowly things move in the tech business — particularly when it comes to getting divergent designs into widespread use.
(Admittedly, the original mouse was not especially appealing: As I have experienced first-hand, the initial model was a heavy brick with an awkward-to-push button.)
The Mac's graphical user interface — characterized by windows, icons, menus, and a user-controlled pointer (that is, WIMP) — was also not new.
Before the Mac, my GUI projects used a PERQ workstation from Three Rivers Computer Corporation. Among other things, we conducted user testing to find the best mental model for controlling the display when there was more information than a single screen could hold. Our findings? To view additional content in a long document, people think of a "down" operation, so a downward-pointing arrow is the best choice. This is unlikely to surprise today's scrollbar users, but because the screen image actually moves up when users scroll toward the end of a document, the study's outcome wasn't obvious in advance.
GUI guidelines are now well established, and modern application designers can simply follow existing best practices. But all these guidelines had to be discovered through early experiments with graphical interactions. This early UI research happened at PARC and other places; some of it even at Apple, in the Lisa project.
Going beyond such research, the Mac offered 3 breakthroughs:
* The features were integrated: Users got them all in one package, rather than having to accumulate far-flung innovations. This was a case where the whole was much greater than the sum of its previously scattered parts.
* The GUI was the platform's expected foundation, rather than an optional add-on. In fact, early Macs didn't even have cursor keys, so applications had to be mouse-driven — and a mouse shipped as standard with every Mac. Although users could buy mice for many other computers (Microsoft's mouse was launched the year before the Mac), most of their apps remained character-based for years because the GUI wasn't the expected UI and designers couldn't rely on users having a mouse.
* It created a human-interface standard that independent software vendors had to follow in order to have their applications deemed "Mac-like." Because the resulting consistency reduced the learning burden for new applications, users were willing to buy more software. And indeed, Mac users purchased about two more applications per computer than DOS users did.
As is often the case, pure innovation was less important than making the new stuff work well.
Triumph or Defeat for Usability?
During its first decade, the Mac offered clearly superior usability compared to competing personal computer platforms (DOS, Windows, OS/2). Not until Windows 95 did the PC start to approximate Mac-level usability.
Despite this Mac advantage, PCs have sold vastly better in every single year since 1984, and the Mac has never exceeded single-digit market share.
The Mac's miserable marketplace performance seems to pose a strong argument against usability. Why bother, if it doesn't sell?
The counter-argument is that usability is the only reason Mac survived. Compared to the PC, it was much more expensive, had only a fraction of the specialized applications, and was cursed by Apple's business-hostile attitude.
So why would anyone pay more for less? Because Macs were easier to use.
Even so, the Mac's modest commercial success emphasizes the importance of the total user experience: The PC had more specialized applications and a broader support ecosystem. Price also matters tremendously. In the 1980s, Macs were more expensive (and thus sold less) because of their fancy high-resolution display.
Today, there's no conflict between cost and usability. For websites, it's often cheaper to design for higher usability, because doing so emphasizes simplicity and interaction standards over bloat, made-up menu items, and dialog elements. For software applications, it often costs the same to design something good as to design something bad. (And usability's ROI is high, especially for websites, which users simply leave if the UI is difficult.)
For sure, given usability research's laughably low cost relative to any serious development budget, there's no excuse not to find out what works for your customers. Once you know, it usually doesn't cost any more to implement usability findings than it does to invent something that won't work as well. And, because usability integrates just fine with Agile development methods, it won't delay your launch, either. (Continued via Jakob Nielsen's Alertbox)