
Folk Ergonomics: or, it all Fitts


When I was a poor civil servant (plus ça change) and part-time graduate student I longed to own a Mac. I’d read everything available about them, and nothing I read did anything to dissuade me. What I wanted, as well as the crisp, typographic display and the integration between the applications, was the windowing system. I already knew, somehow, that a Proper Computer™ would be able to show more than one program on the screen at once and let you move between them. Lacking the funds to buy the current model Apple was offering, a Mac SE, I made do with an Atari 520 STFM. This was a strange machine with a dual personality: a games machine with aspirations to being a workstation, which used the non-broken version of Digital Research’s GEM windowing environment. GEM had been shamelessly copied from the Mac interface, so much so that Apple sued Digital Research and got an injunction that prevented later versions of GEM (used on DOS machines, notably those from Amstrad) from using overlapping windows, drive icons on the desktop, and other elements of Mac window manager furniture. Apple didn’t cite Atari in its suit, so on the ST machines GEM still kept its original, highly Mac-like functionality. It didn’t multi-task, but even so one program could show more than one window at once. On the desktop, it had multiple resizable and overlapping windows onto the file system, just like the Mac. It had WYSIWYG screen display and printing, a menu bar across the top of the screen, a crisp monochrome display, and was so much like a cut-price Mac that it was sometimes called - after Jack Tramiel, the Atari founder - the Jackintosh.
Later, when I was working in a large accounting firm, I was issued a Compaq Deskpro 386 and acquired a copy of Windows/386. This could run several Windows applications at once (even though they were painfully ugly and crude by later standards) and multitask DOS applications, all in separate windows which could overlap each other and be moved around and resized any way you wanted. Then came Windows 3.0, and all the versions after. When I worked at the University of Cambridge I spent lots of time on UNIX computers, using X Windows and a variety of window managers - twm, the basic one; Motif, the Windows-like one; and my then-favorite, fvwm, which gave you multiple virtual desktops on the same screen and let you flick between them with a simple keyboard command. Along the way I picked up some more powerful Macs at home too, running System 6 and System 7, which almost allowed proper multi-tasking of applications and their windows.
All this time screens were getting bigger and higher in resolution, meaning the area available to run lots of programs at once was increasing. And there was a natural hierarchy of attention which translated reasonably well into the vertical stacking order and screen space allotted to programs and their windows. The main thing you were working on was at the top and had the largest window, such as a text editor or word processor or spreadsheet; then behind that and off to one side would be your mail application, so you could see new mail arrive, scan its sender and subject without pulling its window to the front, and decide if you needed to read it right now or carry on working. Over on one side in a tiny window would be a chat program, either instant messaging or an IRC client, sized so you could keep an eye on the conversation without having to focus exclusively on it. A couple of terminal windows for fiddling with stuff on remote servers, a call tracking system way in the background or even relegated to another virtual screen, and the odd utility were all shuffled in a loose stack, arranged so each got the screen space and attention it merited and no more.
This is the way I have worked for well over a decade, as have all my peers. It just seemed obvious, intuitive and natural. But when it comes to the mass use of GUIs, we’re the minority.
When users began to get Windows on the desktop, the screens were pretty low resolution - 640 pixels across by 480 high. There wasn’t really room to get more than one application’s window on the screen at once. Even the next generation of screens, at 800 pixels by 600, was only good for showing more of your word processing document or your spreadsheet. So it was only natural that most users reacted to multi-tasking, windowing operating systems by using them as full-screen task switchers, much like the short-lived DOS application managers - TopView and DESQview - that preceded Windows.
Since TFT display technology became affordable, with transistor-based panels replacing cathode ray tubes on the desktop, users’ screens have followed the trend (described by Gordon Moore, co-founder of Intel) of geometrically increasing transistor density for least cost. This means that a single-task user such as a word processing worker now commands a screen of maybe 1600 by 1200 pixels, which is overkill even if you turn on all of Microsoft Word’s toolbars, palettes and floating interface gadgets. To me, it’s absolutely crazy to use all that area - two megapixels - to display just one application’s window. But that’s exactly what most users do.
That’s not to say that most users just use one application, zoomed out to full screen size. They use many applications, but each one’s window is maximized. Typically they use the Windows Taskbar to track the open windows, and switch between them by pressing each window’s Taskbar button with the mouse. This produces absurd situations, for example: a user composing a short e-mail in Outlook or a similar program finds the whole text of their message becomes one very long line running from one side of the enormous message window to the other. That’s not a comfortable way to write, especially if you review it before you send it, and if the recipient is reading mail full-screen too it’s equally difficult to read. I’ve seen so many people, smart people, do this that it can’t be wholly due to lack of training, or incuriosity about the possibilities of the user interface. There must be a benefit, but for the longest time I had no idea what it might be. Then the other day it came to me in a 4 AM epiphany (I live high up on the Upper Upper West Side of Manhattan, which is very noisy indeed. Being woken up at 4 AM is normal). In order to explain what I realized, I have to talk about a bit of Human-Computer Interaction (HCI) science.
With command-line driven systems, ergonomics is all about making the commands memorable, short and consistent in their syntax and behavior. With GUI systems, it’s about making elements of the interface obvious and easily located, and putting them in the best place for users to find and use them. A very important model for studying this field is Fitts’s Law, which predicts the time needed to move to a target area on the screen as a function of the distance to the target and the target’s size. The actual formula for the law is up at the top of this article. The bottom line for our purposes today is that if you make targets wider, people find them easier to hit. A corollary of Fitts’s Law is that things you want to click on should be sized in proportion to their importance or frequency of use.
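For readers who can’t see the image at the top of the page (which may use slightly different notation), the commonly quoted Shannon formulation of Fitts’s Law is:

\[ MT = a + b \log_2\!\left(\frac{D}{W} + 1\right) \]

where MT is the time to reach the target, D is the distance from the starting point to the center of the target, W is the target’s width measured along the axis of motion, and a and b are constants measured empirically for a given pointing device. The logarithmic term is the index of difficulty: under this formulation, doubling a target’s width reduces the difficulty by exactly as much as halving the distance to it.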
This doesn’t, at first glance, seem to have any relevance to the reason Windows users maximize their windows. There is one benefit to maximizing the window that doesn’t have any connection to Fitts’s Law, but is connected with how we learn to do things. Tasks involving hand-eye coordination are easier to learn if the target is in the same place every single time. Recall that every window on the Windows desktop has its own menu bar, at the top of each window. In the “Power User” Windows desktop approach, with its overlapping, variously sized windows, the user has to accurately hit the menu item they want in each individual window. Power Users don’t tend to find this difficult because a) over many years of using GUIs they’ve become horribly adept at accurately slinging the mouse anywhere at all on the screen with pinpoint precision, and b) they make enormous use of keyboard shortcuts instead of menus to accomplish the same tasks a casual user handles with the mouse. However, casual users tend to be mouse-centric, either because they were only trained to use the mouse and menus, or because that’s the primary way to explore the system (experience of earlier computer systems has scared them off experimentally pressing keys to learn what happens). Let a casual user use your Power User Windows or Motif desktop and they’ll uncomfortably hunt around the menus, missing the one they want, until you let them click the Maximize button. As soon as they do this, the ubiquitous File and Edit menus, where 80% of the work gets done in most users’ computing lives, jump to the same location they have in every single window. Now the user can nail them with ease and fluency.
The relevance of Fitts’s Law to maximized windows is this. I said above that things should be wide in proportion to how much they get used. On a Power User desktop, the menus are always pretty narrow and hard for a non-expert to hit reliably. Maximizing the window doesn’t make the actual physical size of the menus any greater. However, the edges of the screen are treated specially by most GUIs. Assuming you only have one screen, if you flick the mouse up so the mouse pointer goes to the top of the screen, and keep shoving the mouse upwards even though the pointer is already at the top, the pointer stays where it is. This makes the menu bar semi-infinite in depth: in Fitts’s terms, the target dimension along the direction of travel becomes effectively unbounded. You can dispense with fine motor control and slow precise movements when you’re aiming for the top of the screen. You want to save this file quickly? You can whack the mouse up to the File menu, pull it down and hit Save. The speed with which you can acquire the File menu makes a great difference to the perceived ease of use.
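To put rough numbers on that (this worked example is mine, not from the original post; the pixel distances and the constants a and b are invented purely for illustration), here is a small Python sketch comparing a 20-pixel-deep menu floating in the middle of the screen with the same menu pinned to the screen edge, where overshooting is impossible and the effective target depth becomes very large:

import math

def movement_time(distance_px, width_px, a=0.1, b=0.15):
    # Fitts's Law, Shannon formulation: MT = a + b * log2(D/W + 1).
    # a and b are placeholder device constants (in seconds); real values
    # would have to be measured for an actual mouse and user.
    return a + b * math.log2(distance_px / width_px + 1)

# A 20-pixel-deep menu sitting at the top of a window in the middle of the
# screen: the pointer has to stop precisely inside those 20 pixels.
floating_menu = movement_time(distance_px=600, width_px=20)

# The same menu pinned to the top edge of the screen: the pointer is stopped
# by the edge, so the effective target depth is enormous (2000 "virtual"
# pixels here, standing in for "semi-infinite").
edge_menu = movement_time(distance_px=600, width_px=2000)

print(f"floating menu: {floating_menu:.2f} s, edge menu: {edge_menu:.2f} s")

The particular numbers mean nothing, since the constants are made up; the point is that the edge collapses the index of difficulty, turning a precision task into a flick.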
This is exactly why the Macintosh user interface has always had only one menu bar, no matter how many applications may be running. The early Mac OS designers realized that consistency of positioning and hospitality to overshooting the mouse were going to be crucial to a usable GUI. One of the better-known interface engineers who worked at Apple for a long time, Bruce “Tog” Tognazzini, has written extensively about the use of Fitts’s Law in the design of the Mac UI.
What’s quite wonderful is that I don’t think users are consciously choosing to maximize the windows in order to give themselves this consistency and hospitality. It’s something they’ve done intuitively, the spatial and visual parts of their brain prodding the consciousness into giving the poor overworked centers of proprioception and coordination a break. Windows users are unconsciously adding back to Windows one of the important elements of the Mac UI that Microsoft failed to copy. 

Comments

Anonymous said…
Leaving aside the fact that for 'all Windows users' you may mean to write 'all Windows users who aren't power users', this should logically make you a fan of the menu-less approach of the Office 2007 ribbon...
Simon said…
@marypcb: I don't seem to have written the phrase you quote.

re Office 2007: I do partially applaud it, yes, for the Fittsiness. I can't comment on its use in an everyday situation because MS didn't include it in the one Windows app I use often (Project). Naturally they haven't ported that interface style to Office 2008 as that would be an incredible collision of UI grammars.
