AN OPEN LETTER TO JAKOB NIELSEN
[For those not subscribing to CACM, Jakob Nielsen and I have come down
on opposite sides of a usability debate. Jakob believes that the
prevalence of bad design on the Web is an indication that the current
method of designing Web sites is not working and should be replaced or
augmented with a single set of design conventions. I believe that the
Web is an adaptive system and that the prevalence of bad design is how
evolving systems work.
Jakob's ideas are laid out in "User Interface Directions for the Web",
CACM, January 1999.
My ideas are laid out in "View Source... Lessons from the Web's
Massively Parallel Development", networker, December 1998.
Further dialogue between the two of us is in the Letters section of the
March 1999 CACM.]
I read your response to my CACM letter with great interest, and while
I still disagree, I think I better understand the disagreement, and
will try to set out my side of the argument in this letter. Let me
preface all of this by noting what we agree on: the Web is host to
some hideous dreck; things would be better for users if Web designers
made usability more of a priority; and there are some basics of
interface usability that one violates at one's peril.
Where we disagree, however, is on both attitude and method - for you,
every Web site is a piece of software first and foremost, and
therefore in need of a uniform set of UI conventions, while for me, a
Web site's function is something only determined by its designers and
users - function is as function does. I think it presumptuous to force
a third party into that equation, no matter how much more "efficient"
that would make things.
You despair of any systemic fix for poor design and so want some sort
of enforcement mechanism for these external standards. I believe that
the Web is an adaptive system, and that what you deride as "Digital
Darwinism" is what I would call a "Market for Quality". Most
importantly, I believe that a market for quality is in fact the
correct solution for creating steady improvements in the Web's
usability.
Let me quickly address the least interesting objection to your idea:
it is unworkable. Your plan requires both centralization and force of
a sort that is impossible to achieve on the Internet. You say
"...to ensure interaction consistency across all sites it will be
necessary to promote a single set of design conventions."
"...the main problem lies in getting Web sites to actually obey any
but you never address who you are proposing to put in the driver's
seat - "it will be necessary" for whom? "[T]he main problem" is a
problem for whom? Not for me - I am relieved that there is no
authority who can make web site designers "obey" anything other than
httpd header validity. That strikes me as the Web's saving grace. With the
Web poised to go from 4 million sites to 100 million in the next few
years, as you note in your article, the idea of enforcing usability
rules will never get past the "thought experiment" stage.
However, as you are not merely a man of action but also a theorist, I
want to address why I think enforced conformity to usability standards
is wrong, even in theory. My objections break out into three rough
categories: creating a market for usability is better than central
standards for reasons of efficiency, innovation, and morality.
In your letter, you say "Why go all the way to shipping products only
to have to throw away 99% of the work?" I assume that you meant this
as a rhetorical question - after all, how could anybody be stupid
enough to suggest that a 1% solution is good enough? The Nielsen
Solution - redesign for everybody not presently complying with a
"single set of design conventions" - takes care of 100% of the
problem, while the Shirky Solution, let's call it evolutionary
progress for the top 1% of sites, well what could that possibly get
you?
1% gets you a surprising amount, actually, if it's properly arranged.
If only the top 1% most trafficked Web sites make usability a
priority, those sites would nevertheless account for 70% of all Web
traffic. You will recognize this as your own conclusion, of course, as
you have suggested on UseIt (http://www.useit.com/alertbox/9704b.html)
that Web site traffic is roughly a Zipf distribution, where the
thousandth most popular site only sees 1/1000th the traffic of the
most popular site. This in turn means that a very small percentage of
the Web gets the lion's share of the traffic. If you are right, then
you do not need good design on 70% of the web sites to cover 70% of
user traffic, you only need good design on the top 1% of web sites to
reach 70% of the traffic.
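The arithmetic above is easy to check. Assuming, as the Zipf distribution implies, that the site at rank r gets traffic proportional to 1/r, the share of traffic captured by the top k of N sites is the ratio of the k-th and N-th harmonic numbers. A quick sketch (the harmonic numbers are approximated via the standard log-plus-Euler-constant formula, which is accurate for large n):

```python
import math

EULER_GAMMA = 0.5772156649  # Euler-Mascheroni constant

def harmonic(n):
    # Approximate the n-th harmonic number H_n = 1 + 1/2 + ... + 1/n.
    # For large n, H_n is close to ln(n) + gamma.
    return math.log(n) + EULER_GAMMA

def top_share(total_sites, fraction):
    """Share of total traffic captured by the top `fraction` of sites,
    assuming the rank-r site gets traffic proportional to 1/r (Zipf)."""
    top = int(total_sites * fraction)
    return harmonic(top) / harmonic(total_sites)

# Top 1% of 4 million sites:
print(round(top_share(4_000_000, 0.01), 2))  # roughly 0.71
```

So under the Zipf assumption, the top 1% of a 4-million-site Web does indeed carry on the order of 70% of the traffic.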
By ignoring the mass of low traffic sites and instead concentrating on
making the popular sites more usable and the usable sites more
popular, a market for quality is a more efficient way of improving the
Web than trying to raise quality across the board without regard to
traffic.
INNOVATION
A market for usability is also better for fostering innovation. As I
argue in "View Source...", good tools let designers do stupid
things. This saves overhead on the design of the tools, since they
only need to concern themselves with structural validity, and
can avoid building in complex heuristics of quality or style. In
HTML's case, if it renders, it's right, even if it's blinking yellow
text on a leopard-skin background. (This is roughly the equivalent of
letting the reference C compiler settle arguments about syntax - if it
compiles, it's correct.)
Consider the use of HTML headers and tables as layout tools. When
these practices appeared, in 1994 and 1995 respectively, they
infuriated partisans of the SGML 'descriptive language' camp who
insisted that HTML documents should contain only semantic descriptions
and remain absolutely mute about layout. This in turn led to white-hot
flamefests about how HTML 'should' and 'shouldn't' be used.
It seems obvious from the hindsight of 1999, but it is worth
repeating: Everyone who argued that HTML shouldn't be used as a layout
language was wrong. The narrowly correct answer, that SGML was
designed as a semantic language, lost out to the need of designers
to work visually, and they were able to override partisan notions of
correctness to get there. The wrong answer from a standards point of
view was nevertheless the right thing to do.
Enforcing any set of rules limits the universe of possibilities, no
matter how well intentioned or universal those rules seem. Rules which
raise the average quality by limiting the worst excesses risk ruling
out the most innovative experiments as well by insisting on a set of
givens. Letting the market separate good from bad leaves the door open
to these innovations.
MORALITY
This is the most serious objection to your suggestion that standards
of usability should be enforced. A web site is an implicit contract
between two and only two parties - designer and user. No one - not
you, not Don Norman, not anyone, has any right to enter into that
contract without being invited in, no matter how valuable you think
your contribution might be.
IN PRAISE OF EVOLVABLE SYSTEMS, REDUX
I believe that the Web is already a market for quality - switching
costs are low, word of mouth effects are both large and swift, and
redesign is relatively painless compared to most software interfaces.
If I design a usable site, I will get more repeat business than if I
don't. If my competitor launches a more usable site, it's only a click
away. No one who has seen the development of Barnes and Noble and
Amazon or Travelocity and Expedia can doubt that competition helps
keep sites focused on improving usability. Nevertheless, as I am a
man of action and not just a theorist, I am going to suggest a
practical way to improve the workings of this market for usability -
let's call it usable.lycos.com.
The way to allocate resources efficiently in a market with many sellers
(sites) and many buyers (users) is competition, not standards. Other
things being equal, users will prefer a more usable site over its less
usable competition. Meanwhile, site owners prefer more traffic to
less, and more repeat visits to fewer. Imagine a search engine that
weighted usability in its rankings, where users knew that a good way
to find a usable site was by checking the "Weight Results by
Usability" box and owners knew that a site could rise in the list by
offering a good user experience. In this environment, the premium for
good UI would align the interests of buyers and sellers around
increasing quality. There is no Commissar of Web Design here, no
International Bureau of Markup Standards, just an implicit and ongoing
compact between users and designers that improvement will be rewarded.
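A ranking of the sort usable.lycos.com implies could be as simple as blending a relevance score with a usability score. The sketch below is purely hypothetical; the function names, score scales, and the 30% weight are all invented for illustration, and no real search engine of the era exposed such an interface:

```python
def rerank(results, usability_weight=0.3):
    """Hypothetical usability-weighted ranking.

    results: list of (url, relevance, usability) tuples, with both
    scores in [0, 1]. Returns URLs ordered by a weighted blend of
    relevance and usability, best first.
    """
    def score(entry):
        url, relevance, usability = entry
        # Blend the two signals; a higher usability_weight means the
        # "Weight Results by Usability" box counts for more.
        return (1 - usability_weight) * relevance + usability_weight * usability

    return [url for url, _, _ in sorted(results, key=score, reverse=True)]

results = [
    ("ugly-but-relevant.example", 0.9, 0.2),
    ("usable-and-relevant.example", 0.8, 0.9),
]
print(rerank(results))
# The slightly less relevant but far more usable site ranks first.
```

The point is not this particular formula but the incentive it creates: any monotonic blend that rewards usability gives site owners a visible, traffic-denominated reason to improve their interfaces.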
The same effect could be created in other ways - a Nielsen/Norman
"Seal of Approval", a "Usability" category at the various Web awards
ceremonies, a "Usable Web Ring". As anyone who has seen "Hamster
Dance" or an emailed list of office jokes can tell you, the net is the
most efficient medium the world has ever known for turning user
preference into widespread awareness. Improving the market for quality
simply harnesses that effect.
Web environments like usable.lycos.com, with all parties maximizing
preferences, will be more efficient and less innovation-dampening than
the centralized control which would be necessary to enforce a single
set of standards. Furthermore, the virtues of such a decentralized
system mirror the virtues of the Internet itself rather than fighting
them. I once did a usability analysis on a commerce site which had
fairly ugly graphics but a good UI nevertheless. When I queried the
site's owner about his design process, he said "I didn't know anything
when I started out, so I just put up a site with an email link on
every page, and my customers would mail me suggestions."
The Web is a marvelous thing, as is. There is a dream dreamed by
engineers and designers everywhere that they will someday be put in
charge, and that their rigorous vision for the world will finally
overcome the mediocrity around them once and for all. Resist this idea
- the world does not work that way, and the dream of centralized
control is only pleasant for the dreamer. The Internet's ability to be
adapted slowly, imperfectly, and in many conflicting directions all at
once is precisely what makes it so powerful, and the Web has taken
those advantages and opened them up to people who don't know source
code from bar code by creating a simple interface design language.
The obvious short-term effect of this has been the creation of an ocean
of bad design, but the long-term effects will be different - over time
bad sites die and good sites get better. So while the short-term
advantages of enforced standards may seem tempting, we would do well to
remember that there is
rarely any profit in betting against the power of the marketplace in
the long haul.