As early as 1100 BC, humans tried to ease their lives
by using counting devices such as the abacus. Later, slide rules became
common, until mechanical calculators were introduced in the 17th
century. In 1834 Charles Babbage (1792-1871) designed the programmable
calculator "Analytical Engine". In 1941 Konrad Zuse (1910-1995)
completed the first electromechanical computer based on relay technology: the Z3.
During the Second World War, the military's need to have exact
ballistic calculations at hand pushed the development of computers. Until the
late 1940s, only a few computers were built, and they went exclusively to official
agencies and the military, e.g. the 'Atlas' - a giant of 11 m length and 6 m width -
to the National Security Agency.
Figure 1: The U.S. Census Bureau was the first client to order a mainframe
electronic computer. (© U.S. Census Bureau)
Back then the modern computer was utilised to relieve
humans of the burden of vast computations. The first commercially
available general-purpose computer for civilian use (UNIVAC: Universal
Automatic Computer) was shipped to the Census Bureau for the US population
census of 1950 (see Figure 1). The application was not directly visible to the
general public, but it underlined the potential of computers to the business
world. The machine was operated at the system level by highly trained
technicians who directly set the electric circuits.
During the fifties and sixties computers became
more advanced, and technologies like operating systems, transistors, integrated
circuits, programming languages and hard disks were invented. As before,
computers were still very huge and affordable only for large companies, since
computing time was expensive. Handling these colossuses was rather crude -
punch cards or punched tapes were the
means for input and output. For end users - say, scientists
who wanted to compute some data - the bare computer was therefore almost
useless if there was no operator at hand who could interpret the output.
After Intel had presented its first microprocessor
in 1971, computers became smaller and less expensive. With the Altair 8800 (1975)
and the Apple I (1976), the era of the Personal Computer (PC) began and computers
were introduced to the general public. They were still very basic systems,
equipped only with an operating system, a keyboard for input and a text screen
for output. Users could interact with the computer via a Command Line Interface
(CLI) by typing appropriate alphanumeric commands. It was not until 1983
that Apple introduced the first Graphical
User Interface (GUI) to the world of publicly available computers in its
Lisa PC. In this system, interaction was eased by using metaphors from the
analogue world. The whole screen resembled a desk on which small icons symbolised
documents or specific functions. They could be activated by pointing at them
with the cursor, which the user controlled via the mouse. For example, if
someone picks a document on the screen and drags it to the
trashcan, the corresponding document is deleted. Everyone could handle the
computer and documents just as they were used to from their experience in the office.
This was the first Direct Manipulation (DM) interface (see chapter 2.4 for
further details). One year later Apple introduced its first Macintosh, whose
basic marketing slogan was "The computer for the rest of us" (Diskurs). The
philosophy was clear - a machine that is easy to use for anyone,
without thorough training. The DM interface consisted of four parts: Windows,
Icons, Mouse and Pointer (WIMP).
These basic components still constitute a DM interface today and continue to
ease human interaction with the computer.
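The desktop metaphor can be made concrete in code. The following is a minimal sketch in TypeScript, with all names and types hypothetical (Icon, iconAt and endDrag are illustrations, not any real toolkit's API), of how a direct-manipulation system might translate a drag-and-drop gesture on an icon into an operation on the underlying document:

```typescript
// A minimal sketch of the direct-manipulation idea: a gesture on a
// visual object is translated into an operation on the underlying
// document, instead of the user typing a textual command.

type Point = { x: number; y: number };

interface Icon {
  id: string; // e.g. the file the icon stands for (hypothetical)
  kind: "document" | "trashcan";
  bounds: { x: number; y: number; w: number; h: number };
}

// Hit-testing: which icon lies under the cursor?
function iconAt(icons: Icon[], p: Point): Icon | undefined {
  return icons.find(
    (i) =>
      p.x >= i.bounds.x && p.x <= i.bounds.x + i.bounds.w &&
      p.y >= i.bounds.y && p.y <= i.bounds.y + i.bounds.h
  );
}

// Completing a drag: dropping a document icon on the trashcan deletes
// the document it symbolises; any other drop merely moves the icon.
function endDrag(icons: Icon[], dragged: Icon, dropPoint: Point): Icon[] {
  const target = iconAt(icons.filter((i) => i !== dragged), dropPoint);
  if (dragged.kind === "document" && target?.kind === "trashcan") {
    console.log(`deleting ${dragged.id}`); // stand-in for a real file delete
    return icons.filter((i) => i !== dragged);
  }
  dragged.bounds.x = dropPoint.x;
  dragged.bounds.y = dropPoint.y;
  return icons;
}

// Usage: dragging "report.txt" onto the trashcan removes it.
const desktop: Icon[] = [
  { id: "report.txt", kind: "document", bounds: { x: 10, y: 10, w: 32, h: 32 } },
  { id: "trash", kind: "trashcan", bounds: { x: 200, y: 200, w: 32, h: 32 } },
];
endDrag(desktop, desktop[0], { x: 210, y: 210 });
```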
During the eighties and early nineties, CLI systems
and DM systems coexisted in the consumer market. From the mid-nineties on,
WIMP has been the common metaphor in user interfaces. An important milestone, the
Newton, a Personal Digital Assistant (PDA) operated only by a pen, was
introduced by Apple in 1993. It was another step towards user-centred
interaction with user-friendly interfaces. Being ahead of its time, the Newton
never became a great commercial success. But today so-called handhelds
the size of the human palm are very common, and not only in the business world,
showing that proper UI design is a major concern when buying a computer.
Compared with the 1950s and 1960s, when computers
were extremely huge, immensely expensive and very rare machines, we now live in a
world that seems overtaken by technology. Computers and machinery have
gained a huge impact on our lives, and everyone has to cope with them - so the user
interface has gained more and more importance. In the 1950s, the hardware accounted
for 90% of the cost of the whole system, and the rest was spent on software. Today,
this ratio has reversed (Eberts, 1994). This development emphasises the necessity of
good UI design. In Eberts' words, "in the past, designers were constrained
by what could be done; now designers can think about what should be
done".
Fifty years after the UNIVAC, "more than 72
million employed people in the US age 16 and over - about 54 percent of all
workers - used a computer on the job" (Census1). Even consumer products
are computerised: computer chips can be found in almost any household today, be
it in washing machines, mobile phones, VCRs or microwaves. Back in the 1950s,
the technological constraints (memory, speed, input and output channels) of early
computer systems forced a concentration on functionality. Only a few lines of the
program code were concerned with the user interface, and the end user had to be
an expert to run the system. Lifting the hardware limitations has freed resources
for considerable efforts to improve the user interface. "The effect of this rapid
increase in the number and availability of computers is that the computer interface
must be made for everybody instead of just the professional or computer hobbyist"
(Eberts, 1994).