Every desk needs a chair. In the days of scientific management, the chair that went with the clerk's desk had to keep the employee working at maximum productivity. And since clerks were discouraged from leaving their desks, the chair had to keep them sitting. The best-designed chair for desk work, based on physiological studies of human anatomy, had a swivel base, a wooden saddle seat, and a slatted wooden back with armrests. Many models had adjustable knobs and levers to make the chair fit its occupant. This was the beginning of office ergonomics--the study of design as it relates to human comfort and function. Ergonomics would become a thriving architectural design business in the mid-1970s.
The chair has always been a status symbol in the office. Just as kings sat in thrones and no one else did, employers sat in armchairs while their clerks sat on stools. For all of the sensible comfort of the swivel chair, cane-seated straightbacks implied status in the 1880s. But when the typist's chair evolved into the cushioned armless versions in today's offices, the executive's chair took on kingly dimensions with closed arms, wide seats, and the tallest backrest.
By the 1950s, backrest height and seat size indicated job rank. Just as a bigger desk signaled a more prestigious job, the more comfortable-looking the chair, the higher up the organizational chart sat the person who occupied it. There are probably few more powerful symbols in the office--or in contemporary life--than the chair.
As we continue to barrel through the information age, it is hard to imagine conducting business without computers. Each day, millions of people working in offices and homes around the world depend on computer technology to do their jobs efficiently and economically. Truly understanding the computer's history involves a daunting journey through mathematics, physics, and electrical engineering; through binary code, Boolean logic, real time, magnetic core memories, floating-point numerical notation, transistors, semiconductors, integrated circuits, and much, much more.
Luckily, most office workers do not need to understand this complex history to use computers and the dizzying array of software programs they execute. When computers were first developed nearly fifty years ago, the people who programmed them considered the task quite maddening. Fortunately, learning to use a personal computer today is often as simple as spending a few hours reading an instruction manual or following a hands-on tutorial.
In recent years, computer technology has been incorporated into a wide range of consumer and industrial products. Computers are routinely used in word processing, e-mail, video games, and other applications that require repetitive tasks that can be automated.
Emerging technologies are continually advancing the computer's capacity and usefulness, making "the computer" a difficult term to define. In the broadest sense, a computer is an information processing machine. It can store data as numbers, letters, pictures, or symbols and manipulate those data at great speeds by following instructions that have been stored in the machine as programs.
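As a rough illustration of that stored-program idea, the following sketch (written in Python, with a three-instruction "language" invented purely for this example) keeps both the data and the instructions inside the machine and simply carries out the instructions one after another.

    # A toy illustration of the stored-program idea: data and a short list of
    # instructions both live inside the machine, and the machine simply follows
    # the instructions in order. The three-instruction "language" is invented
    # for this example only.

    def run(program, memory):
        """Execute stored instructions that manipulate stored data."""
        for instruction, *operands in program:
            if instruction == "COPY":       # copy the value at address b into address a
                a, b = operands
                memory[a] = memory[b]
            elif instruction == "ADD":      # add the value at address b into address a
                a, b = operands
                memory[a] += memory[b]
            elif instruction == "PRINT":    # display the value stored at an address
                (a,) = operands
                print(memory[a])
        return memory

    memory = {0: 2, 1: 3, 2: 0}                               # data stored as numbers
    program = [("COPY", 2, 0), ("ADD", 2, 1), ("PRINT", 2)]   # a stored program: compute 2 + 3
    run(program, memory)                                      # prints 5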
The first computers were not computers as we define them today. They were calculators--machines designed to solve complex mathematical problems. They reduced the extraordinary amount of time it took people just to attempt to solve the problems themselves. One of the largest mathematical nightmares of the precomputer age was analyzing the U.S. population data collected by the Census Bureau. The headcount itself took only a few months, but data analysis took years--and by then, the information was outdated.
Various inventors built machines to speed up mathematical computation. By 1941 a German engineer who hated engineering's mathematical drudge work had developed fast but limited relay calculating machines used in the German war effort.
In fact, military needs have played a major role in the development of the computer. When the United States entered World War II, the Ballistic Research Laboratory at Aberdeen Proving Ground had human "computers"--one hundred (mostly female) college graduates who calculated the ballistic firing tables that were used for accurate weapons aiming. It took about three days to calculate a single trajectory, and two thousand to four thousand trajectories were needed for each weapon.
The Army soon realized that its human "computers" could not perform these calculations quickly enough. In 1945, the Army provided financial support to develop a huge machine called ENIAC (Electronic Numerical Integrator and Computer), which weighed thirty tons, took up 1,800 square feet of floor space, and required six full-time technicians just to keep it running. Thousands of times faster than any of its predecessors, ENIAC demonstrated the unmistakable advantage of machine computing.
The UNIVAC, the first commercial computer system in America, followed in the 1950s. Office workers became accustomed to the separate areas--sometimes entire office floors--that housed the new machines and the programmers and technicians who knew how to use them. Data processing departments soon became commonplace.
As their technical capacities increased from handling only mathematical computations to manipulating words and other data, computers began to change the way many businesses did their work. Crews of mostly female keypunch operators, who put data into machine-usable form, became a new class of low-skilled labor. Despite their increased role in the workplace, computers were long considered strange and noisy machines housed in cold rooms down the hall.
Technological advances did help make computers smaller, faster, and extremely capable information handlers, but no more "friendly" to most office workers. By the 1970s, integrated circuit technology made producing a small and relatively inexpensive personal computer possible. Yet even with this available technology, many computer companies chose not to develop a personal computer. They could not imagine why anyone would want a computer when typewriters and calculators were sufficient.
The first personal computer--developed by Digital Equipment Corporation and Massachusetts Institute of Technology's Lincoln Laboratory in 1962--was intended for a single researcher and cost $43,000. Later personal computers were developed not by big corporations but by electronics buffs who typically read about computers, sent away for instructions and materials, and built them in their basements.
In 1976, two young college dropouts named Steve Wozniak and Steve Jobs founded the Apple Computer Company, which made affordable computers designed for easy use. Eight years later, they introduced the Macintosh--a microcomputer with an intuitive user interface including familiar icons and a mouse. Meanwhile, Paul Allen and Bill Gates were busy with their new company, Microsoft. Microsoft's DOS (introduced in 1981) and Windows (introduced in 1985) programs would soon operate the majority of personal computers on the market.
For most users, understanding how to program a computer is irrelevant, because thousands of inexpensive programs, called software, are available to perform almost any imaginable task. Using built-in rules and procedures, these programs offer fast and efficient ways to conduct business. Routine office tasks once performed by hand--such as data storage, correspondence, research, and report preparation--are now computer-driven to such an extent that office typewriters, filing cabinets, and calculators are tools of the past.
In most offices of the 1990s, personal computers are linked to one another through internal--and often external--networks. This "networking" allows employees to gather information from a vast array of outside sources (particularly the World Wide Web) and to share it quickly with their colleagues, outside business partners, and customers.
The network to which a personal computer is linked now defines what it can do. Many different types of machines can be connected to a single internal or external network, including mainframes to handle large quantities of data and supercomputers designed for complex scientific work. All of these computers are invisible to the networked user, who can tap in and retrieve or process data that a personal computer by itself could not handle.
External information networks are accessed through modems (modulators/demodulators). Modems translate the digital language of the computer code into analog signals, which can be sent across telephone lines. The analog signals are then translated back into digital code for use by the receiving computer.
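A minimal sketch of that digital-to-analog translation, written in Python for illustration only: bits are turned into one of two audio tones, "sent" as an analog waveform, and matched against the same two tones at the far end to recover the bits. The sample rate, tone frequencies, and bit duration are assumptions chosen for the example, not the parameters of any real modem standard.

    import math

    # Toy frequency-shift keying (FSK): one tone stands for a 0 bit, another for a 1 bit.
    SAMPLE_RATE = 8000             # analog samples per second
    SAMPLES_PER_BIT = 80           # duration of one bit, in samples
    FREQ_0, FREQ_1 = 1000, 2000    # illustrative tone frequencies in Hz

    def modulate(bits):
        """Translate digital bits into an 'analog' waveform (a list of sample values)."""
        signal = []
        for bit in bits:
            freq = FREQ_1 if bit else FREQ_0
            for n in range(SAMPLES_PER_BIT):
                signal.append(math.sin(2 * math.pi * freq * n / SAMPLE_RATE))
        return signal

    def demodulate(signal):
        """Recover the bits by checking which tone dominates each bit period."""
        bits = []
        for i in range(0, len(signal), SAMPLES_PER_BIT):
            chunk = signal[i:i + SAMPLES_PER_BIT]
            score_0 = score_1 = 0.0
            for n, sample in enumerate(chunk):
                # Correlate the received chunk against each reference tone.
                score_0 += sample * math.sin(2 * math.pi * FREQ_0 * n / SAMPLE_RATE)
                score_1 += sample * math.sin(2 * math.pi * FREQ_1 * n / SAMPLE_RATE)
            bits.append(1 if score_1 > score_0 else 0)
        return bits

    message = [0, 1, 1, 0, 1, 0, 0, 1]
    assert demodulate(modulate(message)) == message
    print("bits survived the analog round trip:", message)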
Modems provide access to the most widely used external information network--the Internet--which, in the late 1990s, reaches more than twenty-five million computer users (an increase from 213 registered computers in 1981). This represents considerable growth from the days of ARPANET, the "Mother of the Internet," which began as a U.S. government experiment linking researchers with remote computer centers to allow them to share hardware and software resources.
As new technologies are developed, personal computers will likely become even smaller in the future. They may also incorporate a greater number of data input and output methods (e.g., voice commands), efficiently interacting with one another because of greater software compatibility. In addition, computer information networks in public places--which began with the introduction of automated teller machines in the early 1970s--will likely become quite commonplace as more and more daily business is conducted electronically.
Today copiers are everywhere, making more copies than anyone needs. In the late 1980s, the Xerox Corporation copied more than 20 million pages in one year, just to see if its machines worked.
The first method for making a typed copy was carbon paper. Used for little more than credit card receipts today, carbons were once the bane of typists. Messy and unforgiving of mistakes, carbon paper enabled the typist to make a somewhat smeary duplicate of what he or she was typing.
The mimeograph machine of the 1890s, still in use today, particularly in schools, increased the number of copies that could be made from a few to a hundred, using what was known as a "master." But the only way to copy an original after it had been made was to retype, redraw, or rephotograph it.
The photostat machine was developed before World War I, but it was hardly an office tool. It was too expensive, too big, and, requiring a trained operator, too difficult to use.
After World War II, 3M and Eastman Kodak introduced the Thermo-Fax and Verifax copiers into the workplace. The copies were of poor quality and continued to darken long after they had been pulled from the machine. Although the office models were relatively inexpensive and easy to use, their special paper eventually cost users a fortune.
Chester Carlson's discovery of the effect of light on photoconductivity, however, led to the unprecedented success of the "Xerox" machine. The first commercial Xerox machine, the Haloid Xerox 914 of 1960, had defects, such as paper scorching. Nevertheless, today's copiers produce near-perfect images, in color as well, in record time.
The 1876 Centennial Exposition in Philadelphia, which engendered an explosion of new furniture ideas, led to dramatic changes in typical office equipment. Rolltop desks and filing systems were suddenly the rage.
Businessmen were in the mood for a change. Improved housekeeping, they believed, must mean increased profits. A rolltop desk offered movable partitions, several sizes of pigeonhole cases, drawers, ledger cases, and a lock.
Typewriter desks went into the office with the typewriter. Some had cabinets built in that swung the typewriter out of sight as a writing surface swung up. Other typists' desks came in adjustable heights, allowing the typist to type standing as well as sitting.
By the 1890s, rolltops were becoming impractical. The office manager couldn't easily see what work his clerks were doing, and often too many papers were filed in their desks rather than in the filing cabinets. Soon rolltops were only managerial and executive furniture.
Office workers' desks became more and more streamlined as pigeonholes and filing slats were removed. By 1900, even the pedestals that supported the desk tops and provided storage space were replaced with legs, which made cleaning offices easier. Offices strove to be entirely standardized in appearance, for "efficiency." One desk, butted to another, was exactly like every other desk in the office.
Eventually, management considered wood inefficient and bad for employees' health, and metal desks became the standard by the 1920s. Wood now enjoys status as the material of choice for office desks, however, suggesting as it does quality, success, and old-fashioned values.
The dictating machine is yet another example of the complex relationship between technological development and what people need and want. Thomas Edison's early phonograph, while a terrific idea, had dreadful sound, as well as a limited number of prerecorded wax discs (records) to play, and it wasn't selling well. The marketers of the phonograph thought to sell it to offices as a dictating machine, but it failed there as well. Stenographers hated it, and it was expensive. By the 1890s, it was already off the market. The phonograph, with critical technical improvements in recording quality, went on to achieve singular success.
Scientific management in its heyday, however, liked the dictating machine. It believed that not only would dictating letters into a machine cut the cost of producing a letter from 4.3 cents to 2.7 cents, but that the dictating machine could make the executive more creative. "Men who formerly dictated stilted letters have been taught by the dictating machine to express themselves lucidly," suggested one source. What scientific management was really trying to avoid was wasted time when the "dictator and stenographer engage in conversation entirely unconnected with the business at hand."
Some offices did use dictaphones, but most didn't. They were still cumbersome, intimidating, and poor recording machines. It took magnetic tape in the 1950s to make dictating practical. Dictating letters and memos and bright ideas into tape recorders, in cars as well as in executive suites, became standard office operating procedure that continues today, valued more by managers than by stenographers.
Facsimile is today's fastest-growing area of office automation and business communication. To the nontechnical observer, the fax machine seems to send a photocopy to another fax machine over the telephone lines: you dial a number, place the pages you want to send in the machine, press "start," and off they go, at about a minute a page.
Long before photocopying machines, the facsimile machine was invented in 1842 by Alexander Bain, a Scottish clockmaker, who used clock mechanisms to transfer an image from one sheet of electrically conductive paper to another. Bain patented the "automatic electrochemical recording telegraph" in 1843.
Various machines using Bain's technology have been in use for many years. In 1934 the Associated Press began to use "wirephoto" to transmit photographs. But then television brought a news revolution--people could see live or same-day footage of events rather than one or two photographs.
Only today has "fax" become a household word. The current facsimile revolution has come about because of digital technology (the same technology that lets us play video games), which has increased the speed, compactness, and reliability of the machines, as well as brought down prices. And, like Sholes's typewriter, this technology has found its real market in the business world, where efficiency and fast communication have been necessary since the days of the railroads. Fax machines make it possible to send anything that can be printed on a page to anywhere in the world in not much more time than it would take to hand the page to someone across the top of your desk.
Since the telephone was first demonstrated in 1876, it has evolved into a vital communications tool, providing the gateway to the world of computer technology and information exchange.
Businesses were the first to use telephones, taking advantage of the new technology to streamline their operations and maintain contact with their suppliers. By 1920, most businesses found it nearly impossible to survive without the telephone, finding it more convenient and efficient than sending messages by hand or by mail.
The early office phone was a black, rotary-dial desk model, the Model 500 series, introduced by Bell Telephone Laboratories in 1949. Each individual phone had its own phone number--there was no such thing as an "extension" in the days before multiline phones. These phones were typically answered by secretaries, and a stenographer listening on another extension would take notes on the call because businessmen still wanted a written record for the files.
After World War II, increased demands on the telephone system made it necessary to improve the technology. Radio and television signals and computer data were now being transmitted through wires designed for voice transmission. By the 1990s, the analog-based telephone system had been rebuilt around digital technology. In a digital system, information about each sound wave--rather than the sound wave itself--is sent through the wire as a numerical equivalent and then reconstructed exactly at the other end. This new technology enables computers--which are also based on digital technology--to "talk" on the telephone. Today, fiber-optic cables carry digital signals at astounding rates, and they have become the backbone of the world's information infrastructure.
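For illustration, here is a minimal sketch of that numerical round trip in Python; the 8,000-samples-per-second rate and the simple 8-bit encoding are assumptions made for the example, not a description of any particular telephone standard.

    import math

    SAMPLE_RATE = 8000   # how many times per second the sound wave is measured

    def digitize(wave, duration):
        """Measure the wave at regular instants and store each value as an integer 0-255."""
        samples = []
        for n in range(int(duration * SAMPLE_RATE)):
            value = wave(n / SAMPLE_RATE)                 # amplitude between -1.0 and 1.0
            samples.append(round((value + 1.0) * 127.5))  # map it to an 8-bit number
        return samples

    def reconstruct(samples):
        """At the receiving end, turn the numbers back into amplitude values."""
        return [s / 127.5 - 1.0 for s in samples]

    def tone(t):
        """A 440 Hz test tone standing in for a voice."""
        return math.sin(2 * math.pi * 440 * t)

    numbers = digitize(tone, duration=0.01)   # 10 milliseconds of sound becomes 80 integers
    rebuilt = reconstruct(numbers)
    # Each rebuilt value is within one small quantization step of the original wave.
    assert all(abs(rebuilt[n] - tone(n / SAMPLE_RATE)) < 0.01 for n in range(len(rebuilt)))
    print(len(numbers), "numbers stand in for the wave; first few:", numbers[:5])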
The integration of communications and computers gave offices a vast range of new telephone services, including expanded use of 800 numbers, cellular phones, and voice mail. Often maligned as cold, impersonal, and irritating, voice mail has taken over many of what were once the secretary's duties.
Living in an age when people can "process words," instantly copy them, and even send them almost anywhere in the world over the telephone lines, we may find it hard to believe that the forerunner of the word processor, the typewriter, was invented little more than a hundred years ago.
This once-ubiquitous part of the American office, school, and home den followed a long road to recognition. The typewriter found acceptance only when its promoters finally realized who would be its most likely user. Before that could occur, however, social values that governed personal and professional correspondence had to change to admit the use of a mechanical device in place of the pen.
Practical writing machines became technologically feasible as early as the fourteenth century. The invention of at least 112 such machines preceded the successful Remington typewriter. Many of the early designs received patents, and several were marketed on a limited basis. The first such patent was issued to Henry Mill, an English engineer, in 1714. The first primitive American machine was patented in 1829 by William Burt of Detroit. Then in 1868, American inventor Christopher Latham Sholes developed the machine that finally succeeded on the market as the Remington and established the modern idea of the typewriter. Sholes's first try at a typewriting machine was a crude piece of work made with part of an old table, a circular piece of glass, a telegraph key, a piece of carbon paper, and piano wire. This led to an improved prototype resembling a toy piano in appearance, which is now in the Smithsonian's National Museum of American History.
Despite the importance of Sholes's improvements in the machine's mechanical workings over the next several years, the story of the typewriter from 1868 to its booming success in the late 1880s is really the story of its staunchest supporter, James Densmore. Under Densmore's prodding, Sholes improved the first crude machine many times over. Densmore was also responsible for recruiting the machine's first mass manufacturer, E. Remington and Sons, of Ilion, New York, a company that had made armaments during the Civil War and was looking for new products to manufacture.
The early typewriter's greatest problem was finding a market. No one knew who would want to buy a typewriter. Sholes thought his most likely customers would be clergymen and men of letters and hoped that interest might then expand to the general public. Neither he nor Densmore saw the obvious utility of the typewriter in business. Sluggish economic conditions in the 1870s were partly responsible for this lack of marketing foresight. Imperfections in the typewriter itself may bear another part of the blame. And, as hard as it is to conceive of today, Americans in the 1870s and 1880s were deeply uncomfortable with the strange notion of "mechanical writing." Convention prescribed that all letters be written out in neat longhand, and businessmen enjoyed no exception from this requirement.
The nineteenth-century response to a typewritten letter could have been something like our response to "junk mail"! In addition, typed signatures could be forged. Some accounts tell of recipients who were angered and insulted by typed letters, seeing them as a comment on their inability to read handwriting.
A marketing breakthrough finally occurred with the development of the concept of "scientific management" in the 1880s. With the specialization of work--some people doing correspondence, others keeping accounts, etc.--the typewriter at last found acceptance. Only when business became big and impersonal were people ready to give up the old idea that business letters should be governed by the same rules as personal letters.
The changing look of the typewriter offers vivid proof that the design of a manufactured object reflects a complex combination of social values, economic needs, and profit-driven motives. Most office equipment before 1940 was overtly mechanical and industrial in appearance. In the difficult economic times of the world depression of the late 1920s and the 1930s, offices had no trouble attracting workers, who would work anywhere, under almost any conditions, and with any equipment. The first changes in typewriter styling actually appeared not in office machines, but in portables, which from the early 1930s were streamlined and offered in color to encourage their use at home.
During the 1950s and 1960s, the entire environment of the office changed along with most office equipment. From about 1950, almost all office typewriter manufacturers presented their machines in colored steel cases that concealed the mechanism and suggested a certain elegance. If secretaries and typists were supposed to be above manual workers, it was important that typewriters not look like machines but convey a more respectable and less oppressive image.
The electric typewriter helped advance this new image. Although the first electrics were produced in the 1930s, they did not gain wide acceptance until the 1950s.
In the 1970s the typewriter had to compete with the word processor, a clever combination of the typewriter keyboard with the brain of the computer. Word processing let typists make mistakes, correct them, move things around, and change their minds in ways that would require endless retyping on a conventional typewriter. By the 1990s word processing had become just another program (software) in personal computers.
Yet typewriters still have a place in some homes and offices. Office workers find typewriters faster for typing envelopes and other short jobs. It may be that in ten years the typewriter will be as rare as carbon paper is today. But some people are quite attached to their typewriters, even stubbornly holding on to manual machines with the same dedication seen in fountain pen users in this day of the felt-tip pen!