This excerpt from Edward Tenner’s book "Our Own Devices" was
prepared for the June 11, 2003 issue of U.S. 1 Newspaper. Copyright
Edward Tenner.
Thumbs Up: Body Tech
Edward Tenner’s latest book, "Our Own Devices: The Past and
Future of Body Technology," offers a fascinating walk through
the development of "body technology" devices and the social
adaptations of the techniques for using them. Integrating his
observations and interpretations of society, business, history, and
culture, Tenner delves into the individuals and companies that
developed and marketed these products. He concludes with an epilogue
on the thumb and the possible future of man and machine fused into a
cyborg.
This essay, adapted from the epilogue, was first published in
the Wilson Quarterly.
by Edward Tenner
For more than 50 years, enthusiasts have proclaimed
the coming of a new age of technologically augmented humanity, a
somewhat
unsettling era of bar-coded convicts and chip-implanted children.
But technology has been reshaping the body since the very dawn of
civilization. The feet of shod people, for example, are
physiologically
different from those of people who have always walked barefoot.
Technologies as various as the thong sandal and the computer mouse
have affected how we use our bodies — the techniques we
employ in our everyday lives — and this coevolution of technology
and the body has not always followed the course engineers and other
designers imagined. The question now is whether mind, body, and
machine
will fuse in some radical new way over the next generation.
The enthusiasts themselves are far from agreement on the mechanism
that might achieve such a fusion. For some, the new intimacy between
humans and machines will simply involve more portable and powerful
versions of devices we already take with us — computers, for
example,
that might be carried as we now carry cell phones and personal digital
assistants (PDAs), to be viewed through special eyeglass displays.
Spectacles might also transmit the emotional states of their wearers,
so that a speaker, for example, could detect an audience’s interest
or boredom. There are already sneakers that can transmit or record
information on a runner’s performance, and motorcycle helmets with
intercoms and navigational aids built in.
Other enthusiasts scorn mere wearability. They’re having
sensors and transmitters surgically implanted in their bodies —
as, for example, some deaf individuals have been fitted with cochlear
implants that restore hearing. The cyborg, or human machine, is an
especially powerful and persistent notion, perhaps because it seems
a logical next step from technological symbiosis. (Politically, the
cyborg idea — which for a few enthusiasts is a movement —
spans a continuum from Paul Verhoeven’s original RoboCop film in 1987
to the work of cultural scholars such as Donna Haraway and Chris
Hables
Gray, who see the connection between human and machine as an
emancipatory
strategy against rigid economic and gender roles.)
But is the body really becoming more mechanized? Is the interaction
of technology and human behavior all that new and frightening? Despite
the legend, George Washington never wore wooden teeth, but his last
pair of dentures, made of gold plates inset with hippopotamus teeth,
human teeth, and elephant and hippo ivory, and hinged with a gold
spring, was as good as the craftsmen of his time could produce. Still,
he suffered great discomfort, and ate and spoke with difficulty
(perhaps
the enforced reserve enhanced his dignity). At any rate, if the
nation’s
first president was a cyborg, it’s not surprising that one in 10
Americans
had some nondental implant — from pacemakers to artificial joints
— by 2002. Nor was Washington an isolated case: Benjamin
Franklin’s
bifocals and Thomas Jefferson’s semireclining work chair were giant
steps in human-mechanical hybridization. One might even say that John
F. Kennedy was continuing the cyborg tradition when he became one
of the first politicians to adopt the robotic signature machine, a
giant and distinctively American step in the cloning of gesture.
The many amputations wounded soldiers suffered during the U.S. Civil
War led to the creation of an innovative artificial-limb industry.
Today, responsive advanced prosthetics, wheelchairs, vision implants,
and other assistive devices exceed the 19th century’s wildest dreams.
(There has even been litigation in the United States over whether
a teenage swimmer with an artificial leg was unfairly barred from
wearing a flipper on it.)
But the first choice of medicine is still the conservation of natural
materials and abilities. Thus, the trend in eye care has been from
spectacles to contact lenses to laser surgery, and dentistry has moved
steadily from dentures to prophylaxis and the conservation of
endangered
natural teeth. Some dental researchers believe that adults may be
able to grow replacement teeth naturally. Other forms of regeneration,
including the recovery of function by paraplegics and quadriplegics,
may follow.
The body remains surprisingly and reassuringly conservative, and
humanity
has stayed steadfastly loyal to objects that connect us with our
environment.
The traditional zori design — the sandal with a V-shaped thong
separating the big toe from the others — is still used for some
of the most stylish sandals. Athletic shoes with the most technically
advanced uppers and soles still use a system of lacing at least 200
years old. For all their additional adjustments, most advanced new
office chairs still rely on the 100-year-old principle of a
spring-mounted
lumbar support, and recliners still place the body in the same
contours
that library chairs did in the 19th century; according to industry
sources, interest is fading in data ports built into recliners and
in other technological enhancements.
The QWERTY arrangement of the keyboard has resisted all reform, and
alternatives to the flat conventional keyboard are expensive niche
products, partly because, in the absence of discomfort, so few users
are willing to learn new typing techniques. A century after the piano
began to lose prestige and markets, it remains the master instrument,
with a familiar keyboard.
Computers now allow the production of advanced progressive eyeglasses
without the visible seam of bifocals, but wearers still hold them
on their heads with the folding temples introduced in the 18th
century.
The latest NATO helmet still reflects the outlines of the medieval
sallet. But then, our skulls — like our foot bones, vertebrae,
fingers, eyes, and ears — have not changed much. Even the
automatic
transmissions in our cars rely on a familiar tactile principle, a
knob or handle and lever; the seemingly more efficient pushbutton
shifter was largely abandoned after the Edsel. And the 21st century’s
automobiles are still directed and controlled by wheels and pedals
— familiar from early modern sailing ships and wagons — rather
than by the alternative interfaces that appear in patents and
experimental
cars. Meanwhile, many technological professionals study body
techniques
that need few or no external devices: yoga, martial arts, and the
Alexander technique (a series of practices developed by a 19th-century
Australian actor to promote more natural posture, motion, and speech).
Even Steve Mann, the Christopher Columbus of wearable computing, has
misgivings about integrating himself with today’s "smart"
technology. Mann, who holds a PhD from the Massachusetts Institute
of Technology, was photographed as early as 1980 wearing
a helmet equipped with a video camera and a rabbit-ears antenna. But
in his book "Cyborg" (2001), he acknowledges being
"increasingly
uncomfortable with the idea of a cyborg future," where privacy
is sacrificed for pleasure and convenience to a degree he compares
to drug addiction.
Today’s advanced cyborg technology is a harbinger of neither a utopian
nor an apocalyptic future. Virtual reality helmets, often featured
in scare scenarios of the future, are still not playthings; they’re
professional tools demanding rigorous training in physical and mental
techniques if wearers are to avoid disorientation and lapses in
judgment.
At the other extreme of complexity, the miniature
keyboards
of cell phones and other devices are exerting a surprising influence
at the level of everyday life. They’re shifting the balance of power
of the human hand from the index finger to the thumb. C. P. E. Bach
elevated the role of the thumb in musical keyboarding 250 years ago,
but touch-typing pioneers of the late 19th century rediscovered the
fourth and fifth fingers and banished the thumb to space-bar duty. Now the
thumb is enjoying a renaissance. It has returned to computing with
the introduction of pen- and pencil-like devices such as the styluses
used with PDAs.
The latest computer mouse, developed by the Swedish physician and
ergonomist Johan Ullman, is gripped and moved around the desk with
a pen-shaped stick that uses the precision muscles of the thumb and
fingers and doesn’t twist the hand and tire the forearm. Even
thumb-dependent
pencils are resurgent, their unit sales having increased by more than
50 percent in the United States in the 1990s.
The biggest surprise is the thumb’s role in electronics. In Japan
today, so many new data-entry devices rely on it that young people
are called oyayubi sedai, the Thumb Generation. In Asia and
Europe, users have turned technology on its head: Instead of using
the voice recognition features of their phones, they’re sending short
text messages to friends, thumbs jumping around their cellular
keyboards
in a telegraphic imitation of casual speech. By spring 2002, there
were more than 1.4 billion of these transmissions each month in the
United Kingdom alone.
One British researcher, Sadie Plant, has found that thumbs all around
the world are becoming stronger and more skillful. Some young Japanese
are now even pointing and ringing doorbells with them. As Plant told
The Wall Street Journal, "The relationship between technology
and the users of technology is mutual. We are changing each
other."
Always attuned to social nuance, the Style section of the Washington
Post also noted the ascent of the formerly humble digit. The major
laboratories did not predestine the thumb to be the successor to the
index finger, though they did help make the change possible; its full
capacities were discovered through collaborative experimentation by
users, designers, and manufacturers.
The ascendancy of the thumb is an expression of the intimate
relationship
between head and hand described by the neurologist and hand injury
specialist Frank Wilson, who speaks of the "24-karat thumb"
in his book "The Hand" (1998): "The brain keeps giving
the hand new things to do and new ways of doing what it already knows
how to do. In turn, the hand affords the brain new ways of approaching
old tasks and the possibility of understanding and mastering new
tasks."
But change is not without cost. We learn new body skills to the
neglect
of others, and humanity has been losing not only languages but body
techniques. Scores of resting positions known to anthropologists are
being replaced by a single style of sitting. Countless variations
of the infant-feeding bottle compete with the emotional and
physiological
rewards of nursing. The reclining chair, originally sold partly as
a health device, has become an emblem of sedentary living. The piano’s
advanced development in the late 19th century prepared the way for
the player piano, and ultimately for recorded music. Typewriter and
computer keyboards eliminated much of the grind of learning
penmanship,
along with the pleasure of a personal hand (today’s children may still
grumble, but rarely must they learn the full, demanding systems of
the 19th-century master penmen). The helmet wards off danger even
as it encourages overconfident wearers to engage in new and dangerous
activities. All these devices augment our powers, but in doing so
they also gain a power over us.
The challenge within advanced industrial societies is to cope with
a degree of standardization that threatens to choke off both new
technologies
and new techniques. We need a return to the collaboration between
user and maker that marked so many of the great technological
innovations,
whether the shaping of the classic American fire helmet or the
development
of the touch method by expert typists and typing teachers.
Research in even the most advanced technical processes confirms the
importance of users. In the 1980s, for example, the economist Eric
von Hippel studied change in high-technology industries such as those
that manufacture scientific instruments, semiconductors, and printed
circuit boards. Von Hippel found that up to 77 percent of the
innovations
in the industries were initiated by users. He therefore recommended
that manufacturers identify and work with a vanguard of "lead
users" — as was done in the past, for example, when
19th-century
musicians worked with piano manufacturers, or when the typewriter
entrepreneur James Densmore tested his ideas with the court reporter
James O. Clephane in developing the QWERTY layout, an efficient
arrangement
for the four-finger typing technique that prevailed until the victory
of the touch method in the 1890s.
Today’s cognitive psychologists of work are rejecting the older model
of a single best set of procedures and learning from the experience
of workers and rank-and-file operators how equipment and systems can
be modified to promote greater safety and productivity. As one
psychologist,
Kim J. Vicente, has written, "Workers finish the design."
Design should be user friendly, of course, but it should
also be user challenging. The piano keyboard is rightly celebrated
as an interface that’s at once manageable for the novice and
inexhaustible
for the expert. Information interfaces should similarly invite the
beginner even as they offer the experienced user an opportunity to
develop new techniques; they should not attempt to anticipate a user’s
every desire or need. The practice of participatory design, introduced
in the 1970s by the mathematician and computer scientist Kristen
Nygaard,
began with Norwegian workers who wanted a say in the development of
technology in their industries and was ultimately embraced by
corporations
worldwide.
The keyboard that’s negotiated with a thumb is a threat to handwriting
traditions, whether Asian or Western, and that’s regrettable. But
adapting to its use is a mark of human resourcefulness and ingenuity.
The thumb, a proletarian digit ennobled in the digital age, is an
apt symbol for a new technological optimism based on the self-reliance
of users. The index finger — locating regulations and warnings
in texts, wagging and lecturing in person — signifies authority,
the rules. The thumb, by contrast, connotes the practical knowledge
men and women have worked out for themselves, the "rules of
thumb."
It represents tacit knowledge, too, the skills we can’t always
explain,
as with a "green thumb." And when extended during the almost
lost art of hitchhiking, the thumb displays the right attitude toward
the future: open and collaborative, but with a firm sense of
direction.
Edward Tenner is an associate of the Lemelson Center for the History
of Invention and Innovation at the National Museum of American
History. He also has been a visiting researcher at Princeton in the
departments of Geosciences and English.
In 1996 he wrote "Why Things Bite Back: Technology and the
Revenge of Unintended Consequences," and he has contributed essays
to newspapers and magazines in the U.S. and the U.K.
He now writes mainly for U.S. News & World Report, the Wilson
Quarterly,
Technology Review, Raritan Quarterly Review, American Heritage of
Invention and Technology, and Designer/Builder.