Edward Tenner’s new book addresses “efficiency paradoxes” in several areas of society, including the online platform economy, the information industry, education, and transportation and travel in the age of GPS. In a chapter called “The Managed Body,” Tenner considers how the wealth of new analytical tools and systems for medicine has affected our overall health. An excerpt:

The quantified self and diagnostic artificial intelligence share two of the most troublesome obstacles to true efficiency in medical technology: false positives (with their consequence, overdiagnosis) and toxic uncertainty. The simplest case of the former may be alarm fatigue. In “Why Things Bite Back” I pointed to the epidemic of false electronic detection of home burglaries and automobile theft.

Nearly 15 years later, in 2010, the problem had not changed. Even a representative of the security alarm industry acknowledged that eight out of ten police calls proved groundless, and many police departments were still levying fines for excess calls. Efforts to make alarm technology more discriminating while avoiding false negatives have evidently been slow. Car alarms, on the other hand, have ceased to be the urban plague they were in the early 1990s thanks to improved electronic key security and reprogramming to suppress alarms from innocent sources of vibration.

In medicine there is no equivalent of the electronic key fob. And there is every incentive for manufacturers of electronic devices to issue alerts for every possible risk to the patient; if one is disregarded because there are too many, hospital staff, not the manufacturer, will be held to blame. The efficiency of medical equipment in notifying doctors and nurses of potential problems predictably makes care less efficient.

In his book “The Digital Doctor,” the professor of medicine and pioneer of modern patient safety studies Robert M. Wachter cites a lawsuit that dramatized what has become known as alarm fatigue. An 89-year-old man at one of America’s premier hospitals, Massachusetts General, died from cardiac arrest even though 10 nurses had been aware of beeps at a central station and warnings displayed on signs in the hallway. A loud bedside alarm indicating a slowing heartbeat had been switched off by an unknown hand. Mass General settled the case for $850,000. The Boston Globe, investigating the event, discovered that between June 2005 and June 2010, at least 216 patients had died because alarms failed or because medical staff were fatigued by false warnings.

A Globe reporter found that at Boston Children’s Hospital, alarms were triggered by a child pumping his legs in bed and by everyday activities like eating, burping, and working on a paper craft project. At Dr. Wachter’s own hospital, the University of California, San Francisco, there was an average of one alarm every eight minutes for each of the 66 or so intensive care patients, a total of 15,000 each day and 381,560 each month for only one of five alarm systems; together there were at least 2.5 million alerts each month in intensive care.

Serious as the hospital alarm problem has been, it is actually one of the more tractable unintended consequences of efficiency. The aviation industry has confronted false signals for decades and has developed a systematic hierarchy of warnings along with rigorous training in responding to them, as Dr. Wachter discovered in interviewing the hero pilot Chesley Sullenberger and taking the controls of a simulator.

Because only a few giant corporations build the majority of long-distance commercial aircraft, alarms are not a patchwork of signals from many vendors but a single integrated and layered program. It starts with the most urgent visual, voice, and stick-shaking signals when a plane is about to stall and crash. A second level notifies pilots of conditions that require immediate action but don’t threaten the flight path; the color red is never used at this level. There are 40 of those “warnings.” Below them are 150 or so “cautions” that demand immediate attention but do not yet require any response. . .

Through experience, Boeing engineers have identified the sensors that are most likely to create false alarms, and thus to be ignored or even disabled, and have devised more accurate alternatives.

. . . Even the most carefully structured system of aviation alerts can lead to panicked reactions if automatic operation leaves pilots unprepared. In hospitals there is no autopilot, and there are many more situations to monitor. The staff are often required to balance multiple activities, and multitasking is known to degrade performance.

. . . The efficiency of gathering health information through tests and scans contrasts with the complexity of deciding what treatment, if any, to apply. We have already seen that, at least in the second decade of the 21st century, electronic health recordkeeping may have increased rather than reduced the administrative load on physicians, making it even harder for them to find time to explain and weigh options with patients. . . The efficiency and sensitivity of testing and of algorithmic analysis of medical records may actually make health care systems less efficient and less effective in promoting the health of the entire population.

In extreme cases, such anxiety can lead to what the writer Charles Siebert called “toxic uncertainty,” a painful consciousness of being at risk and not being able to make clear-cut decisions. Newspapers regularly report such cases.

Reprinted by permission of Alfred A. Knopf Publishing Co.
