Eleven Keys to Designing Error-Resistant Medical Devices

Originally Published MDDI May 2002

HUMAN FACTORS DESIGN

Manufacturers can help reduce use error in the clinical setting by integrating good human factors practices and user-friendly design into their medical devices.

Michael E. Wiklund

Today, you may have made a mistake while operating a device. Perhaps you turned on the stove's back burner instead of the front one. Or did you switch on the car lights when you meant to turn off the windshield wipers? You may not remember. Such errors are part of daily life.

The reality is, people are not perfect, and neither are devices. Indeed, some people are especially prone to error. Similarly, the designs of some devices seem to invite errors in their use.

Most of the time, we can live with these deficiencies. But society expects more from medical workers and the devices they use. We expect doctors, for example, always to operate critical medical devices with skill and attention. Similarly, we expect those devices to be mechanically flawless, and to lessen the effects of any use errors that do occur. So when we hear the occasional news report of patient injury or death caused by a human mistake or device failure, we are shocked. Such mistakes are not supposed to happen in medical settings.

But, in fact, use error is quite common in the practice of medicine, as underscored by the 1999 Institute of Medicine report, To Err Is Human.1 Thinking realistically about medical care, we should not be surprised by its substantial rate of use error. Time pressures and fatigue are recognized as key factors leading to use error, and are abundant in the emergency room, intensive-care unit, outpatient clinic, and most other healthcare environments.

THE DESIGNER'S RESPONSIBILITY

Fortunately, medical device designers can play an active role in preventing, or at least mitigating, the effects of use error in medicine. Designers in other industries have succeeded in making nuclear power plants and airliners safer by designing their control centers to be less error-prone. For example, many of the controls built into aircraft cockpits are shape coded to avoid confusion when grasped by a pilot who is flying at night in heavy turbulence. The lever used to raise and lower the landing gear, for instance, is capped with a small wheel-shaped part, and the lever for positioning the flaps is topped with a little model flap. In a nuclear power plant control room, critical controls, such as the reactor-shutdown button, are surrounded by guards that prevent the buttons' accidental actuation—a use error that could cost the utility company several million dollars.

The medical industry has already made efforts to minimize the potential for use errors. For instance, over the past decade or so, manufacturers have worked closely with the U.S. government and national and international standards organizations to make anesthesia delivery safer. New technologies make possible the noninvasive monitoring of blood oxygen saturation and breath-by-breath carbon-dioxide levels, and the alerting of caregivers when patients might be in danger. Anesthesia machines have been improved to guard against use errors of the past, such as mistakenly turning off the flow of oxygen to a patient, neglecting to restart the ventilator following intubation, or filling an anesthetic-agent vaporizer with the wrong fluid.

Such safety improvements have resulted from detailed risk analysis and abatement efforts involving the identification of possible failure modes and effects, and either eliminating the hazard or putting protections in place. Through its regulatory actions, FDA has influenced manufacturers to make safety improvements as well, and companies have responded by applying good human factors practices at the device design stage.

GOOD HUMAN FACTORS PRACTICES

Attaching a miniature wheel to the tip of the landing gear lever to visually and tactilely reinforce the control's function is good human factors practice. The enhancement is no guarantee that a pilot will never mistake the landing-gear lever for another type of control, but it certainly reduces the chances. The same point holds true for various enhancements that can be made to anesthesia workstations, such as ensuring that oxygen valves on all machines turn in the same direction to increase flow.

Decades after the emergence of human factors engineering as a technical discipline and the establishment of good design practices, however, the number of medical devices with basic human factors shortcomings is substantial. Fortunately, FDA has initiated regulations and enforcement mechanisms in the past few years that should help eliminate these human factors flaws. Manufacturers are now responsible for conducting human factors–related design studies, which include performing usability tests, to demonstrate to the agency that a device is suitable for use. A manufacturer who lacks evidence of device usability, which relates directly to safety, runs the risk of not receiving FDA approval to bring its product to market. Product designers are therefore seeking even more detailed guidance on accepted human factors processes and design principles.

Human factors guidance is already available from many sources outside of FDA documents. Traditionally, medical device designers have looked to military standards, guidelines published by the Association for the Advancement of Medical Instrumentation (AAMI),2 and an assortment of human factors or ergonomics textbooks for information and instruction.

The sheer volume of guidance can sometimes be overwhelming, and detailed recommendations can be contradictory or mutually exclusive. To resolve these problems, AAMI is working toward producing a single, comprehensive source of guidance within the next few years.

Experts already agree, however, that a handful of design practices are especially important for protecting against common use errors. Discussions among several human factors professionals and medical device regulators familiar with common device faults yielded the following guidelines. Though incomplete, the guidelines represent a reasonable starting point for thinking about designing an error-resistant medical device.

GUIDELINES FOR DESIGN

1. Guard Critical Controls. Controls can be vulnerable to accidental or unauthorized actuation. A caregiver might, for example, accidentally bump up against a ventilator control and start or stop a critical function. The conventional solution is to guard the control so that its actuation requires a deliberate action, such as pressing and holding the power key to turn the machine on or off.

Guards can take many forms. Push buttons can be recessed or surrounded by a raised collar. Levers can incorporate interlocks, requiring the user to actuate a release mechanism before he or she can move the lever. Car makers recently adopted this approach to ensure that drivers apply the foot brake before moving an automatic transmission into drive. Some devices that incorporate a software user interface, such as a patient-programmable analgesic pump, require the caregiver to enter a code or password before operating the device. This approach is an effective way to keep unauthorized individuals—particularly hospital visitors—from meddling with control settings.
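
For devices with a software user interface, a press-and-hold requirement is one common form of guarding. The sketch below is a minimal illustration of the idea rather than an excerpt from any particular device; the class name, two-second threshold, and key-event methods are hypothetical.

```python
import time

HOLD_SECONDS = 2.0  # illustrative deliberate-action threshold


class GuardedPowerControl:
    """Power key that toggles state only after a sustained press."""

    def __init__(self):
        self.powered_on = False
        self._pressed_at = None

    def key_down(self):
        # Record when the key was pressed; nothing changes yet.
        self._pressed_at = time.monotonic()

    def key_up(self):
        # Toggle power only if the key was held long enough;
        # a brief accidental bump is ignored.
        if self._pressed_at is None:
            return
        held = time.monotonic() - self._pressed_at
        self._pressed_at = None
        if held >= HOLD_SECONDS:
            self.powered_on = not self.powered_on


if __name__ == "__main__":
    control = GuardedPowerControl()
    control.key_down()            # accidental bump
    control.key_up()
    print(control.powered_on)     # False: bump ignored

    control.key_down()            # deliberate press-and-hold
    time.sleep(HOLD_SECONDS)
    control.key_up()
    print(control.powered_on)     # True: state toggled
```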

2. Confirm Critical Actions. This design tip is closely related to guarding critical controls. The idea is to give users a chance to reconsider critical actions that are not easily reversed and to correct their mistakes. Software products often employ this strategy by requiring users to confirm a destructive action, such as deleting a file. Requiring confirmation lessens user frustration and the potential loss of an important document, although it might increase task time. In the case of medical devices, confirmation can also help prevent the loss of a file, such as a patient record, and it can prevent users from administering the wrong therapy if the user blunders and presses the wrong key. In fact, standards and regulations now mandate a two-step approach to particularly critical tasks.
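
In software, the two-step approach usually means staging a requested change and applying it only after an explicit confirmation. The sketch below illustrates the pattern under that assumption; the controller class, dose parameter, and method names are hypothetical.

```python
class TherapyController:
    """Requires an explicit confirmation step before applying a
    change that is not easily reversed, such as a new dose setting."""

    def __init__(self, dose_ml_per_hr: float):
        self.dose_ml_per_hr = dose_ml_per_hr
        self._pending_dose = None

    def request_dose_change(self, new_dose: float) -> str:
        # Step one: stage the change and prompt the user to confirm.
        self._pending_dose = new_dose
        return f"Change dose to {new_dose} mL/hr? Press CONFIRM or CANCEL."

    def confirm(self):
        # Step two: apply only an explicitly confirmed value.
        if self._pending_dose is not None:
            self.dose_ml_per_hr = self._pending_dose
            self._pending_dose = None

    def cancel(self):
        # Give the user a chance to reconsider and back out.
        self._pending_dose = None


if __name__ == "__main__":
    pump = TherapyController(dose_ml_per_hr=2.0)
    print(pump.request_dose_change(5.0))
    pump.cancel()                  # user reconsiders
    print(pump.dose_ml_per_hr)     # 2.0: unchanged

    print(pump.request_dose_change(3.0))
    pump.confirm()                 # deliberate second step
    print(pump.dose_ml_per_hr)     # 3.0
```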

3. Make Critical Information Legible and Readable. For the average medical device, all information is not created equal. Some information, such as blood pressure values or alarm messages, is vitally important and must be presented in a strictly reliable manner. There is no room for error, such as mistaking a "7" for a "1" or an "8" for a "3," because such errors could lead to inappropriate therapy. Therefore, information legibility and readability are key.

One way to make data more accessible and visible is to make it very large. That way, it is more likely to be legible even if it's presented on a smudged display that might also be reflecting glare at the time, or if it is read from a great distance. Another strategy is to ensure that displayed information contrasts sharply against its background; for example, use black characters on a white background, or the reverse. The display characters should also have an appropriate stroke height-to-width ratio so that numbers and letters are easy to discriminate.

Color-coding and segregating information is another method for helping data stand out against a potentially congested background. Finally, critical information should be placed on the surface of the device's user interface—on a control panel or top-level display, for example—rather than hidden behind a cover or in the depths of the display hierarchy.

4. Simplify and Ensure Proper Connections. Caregivers spend a considerable portion of their workday managing collections of cables, wires, and tubes that are often referred to as "spaghetti." Precious minutes are spent sorting out lines, routing them from patient to device, locating the right ports, and making secure connections. Accordingly, anything a manufacturer can do to simplify these tasks and ease equipment setup is helpful. Moreover, manufacturers must consider the myriad ways in which poor cable, wire, and tube management can lead to serious errors. In two highly publicized and tragic incidents, for example, the ECG leads of infant respiratory monitors were improperly connected to line power (one case involved an extension cord, the other a wall receptacle), and the monitored children were electrocuted.3

One protective measure to prevent accidents such as these is keying the connectors and associated ports, thereby making it physically impossible to insert the wrong cable or tube into a particular port. Visual and tactile cues, such as color- and shape-coded ports, provide additional protection by establishing associations, or mental dovetails. All other types of equipment and receptacles that might be present in the use environment also should be considered to ensure protection against improper connections. In the case of the children's apnea monitors, the leads could have been shaped and sized to prevent inadvertent insertion into a power receptacle.

5. Use Tactile Coding. Because caregivers must focus on several tasks at once, they are not always looking at a device while operating it. For example, an interventional cardiologist might be watching a television monitor for most of a case, manipulating surgical devices such as balloon catheters and dye-injection devices by feel, and possibly even operating them with a foot switch. Therefore, it is important to make devices and their associated controls recognizable by touch alone.

Tactile cues include the feel of a switch, the force required to actuate it, and the distance the switch travels. It is also possible to add audible cues, such as clicking and beeping sounds.

6. Prevent the Disabling of Life-Critical Alarms. The debate continues over permitting caregivers to disable the alarms built into life-critical devices. Proponents claim that caregivers need, and in fact demand, control over alarms so that beeping and flashing indicators will not cause distraction or produce a "cry-wolf" syndrome. Opponents claim that alarm systems perform a critical safety function that outweighs the nuisance factor. Both arguments are compelling; however, arguably the safest approach is to prevent the disabling of alarms while also making alarms smarter.

Some smart devices allow users to silence, but not disable, alarms for a predetermined period of time when a case is proceeding normally but the conditions of the moment, e.g., apnea during anesthesia induction, would trigger an alarm. Because of the time limit, such alarms will still serve their purpose if a benign condition persists until it becomes dangerous. Turning alarms off altogether increases the chance that dangerous conditions will be undetected—an unacceptable, albeit rare, outcome.
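
A timed silence can be expressed very simply in software. The sketch below assumes a fixed two-minute silence window and hypothetical class and method names; the point is only that the silence expires on its own and there is no way to disable the alarm outright.

```python
import time

SILENCE_SECONDS = 120  # illustrative maximum silence period


class Alarm:
    """Alarm that can be silenced temporarily but never disabled."""

    def __init__(self):
        self._silenced_until = 0.0

    def silence(self):
        # Suppress the audible tone for a bounded period only.
        self._silenced_until = time.monotonic() + SILENCE_SECONDS

    def should_annunciate(self, condition_active: bool) -> bool:
        # If the alarm condition persists past the silence window,
        # the alarm sounds again automatically.
        if not condition_active:
            return False
        return time.monotonic() >= self._silenced_until


if __name__ == "__main__":
    alarm = Alarm()
    print(alarm.should_annunciate(condition_active=True))   # True
    alarm.silence()
    print(alarm.should_annunciate(condition_active=True))   # False while silenced
    # After SILENCE_SECONDS, the same call returns True again.
```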

7. Present Information in a Usable Form. Converting information from one form to another introduces the opportunity for error and creates additional work. Presenting information in an immediately usable form is therefore preferred. Values should be presented in their appropriate units of measure, because unit conversions can be performed incorrectly; likewise, cryptic alarm or failure codes, abbreviations, and acronyms can be misinterpreted. To prevent these mishaps, users should be given immediate, direct access to information in its final, most usable form.

The same rule applies to forming functional associations. It is more effective to place related information, such as waveforms and numerical values, within the same functional grouping rather than making users recall and integrate the information in their minds. Accordingly, designers should look for ways to take the cognitive workload out of information displays, without depriving users of valuable details.
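
As a small illustration of the units-and-codes point above, consider the sketch below. The alarm codes, messages, and functions are hypothetical; the principle is simply that the device, not the user, performs the translation into plain language and clinical units.

```python
from typing import Dict

# Map internal alarm codes to plain-language messages so users never
# have to decode cryptic identifiers themselves.
ALARM_MESSAGES: Dict[str, str] = {
    "E-17": "Occlusion detected in the infusion line",
    "E-42": "Battery low: less than 15 minutes remaining",
}


def format_pressure(value_mmhg: float) -> str:
    # Present the value directly in its clinical unit of measure,
    # not as a raw count the user would have to convert.
    return f"{value_mmhg:.0f} mmHg"


def format_alarm(code: str) -> str:
    return ALARM_MESSAGES.get(code, f"Unknown alarm ({code}): contact service")


if __name__ == "__main__":
    print(format_pressure(93.4))   # "93 mmHg"
    print(format_alarm("E-17"))    # plain-language message, no lookup card needed
```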

8. Indicate and Limit the Number of Modes. Designers often introduce operational modes into devices in an attempt to simplify their operation. A ventilator used in a critical-care setting may incorporate pressure- or volume-controlled modes, for example, to initiate different therapeutic regimens. Other kinds of devices can have pediatric and adult modes. While operating modes can improve care and save work under normal circumstances, they also introduce the potential for operating in the wrong mode.

Consider the case in which a patient monitor was inadvertently left in demonstration mode and then used during actual patient care.4 The doctors using the monitor reportedly detected a mismatch between their clinical observations and the data displayed on the monitor: the digital readout of the patient's blood pressure never varied from 120/70. Growing suspicious of the extremely stable data, the doctors finally noticed that the monitor was locked into "demo" mode, as indicated by a discrete message on the monitor's screen. The lesson to manufacturers is to conspicuously indicate a device's operational mode so that it is apparent at a glance. Limiting the number of modes to just a few that users can commit to memory is helpful. Besides, users usually prefer to operate devices in the familiar "standard" mode; they typically disregard more-advanced modes.
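
A conspicuous mode indication can be as simple as a banner that the display renders whenever the device is in a non-clinical mode. The sketch below is a minimal illustration; the mode names and banner text are hypothetical.

```python
from enum import Enum


class Mode(Enum):
    MONITORING = "MONITORING"
    DEMO = "DEMO"


def render_banner(mode: Mode) -> str:
    # Announce a non-clinical mode conspicuously rather than with a
    # discrete message tucked into a corner of the screen.
    if mode is Mode.DEMO:
        return "*** DEMO MODE: SIMULATED DATA, NOT FOR PATIENT CARE ***"
    return "Monitoring"


if __name__ == "__main__":
    print(render_banner(Mode.DEMO))
    print(render_banner(Mode.MONITORING))
```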

9. Do Not Permit Settings to Change Automatically. Few things frustrate users as much as a device that resets itself or changes its operational state without the user's knowledge. Such changes have reportedly occurred when devices have lost power or when their power supplies were changed. One consequence of an unwanted change—such as a return to default values—can be the suspension or alteration of an ongoing therapy, placing patients at risk. At a minimum, device displays should boldly indicate any changes that were not initiated by the user. Ideally, devices should give users full control of important settings under all conditions.
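
One way to honor this guideline in software is to persist the user's settings across a power interruption and to announce boldly whenever anything other than the user has changed them. The sketch below assumes settings stored as a small JSON file; the file name, defaults, and messages are hypothetical.

```python
import json
import os
import tempfile
from typing import Dict, Tuple

DEFAULT_SETTINGS: Dict[str, float] = {"rate_ml_per_hr": 1.0, "alarm_volume": 3}


def save_settings(settings: Dict[str, float], path: str) -> None:
    # Persist user settings so a power interruption does not silently
    # revert the device to factory defaults.
    with open(path, "w") as f:
        json.dump(settings, f)


def restore_settings(path: str) -> Tuple[Dict[str, float], str]:
    # Return the settings plus a notice the display should show boldly
    # if anything other than the user changed them.
    try:
        with open(path) as f:
            return json.load(f), ""
    except (OSError, ValueError):
        return dict(DEFAULT_SETTINGS), "SETTINGS RESET TO DEFAULTS: PLEASE REVIEW"


if __name__ == "__main__":
    path = os.path.join(tempfile.gettempdir(), "pump_settings.json")
    save_settings({"rate_ml_per_hr": 2.5, "alarm_volume": 5}, path)
    settings, notice = restore_settings(path)
    print(settings, notice or "(settings restored unchanged)")
```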

10. Reduce the Potential for Negative Transfer. Negative transfer occurs when a user applies his or her experience with one device to another device that does not function in the same way. This can be a problem when patterned behavior, i.e., rote task performance, leads to a negative outcome. Take, for example, a caregiver who is accustomed to changing the parameters on a therapeutic device by pressing arrow keys. A similar device requires the user to press the arrow keys and then press a confirmation key. If the caregiver switched to the second device, he or she would be likely to enter the new value without confirming it. The consequence, attributable to negative transfer, would be that the device would still be set to the original parameter value.

The solution is to identify industry conventions or standards related to a particular device that is under development, as well as those of other devices used within the same care environment. Those conventions or standards should be followed, unless there is a compelling reason to diverge, in which case substantial divergence is preferable because it will reduce the chance of negative transfer. Minor differences might invite users to confuse the operation of the two devices.

11. Design in Automatic Checks. As microprocessors make devices smarter, adding software routines that detect possible use errors becomes more feasible. Some devices can alert users to unusual or potentially dangerous settings, such as a particularly high setting on an analgesic pump controlled by a patient. This error-prevention strategy can be thought of as an extension of a device's alarm system.
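
An automatic check can be as modest as a range test on a requested setting. The sketch below uses a hypothetical soft limit; in a real device the limits would come from a drug library or clinical protocol rather than a hard-coded constant.

```python
from typing import Optional

MAX_TYPICAL_DOSE_ML_PER_HR = 4.0  # illustrative soft limit


def check_dose(requested_dose: float) -> Optional[str]:
    # Return a warning for unusual settings, or None when the
    # requested value falls within the typical range.
    if requested_dose > MAX_TYPICAL_DOSE_ML_PER_HR:
        return (f"Requested dose {requested_dose} mL/hr exceeds the typical "
                f"maximum of {MAX_TYPICAL_DOSE_ML_PER_HR} mL/hr. Confirm to proceed.")
    return None


if __name__ == "__main__":
    print(check_dose(2.0))   # None: within typical range
    print(check_dose(9.0))   # warning that prompts an explicit confirmation
```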

CONCLUSION

Together, these design guidelines and human factors practices form one part of an overall strategy that helps reduce the occurrence of device use errors. Intuitive, ergonomic, and smarter devices can help caregivers do their jobs better, which in turn leads to better patient outcomes and fewer mishaps.

ACKNOWLEDGMENTS

The following human factors professionals provided input to this article: George Adleman (Siemens Medical Systems; Danvers, MA), John Gosbee (Veterans Health Administration; Ann Arbor, MI), Rod Hassler (Alaris Medical Systems; San Diego), Bill Muto (Abbott Laboratories; Irving, TX), Dick Sawyer (FDA, Rockville, MD), and Eric Smith (American Institutes for Research; Concord, MA).

REFERENCES

1. L Kohn, J Corrigan, and M Donaldson, eds., To Err Is Human: Building a Safer Health System (Washington, DC: National Academy Press, 2000).

2. "Human Factors Design Process for Medical Devices," ANSI/AAMI He74:2001, (Arlington, VA: Association for the Advancement of Medical Instrumentation, 2001).

3. ML Katcher, MM Shapiro, and C Guist, "Severe Injury and Death Associated with Home Infant Cardiorespiratory Monitors," Pediatrics 78, no. 5 (1986):775–779.

4. GB Ramundo and DR Larach, "A Monitor with a Mind of Its Own," Anesthesiology 82, no. 1 (1995): 317–318.

Michael E. Wiklund works for the American Institutes for Research and is a frequent contributor to MD&DI.

Illustration by Nigel Sandor

Copyright ©2002 Medical Device & Diagnostic Industry
