Tag Archives: operational analysis

Quis custodiet ipsos medici?

Internet security is in a position similar to that of safety in the medical industry. Many doctors have an opinion like this one, quoted by Kent Bottles:
“Only 33% of my patients with diabetes have glycated hemoglobin levels that are at goal. Only 44% have cholesterol levels at goal. A measly 26% have blood pressure at goal. All my grades are well below my institution’s targets.” And she says, “I don’t even bother checking the results anymore. I just quietly push the reports under my pile of unread journals, phone messages, insurance forms, and prior authorizations.”

Meanwhile, according to the CDC, 99,000 people die in the U.S. every year because of health-care-associated infections. That is the equivalent of an airliner crash every day, and about three times the number of deaths from automobile accidents.
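A quick back-of-the-envelope check of that comparison (the airliner capacity of roughly 270 passengers and the figure of roughly 33,000 annual U.S. traffic deaths are assumptions for illustration, not from the CDC report):

    # Back-of-the-envelope arithmetic for the comparison above.
    # Assumed figures: ~270 seats on a large airliner, ~33,000 annual
    # U.S. motor-vehicle deaths; both are illustrative, not from the CDC.
    infection_deaths_per_year = 99_000
    airliner_capacity = 270           # assumed seats on a wide-body jet
    traffic_deaths_per_year = 33_000  # assumed annual U.S. figure

    deaths_per_day = infection_deaths_per_year / 365
    print(f"{deaths_per_day:.0f} deaths per day, roughly one full airliner")
    print(f"{infection_deaths_per_year / traffic_deaths_per_year:.1f}x the traffic deaths")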

The basic medical-error problems Dennis Quaid observed when his twin babies almost died from repeated massive medically administered overdoses, and the software problems Nancy Leveson so ably analyzed in the infamous 1980s Therac-25 cancer-radiation device, are not in any way unique to computing in medicine. The solutions to those problems are analogous to some of the solutions IT security needs: measurements, plus six or seven layers of aggregation, analysis, and distribution.
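Here is a minimal sketch of what measurements plus layered aggregation, analysis, and distribution could look like in code; the layer names, the harm-rate metric, and the sample data are assumptions for illustration, not a prescription:

    # Sketch of a measurement pipeline: raw observations pass through
    # successive layers of aggregation and analysis before distribution.
    # Layer names and the example metric are illustrative assumptions.
    from collections import defaultdict

    def measure(events):
        """Layer 1: collect raw observations as (unit, harm) pairs."""
        return [(e["unit"], e["harm"]) for e in events]

    def aggregate(observations):
        """Layer 2: roll the observations up per unit."""
        totals = defaultdict(lambda: {"cases": 0, "harms": 0})
        for unit, harm in observations:
            totals[unit]["cases"] += 1
            totals[unit]["harms"] += int(harm)
        return totals

    def analyze(totals):
        """Layer 3: turn raw counts into comparable rates."""
        return {unit: t["harms"] / t["cases"] for unit, t in totals.items()}

    def distribute(rates):
        """Layer 4: publish the results where people will actually see them."""
        for unit, rate in sorted(rates.items(), key=lambda kv: -kv[1]):
            print(f"{unit}: harm rate {rate:.1%}")

    events = [{"unit": "ICU", "harm": True}, {"unit": "ICU", "harm": False},
              {"unit": "Ward B", "harm": False}]
    distribute(analyze(aggregate(measure(events))))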

As Gardiner Harris reported in the New York Times, August 20, 2010, another problem is that intravenous and feeding tubes are not distinguished by shape or color: Continue reading

What we can learn from the Therac-25

What does Nancy Leveson’s classic analysis of the Therac-25 recommend? (“An Investigation of the Therac-25 Accidents,” by Nancy Leveson, University of Washington and Clark S. Turner, University of California, Irvine, IEEE Computer, Vol. 26, No. 7, July 1993, pp. 18-41.)
“Inadequate Investigation or Followup on Accident Reports. Every company building safety-critical systems should have audit trails and analysis procedures that are applied whenever any hint of a problem is found that might lead to an accident.” p. 47

“Government Oversight and Standards. Once the FDA got involved in the Therac-25, their response was impressive, especially considering how little experience they had with similar problems in computer-controlled medical devices. Since the Therac-25 events, the FDA has moved to improve the reporting system and to augment their procedures and guidelines to include software. The input and pressure from the user group was also important in getting the machine fixed and provides an important lesson to users in other industries.” pp. 48-49

The lesson is that you need built-in audit, reporting, transparency, and user visibility for reputation.

Which is exactly what Dennis Quaid is asking for.
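For illustration, a rough sketch of what that built-in audit and reporting might look like; the record fields, the follow-up flag, and the open_items report are assumptions of the sketch, not drawn from Leveson's paper:

    # Sketch of a built-in audit trail: every hint of a problem is recorded,
    # and anything not followed up is surfaced in a visible report.
    # Field names and the follow-up rule are illustrative assumptions.
    from dataclasses import dataclass, field
    from datetime import datetime

    @dataclass
    class IncidentReport:
        device: str
        description: str
        reported_at: datetime = field(default_factory=datetime.now)
        followed_up: bool = False

    audit_trail: list[IncidentReport] = []

    def report_incident(device: str, description: str) -> None:
        audit_trail.append(IncidentReport(device, description))

    def open_items() -> list[IncidentReport]:
        """What users and regulators should be able to see."""
        return [r for r in audit_trail if not r.followed_up]

    report_incident("Therac-25", "Malfunction 54: dose discrepancy")
    for r in open_items():
        print(f"OPEN: {r.device} - {r.description} ({r.reported_at:%Y-%m-%d})")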

Remember, most of those 99,000 deaths a year from medical errors aren’t due to control of complicated therapy equipment: Continue reading

What about the Therac-25?

Someone suggested that Dennis Quaid should be reminded of the Therac-25 “if he thinks computers will reduce risk without a huge investment in quality, quality assurance and operational analysis.” For readers who may not be familiar with it, the Therac-25 was a Canadian radiation-therapy machine of the 1980s used to treat cancer. Because of poor software design and development, it had at least six major accidents and caused three fatalities.

Why should anyone assume Dennis Quaid doesn’t know that quality assurance and operational analysis are needed for anything designed or controlled by software? The man is a jet pilot, and thus must be aware of such efforts by aircraft manufacturers, airlines, and the FAA. As Quaid points out, we don’t have a major airline crash every day, yet we do have the equivalent in deaths from medical errors, many of which could be prevented by Computerized Physician Order Entry (CPOE).
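To make that concrete, here is a minimal sketch of the kind of dose-range check CPOE enables; the drug name, the neonatal limits, and the check_order function are illustrative assumptions, not a real clinical rule set:

    # Sketch of a CPOE-style dose-range check; limits and names are
    # illustrative assumptions, not a real clinical rule set.
    # The Quaid twins received heparin at 10,000 units/mL where a
    # 10 units/mL flush was intended -- exactly the kind of order a
    # range check can flag before it reaches the patient.
    NEONATAL_LIMITS_UNITS_PER_ML = {"heparin flush": (1, 10)}  # assumed range

    def check_order(drug: str, concentration_units_per_ml: float) -> str:
        low, high = NEONATAL_LIMITS_UNITS_PER_ML[drug]
        if not (low <= concentration_units_per_ml <= high):
            return (f"BLOCKED: {drug} at {concentration_units_per_ml} units/mL "
                    f"is outside the neonatal range {low}-{high}")
        return f"OK: {drug} at {concentration_units_per_ml} units/mL"

    print(check_order("heparin flush", 10))      # the intended order
    print(check_order("heparin flush", 10_000))  # the error a check should catch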

Or ask the Mayo Clinic: Continue reading