I'm reading the book "Set Phasers on Stun" by Steven Casey, a collection of true stories about human error causing accidents. In each story, someone made a mistake or a slip that led to something bad. In one, a nurse accidentally connected a power cord to an EKG lead and electrocuted a patient. The official investigation blamed it on human error: the nurse should have checked the cable. But looking at this from a human factors engineering perspective, the nurse worked with dozens of cables every day, and they didn't fit together unless they were meant to. The power cord was right beside the EKG machine, and the two connectors fit perfectly. So was bad design part of the problem? The psychology of the situation led the nurse to commit the error, but it was the design that created that situation.

There are other examples. During the Con Edison power outage of 1977, the operator couldn't make the right decisions to deal with two knocked-out lines. He didn't have a general overview of the system from his console and had to resort to talking over the phone, wasting precious time. The investigation concluded that the operator failed to take necessary action. But was the design of the display also a contributing factor? Why didn't he have the information he needed?

Can we blame humans for errors that are caused by poor design? Or are engineers equally liable for the human errors that occur during the use of their designs? I think this is a tricky issue. Users are supposed to know how to use a system, and errors do happen because of incompetence. But at the same time, if a design is bad enough that it invites errors, what are the designer's responsibilities? Should designers have a responsibility to minimize human error, to the point where they could be liable for errors arising from the use of their designs? Certainly, a design is safe if used properly. The issue is whether designers are responsible for human errors because their designs allow those errors to occur.
I don't think they should be liable, because errors happen all the time; even the most usable system is susceptible to them. But there are certainly designs that are unforgivably hard to use. The title story of the book is almost certainly a design problem. A radiation machine killed a patient because the operator made a slip while typing and quickly corrected it, but the machine couldn't handle the correction. The Therac fired a full-power beam without its shielding plate in place. The operator didn't know what had gone wrong (the display just said "Malfunction"), so the machine was reset and fired again. The patient came out in pain, and he died afterwards. It could be argued that the operator shouldn't have reset the machine after seeing "Malfunction". But it can equally be argued that the machine should never have let the beam fire while the shielding plate was off, and that it should at least have told the operator that the shielding plate was off.