Gen. George Patton bemoaned the fact that an infantryman's rifle bullet ricocheting off a piece of armor, combined with a common fuel system leak, could ignite a gasoline fire or explosion in one of his tanks. Diesel-powered tanks, such as the Soviet T-34, didn't have this weakness: diesel fuel doesn't have sufficient vapor pressure to create an explosive mixture at ambient temperatures. No amount of armor could make up for this difference.
IEEE 279-1971 required that land-based nuclear stations test their safeguard systems through every potential point of failure. Many parameters are monitored by four independent sensors, each powered by a separate source. Their outputs are combined in 2-out-of-4 (2/4) voting logic, so that even if one signal failed LO and one (by some feat of the imagination) failed HI, the remaining two of the four could still cause the safety action to occur. When you combine this with a protective action like containment isolation, which involves nearly 100 electromechanical actions, you begin to get an idea of just how extensive the testing regimen can be. Every sensor, every setpoint switch, every output relay, every logic card, every signal path to an actuator and every final device must be tested. And, since you can't afford to shut the feedwater regulating valves or the main steam isolation valves while operating, you must either perform the test during the precious few weeks of plant shutdown for refueling, or perform overlapping testing of each segment.
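If you want to see how forgiving that voting scheme is, here's a minimal sketch in Python (illustrative only, not plant logic; the channel states are hypothetical):

```python
def two_out_of_four_trip(channels):
    """Return True when at least 2 of 4 independent channels vote HI.

    A channel that has failed LO simply never votes. A channel that
    has failed HI contributes one spurious vote, which alone cannot
    cause a trip. Two healthy channels are always enough.
    """
    return sum(1 for hi in channels if hi) >= 2

# Hypothetical example: channel A failed LO (stuck False),
# channel B failed HI (stuck True), channels C and D are healthy.
channels = [False, True, True, True]   # A, B, C, D
assert two_out_of_four_trip(channels)  # the two healthy HI votes carry it

# A single failed-HI channel by itself cannot cause a spurious trip.
assert not two_out_of_four_trip([False, True, False, False])
```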
I like how Trevor Kletz put it so succinctly: whatever you don’t have, can’t leak.
That’s inherent safety.
That and using the simplest safety systems possible.
Consider, for example, the desire to shut off the flow of flammable gas into a building if a line inside should be severed. You could install gas monitors at strategic places where the lines run or the loads sit, then take a control output from those monitors to actuate an automatic valve that isolates the gas main. Pretty standard approach, right? Now start to list the many ways in which that safety system could fail to perform when circumstances demand: sensor blocked by dust or paint, sensor drifts out of calibration, setpoint set incorrectly, logic board failure, driver output failure, power supply failure, loss of air to the valve actuator, mechanical binding of the valve, etc.
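Those failure modes sit in series: the system works only if every link works. A back-of-the-envelope sketch makes the point (the per-component failure probabilities below are invented for illustration, not measured data):

```python
# Hypothetical per-component probabilities of failure on demand (PFD).
# These numbers are invented purely for illustration.
pfd = {
    "sensor (blocked/drifted/miscalibrated)": 0.02,
    "setpoint switch": 0.01,
    "logic board": 0.005,
    "driver output": 0.005,
    "power supply": 0.01,
    "valve actuator (air supply)": 0.02,
    "valve (mechanical binding)": 0.01,
}

# The system only works if every link in the chain works,
# so the availabilities multiply in series.
p_works = 1.0
for component, p_fail in pfd.items():
    p_works *= (1.0 - p_fail)

print(f"P(system works on demand) ~ {p_works:.3f}")      # ~0.923
print(f"P(system fails on demand) ~ {1 - p_works:.3f}")  # ~0.077
```

Seven links at a percent or two apiece, and the chain already fails roughly one demand in thirteen.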
Next, consider an excess flow check valve, a simple mechanical device that works like a gated diode. It's a check valve purposely installed backwards, so that it can stop forward flow when tripped, and the "gate" is differential pressure. When the differential pressure across the valve gets too great, the gate gives way and the valve mechanically slams shut. When properly sized, it will pass normal flow rates of gas to feed the loads; on a line break, the excess flow creates enough differential pressure to trip the valve shut. There is only one thing that can go wrong with that approach, and therefore only one thing to test.
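The trip condition is simple enough to write down in a few lines. A minimal sketch, assuming turbulent flow so that differential pressure scales with the square of the flow (the coefficient and trip setting are made-up numbers):

```python
def valve_trips(flow, k=0.5, dp_trip=8.0):
    """Excess flow valve: trips shut when the differential pressure
    across it exceeds the spring setting.

    For turbulent flow, dP ~ k * Q**2. Both k and dp_trip here are
    hypothetical values chosen only to illustrate the sizing.
    """
    dp = k * flow ** 2
    return dp > dp_trip

print(valve_trips(flow=2.0))   # normal load flow: dP = 2.0, stays open
print(valve_trips(flow=10.0))  # severed line, excess flow: dP = 50, slams shut
```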
The keep it simple, stupid (KISS) principle has one flaw: we're not stupid.
It’s a valuable principle when designing electrical, mechanical or solid-state devices. But even the best of control systems can be defeated by human nature.
Have you ever worked in an office in which the thermostat was under lock and key? Or the wire was intentionally disconnected from it? That's a control failure.
U.S. nuclear stations are inherently safer than the one at Chernobyl. Water-moderated reactors have a negative temperature coefficient. In other words, as the water heats up it becomes less dense, some of it expanding out of the core. If it gets hot enough it becomes steam, leaving fewer water molecules to slow neutrons down to the energies needed to sustain the chain reaction. A solid, graphite-moderated core doesn't have this displacement effect, making a runaway reaction more likely.
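To see why the sign of that coefficient matters, here's a toy feedback loop (all constants are arbitrary illustrative values, not reactor physics data): the same 20% power excursion dies away under a negative coefficient and runs away under a positive one.

```python
def simulate(alpha, steps=50, power=1.2, temp=300.0):
    """Toy feedback loop, invented for illustration: power heats the
    coolant, the coolant sheds some heat to ambient, and the
    temperature coefficient `alpha` feeds the temperature rise back
    into power as reactivity.
    """
    for _ in range(steps):
        temp += 0.1 * (power - 1.0) - 0.05 * (temp - 300.0)
        power *= 1.0 + alpha * (temp - 300.0)
    return power

# Start from the same 20% power excursion in both cases.
print(simulate(alpha=-0.05))  # negative coefficient: excursion dies away toward 1.0
print(simulate(alpha=+0.05))  # positive coefficient: excursion keeps growing (runaway)
```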
Four classic strategies of inherent safety are: substitute, minimize, moderate and simplify.
To these I would add another: human factors, also known as human-machine interface (HMI).
The best applications of inherent safety account for human nature. Smart workers are always applying their genius to find faster and less toilsome ways of getting jobs done. If you have a small maintenance department, which does both weld repair and fabrication of crates to ship out equipment, you'd do well to support them with a dedicated storage space for lumber that's handy to their shop. Otherwise, you'll most assuredly find the lumber and the welder in proximity at some time or another.
Fire doors are an important element of building safety. They protect stairwells and exit paths from fire exposure long enough to evacuate the building, and they compartmentalize a fire to inhibit its spread. They're equipped with automatic, spring-loaded door closers, so they close behind even the most forgetful person. But I'll bet each of you has seen them blocked open, either with a mechanical door stop or whatever happens to be handy. That's because there are legitimate times when they need to be held open: during a furniture move, when carrying long objects, during cleaning operations or for temporary ventilation. I'd call the door closer a safety feature, but it's not yet inherently safe from human nature, as it's so often defeated. An inherently safe design would employ an electromagnet: when the door is pushed fully open against it, the magnet holds it open; but on a loss of power, or when activated by a fire system alarm, smoke detector, etc., it closes. This design is inherently safe because it accommodates the natural desire to temporarily bypass the function, yet restores that function automatically when needed.
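The logic of that design fits in one line: the safe state is the de-energized state. A minimal sketch (the signal names are hypothetical):

```python
def door_held_open(power_on: bool, fire_alarm: bool) -> bool:
    """Magnetic hold-open: the door stays open only while the magnet
    is energized AND no alarm is active. Any failure that removes
    power, and any alarm, releases the door to its spring closer.
    """
    return power_on and not fire_alarm

print(door_held_open(power_on=True,  fire_alarm=False))  # held open for daily use
print(door_held_open(power_on=False, fire_alarm=False))  # power loss: door closes
print(door_held_open(power_on=True,  fire_alarm=True))   # alarm: door closes
```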
I posit that windshield wipers and headlights are inherently unsafe. Even though they're required by law, their function must be regularly verified at state inspection by a certified mechanic, and they're indeed safety improvements, they're not inherently safe. From one model of auto to another, they may be controlled on the left or the right of the steering wheel. They may be on the column or on the dashboard. They may use levers or rotary switches. It's not at all intuitive how to operate them. Thus, despite all efforts to make them available, they lack an element of inherent safety that involves the operator. Let's put it this way: when Wyatt Earp reached for his Colt .45 to defend his life, would he wear it on the left on some days and on the right on others? Make it a single-action one day and a double-action the next? You get my drift. You may know where the toggle switch for the hazard warning lights is on your car, but what about on the last rental car you drove?
Here’s another scenario: in one control room there were multiple controls on the west wall and a redundant set on the east. The construction electrician running conduit found it convenient to make straight runs between them. The result was that the operators found the controls in “A, B, C” order on the west side and “C, B, A” on the east side. That’s a mistake waiting to happen. When I reach for my .45, I don’t want to have to read labels. Spatial orientation is more important than labeling.
Inherent safety is important in all walks of life, not just complex process plants. One of my pet peeves is a slow cooker pot (Figure 1). Who in their right mind puts warm after high? That choice is akin to a gas pedal that increases speed up to a point, beyond which pushing it further slows the car down. Not very intuitive, is it? The Italian model got it right; the U.S. one has ruined several parties when the food wasn't done in time because the dial was sitting on warm instead of high.
The moral of the story is that we all have different gifts. The armored tank designer is good at making armor and mechanically reliable drive systems, but he needed the feedback of a general in the field to point out the inherent weakness in his choice of engines. Perhaps having a physical chemist on the team would have allowed this weakness to be eliminated at the design stage. The Crock Pot designer could have benefitted from the feedback of a real-life Betty Crocker.
Let’s strive to put the genius of every person to work.