[PREV - POWER_SATS]    [TOP]

HUMAN_ERROR


                                             May 20, 2008


   A common error people make is
   to distrust the very idea of                 It's a well-recognized
   automatic safety systems.                    cognitive bias: people
                                                like to feel like they're
   For example, they worry about whether        in control, hence they
   a nuclear power plant will really            virtually ignore the
   shut down automatically in case               risks of errors in car
   of trouble.                                  driving, but panic
                                                about every reported
      But getting that kind of                  airline crash.
      failsafe behavior to
      work is really pretty
      trivial as engineering
      problems go.
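
      (A sketch of the principle, in
      python, with made-up trip
      setpoints.  The point is just
      that the *default* state is
      "shut down", and it takes a
      steady stream of demonstrably
      good readings to keep running;
      a dead sensor trips the plant
      just like a bad number does.)

         TEMP_LIMIT_C = 330.0        # made-up trip setpoint
         PRESSURE_LIMIT_MPA = 16.0   # made-up trip setpoint

         def keep_running(temp_c, pressure_mpa):
             # a failed sensor reads None, and None scrams
             # the plant just like an out-of-range number:
             # doubt is resolved toward the safe state
             if temp_c is None or pressure_mpa is None:
                 return False
             if temp_c > TEMP_LIMIT_C:
                 return False
             if pressure_mpa > PRESSURE_LIMIT_MPA:
                 return False
             return True

      (The hardware version of the
      same idea is "de-energize to
      trip": the control rods are
      held up by an energized
      circuit, so losing power, a
      signal, or a sensor all drop
      into the same safe state.)
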
                                                           Sept 03, 2013
      I was working at the Nuclear
      Research Facility at the INEL when         Fukushima turned out to be a
      a small earthquake hit: all of the         contrary case: the tsunami
      reactors that were running shut            took out the emergency power
      down automatically, as they were           for the pumps needed to cool
      supposed to.                               the cores after shutdown.

         It was also not the problem at              How it is that this was
         Three Mile Island: the automatic            possible remains an open
         systems saw rising temperatures and         question as far as I know:
         shut the plant down multiple times.         did they know about the
                                                     vulnerability and
             The trouble was that                    procrastinate on fixing
             the human operators                     it?  Did they feel no need
             continually did                         to worry about a tsunami
             manual overrides of                     on this scale?
             the safety systems.

             They looked at the         And interestingly enough, once you
             low pressure gauge         point this out to people they tend to
             readings and assumed       feel better about the TMI accident.
             the high temperature
             readings must be bogus.                   They *like* the fact
                                                       that there were
             They were in the process                  screwups by the
             of switching to fancy new                 human operators.
             computerized temperature
             measurements, and trusted                    But this hardly
              the lower tech pressure                      lets the nuclear
             gauges more.                                 industry off the
                                                          hook: if human
                And the operators had been                beings are part
                trained that pressure and                 of the control
                temperature would always go               loop, then the
                up together: but that's                   reliability of
                *only* true if there's no gas             the whole system
                bubble in the system.  Gas is             depends on those
                springy and compressible, and             humans, and you'd
                it makes it possible for the              better have some
                pressure in the water to stay             way of making
                low when temperature goes up.             sure they *don't*
                                                          screw up.
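
                 (A back-of-the-envelope
                 sketch in python, using
                 rough round numbers for
                 water's stiffness and
                 thermal expansion, and
                 treating the bubble as an
                 ideal gas at constant
                 temperature.  Not reactor
                 modeling, just the
                 "springy gas" arithmetic:)

                    K_WATER = 2.2e9   # Pa, bulk modulus of water (rough)
                    BETA = 2.1e-4     # 1/K, volume expansion (rough)

                    def dp_no_bubble(dT):
                        # rigid vessel completely full of liquid:
                        # thermal expansion converts almost
                        # directly into pressure
                        return K_WATER * BETA * dT

                    def dp_with_bubble(dT, p0, v_water, v_gas):
                        # the expanding water squeezes the bubble
                        # instead; ideal gas, p*v held constant
                        # (valid while dv stays below v_gas)
                        dv = BETA * v_water * dT
                        return p0 * v_gas / (v_gas - dv) - p0

                    print(dp_no_bubble(50.0) / 1e6)    # ~23 MPa
                    print(dp_with_bubble(50.0, 15e6, 0.95, 0.05) / 1e6)    # ~3.7 MPa

                 (Fifty degrees of heating,
                 and with a five percent
                 bubble the pressure rise
                 is a small fraction of
                 what solid water would
                 show: exactly the
                 decoupling the operators
                 had been trained not to
                 expect.)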

                                                 So we need social systems
                                                 (training and management) that
                                                 create an institutionalized
                                                 commitment to safety and
                                                 ensure that the operators are
                                                 well-informed and competent.

                                                 As is often the case:
                                                 the social component
                                                 is the hard part of
                                                 technical problems.
                             TECHIES_FALLACY
                                                   But then, in the
                                                   case of TMI the
                                                   problem was largely
                                                   self-healing.

                                                       Afterwards no one
                                                       would dare shrug "oh,
                                                       it's probably a false
                                                       alarm."

                                                                  Nothing like
                                                                  trashing a
                                                                  plant to
                                                                  convince
                                                                  people that
                                                                  it's a good
                                                                  idea to take
                                                                  every warning
                                                                  seriously.


--------
[NEXT - WHOS_WE]