By Pat L. Clemens, PE, CSP and Warner W. Talso
[Editor’s note: This editorial originally appeared in Vol 31 Issue 1 of Hazard Prevention (now Journal of System Safety) in 1995. It has been reformatted from the original, but the text is otherwise unchanged.]
IN 1982, PAT CLEMENS wrote an article for Hazard Prevention that discussed several safety analysis techniques and documented 25 of them in a standardized format. This was the genesis of the recently published System Safety Analysis Handbook. It seemed appropriate to ask Pat to write the Foreword for the Handbook. This article is based on that Foreword and is adapted for Hazard Prevention by Warner Talso. Pat has presented a unique philosophical overview of system safety. The message is sufficiently timely and thought-provoking to merit this additional dissemination.
JUST WHAT IS SYSTEM SAFETY?
Many authors have offered definitions of system safety. There has been disappointingly little agreement among them. Even the military standard devoted to system safety has varied from one revision to another in defining just what system safety is. There is, in fact, disagreement as to whether the discipline is properly called “system safety” or “systems safety.” (The former has earned the greater favor among today’s more knowledgeable practitioners.)
Among those who practice system safety, dispute also surrounds definitions of even the most basic of concepts that are imbedded within the discipline – such elemental concepts, for example, as what constitutes a hazard, and what risk is. These inconveniences detract from orderliness in the practice of system safety. Moreover, they impede the universality of understanding necessary for communication within the community of practitioners. Because system safety is a relatively new discipline, there are some who blame these problems on its youth. It’s true enough, after all, that in the more mature disciplines such problems were long ago put to rest. There is not likely a single mechanical engineer who would quarrel with the well-known and universally understood handbook definitions for mechanical stress or hydraulic pressure.
Youth alone, or age, may not be the explanation, however. Mechanical engineering, after all, is a science-based discipline whose fundamental principles rest solely upon the physical laws of nature and on applying those laws to the solution of practical problems. This is not the case for system safety. There are no closed-form solutions available even to its most fundamental process – that of hazard discovery. And what subjective assessment of risk is wholly free of emotional bias? In system safety, one finds rich elements of philosophy interwoven with art (… and, sometimes, with guile!).
Despite uncertainty as to how to define system safety and its imbedded concepts, there is much less dispute that system safety may be described as incorporating both a doctrine of management practice and a collection of analytical methods which support practicing that management doctrine. It is the analytical methods rather than the management doctrine that are dealt with in this compendium. Those analytical methods have proliferated quite remarkably over the past two decades, and more is to be said of that below. Of the textbooks that are devoted to system safety, many treat no more than a few of the analytical methods. None treat them all. Thus, the practitioner has no single, comprehensive source book to which to turn for descriptions of them all. It is the principal purpose of this Handbook to overcome that shortcoming. Yet even this Handbook will not have succeeded at that purpose – at least, not for very long. At any moment we might choose, someone, somewhere, for better or for worse, is developing or modifying or improvising a new, “best-yet” system safety analytical method. More is to be said later on that issue, as well.
THOSE ANALYTICAL METHODS – TYPES OR TECHNIQUES?
It is too little recognized that, of the approaches to “doing” system safety, some of the so-called analytical methods are types of analysis rather than true analytical techniques. More than trivial wordplay is involved in drawing this distinction. The techniques address the how of carrying out the analysis. The types of analysis address the where, the when, or the what it is that gets analyzed. Thus, a Subsystem Hazard Analysis is a type. It deals with analysis (by whatever technique) at the subsystem level – i.e., where, rather than how. And as a type, Subsystem Hazard Analysis can be supported by applying any of a number of the techniques. Conversely, Failure Modes and Effects Analysis is a technique of analysis – a how – and it is applicable at many system levels, varying, for example, from the subsystem level down to the parts-count level.
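To make the distinction concrete (the fragment below is a hypothetical illustration, not drawn from the Handbook), an FMEA worksheet row can be rendered as a simple record. The technique – the how – is identical at every level; only the level at which the item list is drawn, and hence the type of analysis, changes.

```python
# A minimal sketch (hypothetical items and entries) showing FMEA as a
# *technique*: the worksheet format does not change with the system level
# ("type" of analysis) at which it is applied.
from dataclasses import dataclass

@dataclass
class FMEARow:
    item: str          # the element analyzed: a subsystem or a piece part
    failure_mode: str  # how the item can fail
    effect: str        # consequence of the failure at the next level up
    severity: str      # qualitative severity category

# Applied at the subsystem level, the rows support a Subsystem Hazard
# Analysis (a type of analysis)...
subsystem_row = FMEARow(
    item="hydraulic power subsystem",
    failure_mode="loss of supply pressure",
    effect="flight control surfaces unpowered",
    severity="catastrophic",
)

# ...and at the parts-count level, the identical technique applies.
part_row = FMEARow(
    item="O-ring, pump inlet fitting",
    failure_mode="extrusion under pressure spike",
    effect="external leakage; gradual loss of hydraulic fluid",
    severity="critical",
)
```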
With few exceptions, the analytical techniques are divisible into two major sets: those that rely on a hazard inventory approach (e.g., Preliminary Hazard Analysis, Failure Modes and Effects Analysis) and those that employ symbolic logic to produce a conceptual model of system behavior (e.g., Event Tree Analysis, Cause-Consequence Analysis). Some authors think of the inventory techniques as inductive, whereas the modeling techniques are deductive. And, it is worth noting, many of the techniques are simply derivatives of others. Fault Hazard Analysis, for example, is indistinguishable in its basic methodology from Failure Modes and Effects Analysis, though many would argue otherwise with an inexplicable vehemence that both marks and mars this field.
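The contrast between the two sets can be made concrete with a toy model. The sketch below is a minimal, hypothetical fragment: the event names and probabilities are invented, and the basic events are assumed statistically independent. It applies the standard gate algebra of the symbolic-logic techniques, combining basic-event probabilities through AND and OR gates into a top-event probability; an inventory technique, by contrast, would simply tabulate the same failures row by row.

```python
# A toy symbolic-logic model (fault-tree style), assuming independent
# basic events. All names and numbers are illustrative only.
from functools import reduce

def p_and(*probs):
    """AND gate: all inputs must occur; product of probabilities."""
    return reduce(lambda acc, p: acc * p, probs, 1.0)

def p_or(*probs):
    """OR gate: at least one input occurs; 1 minus product of complements."""
    return 1.0 - reduce(lambda acc, p: acc * (1.0 - p), probs, 1.0)

# Basic-event probabilities (illustrative numbers)
pump_fails   = 1e-3
valve_sticks = 5e-4
sensor_fails = 2e-3
alarm_fails  = 1e-2

# Top event: a release occurs if a process upset happens AND the
# protective layer fails to respond.
process_upset   = p_or(pump_fails, valve_sticks)   # either initiator suffices
protection_lost = p_and(sensor_fails, alarm_fails) # both layers must fail
top_event       = p_and(process_upset, protection_lost)

print(f"P(top event) = {top_event:.2e}")  # ~ 3.0e-08
```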
THE CHOICE OF METHOD – AVOIDING FAILURE AT THE OUTSET
In the rush to “do system safety,” it is too often the case that inadequate regard is given to the important business of selecting, with rational care, the particular analytical method to be used – whether that method might be a type or a technique. Methods are selected on the basis of their current popularity, their fancied potency, or the affection developed for them by the individual practitioner, rather than on the basis of their worth at dealing meaningfully with the real technical issues at hand. Recourse to a well-constructed compendium of methods is a way of rendering this pitfall less perilous. This Handbook is such a compendium.

SUPERMETHOD LIVES ON!
The search for the ideal system safety analytical method moves inexorably onward! The notion will not die that there must exist, somewhere out there, one particular analytical approach that is overwhelmingly superior to all of the others. That notion will not die as long as there are charlatans and shallow thinkers to perpetuate the myth. Of the many analytical techniques, each has its advantages and its shortcomings. Each has more or less virtue in some applications than in others. Recourse to a dispassionate, annotated compendium can help to guide in selecting the technique(s) for a specific application, but it can only help. Again, this Handbook is such a compendium.
NO SUPERMETHOD? – WELL, SHUCKS… INVENT ONE!
Just as the search among existing analytical methods for the ideal one does not end, neither does the quest to invent the cosmically universal technique go unpursued. Even as physics struggles to develop a Unified Field Theory, system safety practice seeks to produce an umbrella-style approach to which all system safety problems will succumb. Indeed, the latest, ultimate, one-size-fits-all analytical method is brought to us in the technical periodicals several times each year. (Often, these ultimate methods are given clever names that spell out catchy acronyms, and usually the papers that describe them have been given no benefit of sound technical review by peer practitioners.) The result has been a proliferation of Swiss-Army-knife-style system safety approaches that enjoy starburst popularity, then are seen no more.
Just as it was without recognized success that the van Gigch Applied General System Theory sought to provide an absolute system approach for any issue, so also have all such attempts in system safety practice failed, and largely for the same reasons. Students of operations research – a very responsible group too little heard from among system safety practitioners – are quick to point out that the variability of systems and the permutations of failure opportunities within systems make analyses of those failure opportunities intractable by a single analytical approach. It’s a rare Swiss Army knife, after all, that has both a bumper jack and a lobotomy kit in its inventory of tools. However, undaunted by the physical truth of the matter, we find among us those who do continue to develop equipment to measure the flatness of the earth with ever greater precision.
WHERE LIES LEADERSHIP?
The Baton of Excellence in system safety practice has passed from one domain of application to another during the past few decades. Among those who employ system safety methods to gauge and to control risk, the chief impetus for generating roles of leadership has been the perceived urgency of need. That element – the perceived urgency of need – was found in U.S. Air Force circles in the epoch of Minuteman development. From there, excellence in system safety practice became a DOD-wide imperative, propelled by the combined realization of success in the Air Force experience and the coming of ever more intense sources of energy under control by ever more complex systems. It moved next to nuclear power circles with the requirement for probabilistic risk assessment to support the licensing of reactor plants during the period when nuclear power seemed to offer hope of energy salvation. And from there, it came to be shared with NASA as manned rocketry had its beginnings and quantitative methods were considered necessary to ensure that risk was under appropriate control. It was there, during the Apollo days, that numerical probabilistic methods began to provide unpopular gloomy results, and NASA returned to less elegant, subjective approaches.
Now, in the post-Bhopal era and perhaps accelerated by the downsizing of the DOD, the Baton of Excellence in system safety practice has most assuredly moved into the private-sector chemical processing industry. A recent OSHA standard, 29 CFR 1910.119, will ensure that it remains there for a time – until it is seized upon by yet another field of endeavor in response to a Bhopal-like catastrophe in some other venue.
COUNTERING MALPRACTICE
Abuses abound in the contemporary practice of system safety. This is the case, largely, because much of current system safety practice rests upon “art form.” There are few if any exact solutions to problems in system safety, no matter how well the practice might be supported by sophisticated analytical trappings. Art permeates the practice. Art invites constructive innovation, of course. But art rests upon subjective judgment rather than logic. Thus, art admits abuse. In addition to the opportunity for abuse that is provided by the art-nature of system safety practice, there is insidious motivation for abuse as well. Consider the end use to which the results of system safety analyses are put. Systems are analyzed as to their hazards, and those hazards are assessed as to their risks for a single reason: to support the making of management decisions. Management must decide if system risk is acceptable or if it is not. And, if risk is not acceptable, then management must decide what is to be done, and by whom, and by when, and at what cost.
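As a deliberately simplified illustration of the assessment step that feeds such decisions, the sketch below encodes a qualitative risk matrix in the general style of MIL-STD-882. The category names and the cell assignments are illustrative assumptions, not quoted from any standard; they show only the form of the subjective judgment involved.

```python
# A minimal sketch of a qualitative risk matrix (MIL-STD-882 style).
# Category names and cell assignments are illustrative assumptions.
LIKELIHOOD = ["frequent", "probable", "occasional", "remote", "improbable"]

HIGH, SERIOUS, MEDIUM, LOW = "high", "serious", "medium", "low"
MATRIX = {
    # severity:      frequent  probable occasional remote   improbable
    "catastrophic": [HIGH,     HIGH,    HIGH,      SERIOUS, MEDIUM],
    "critical":     [HIGH,     HIGH,    SERIOUS,   MEDIUM,  LOW],
    "marginal":     [SERIOUS,  MEDIUM,  MEDIUM,    LOW,     LOW],
    "negligible":   [MEDIUM,   LOW,     LOW,       LOW,     LOW],
}

def assess(severity: str, likelihood: str) -> str:
    """Look up the qualitative risk category for one assessed hazard."""
    return MATRIX[severity][LIKELIHOOD.index(likelihood)]

# A hypothetical hazard, assessed to support a management decision:
print(assess("critical", "remote"))  # -> "medium"
```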
Management decisions, too, are often arrived at through the raw practice of art. And they are decisions in which very serious interests are vested – monetary interests, ethical interests, emotional interests, legal interests, interests involving reputation. Decisions favoring one of these interests are often in conflict with decisions favoring another. Thus, the making of management decisions is an art form that tempts introducing subtle elements of personal bias – bias to favor a particular interest. If one wishes to exercise that bias and to endow its exercise with the appearance of legitimacy, how might the wish be accommodated? Why not twist gently at the arm of another art form? Lean a bit upon the vagaries of system safety. In management practice, the distinction between a desired analytical result and one objectively reached can be indistinguishably blurred, the more so if the analytical result is one that rests on the exercise of an art rather than an exact science.
And so, abuses do abound in the practice of system safety. Some of those abuses arise out of ignorance, and some out of malevolence. And so, perhaps, it may always remain. But the abuses can be lessened in their prevalence and in their sinister character. They can be lessened through the dissemination of well-ordered information, assembled by reputable practitioners, and effectively describing the elements of responsible practice. That is the principal purpose of the System Safety Analysis Handbook.
ABOUT THE AUTHORS
Pat L. Clemens, PE, CSP
Mr. Clemens is the Corporate Safety Manager for Sverdrup Technology, Inc. He is a Director on the Executive Council of the System Safety Society. He is a Past President of the Board of Certified Safety Professionals, a recipient of the System Safety Society Educator of the Year Award, and a recipient of the IEEE Centennial Award.
Warner W. Talso
Mr. Talso is a co-editor of the System Safety Analysis Handbook published by the System Safety Society. He is currently a Safety Analyst for M. H. Chew & Associates working with the Department of Energy. He is a Lead Nuclear Quality Assurance Auditor, a Past President of the New Mexico Chapter of the System Safety Society, and a Past President of the New Mexico Chapter, Society of Logistics Engineers. He is the 1994 recipient of the System Safety Society Educator of the Year Award.