System safety

From Wikipedia, the free encyclopedia

The system safety concept calls for a risk management strategy based on the identification and analysis of hazards and the application of remedial controls using a systems-based approach.[1] This differs from traditional safety strategies, which rely on controlling the conditions and causes of an accident based either on epidemiological analysis or on the investigation of individual past accidents.[2] The concept of system safety is useful in demonstrating the adequacy of technologies when difficulties are faced with probabilistic risk analysis.[3] The underlying principle is one of synergy: a whole is more than the sum of its parts. A systems-based approach to safety requires the application of scientific, technical and managerial skills to hazard identification, hazard analysis, and the elimination, control, or management of hazards throughout the life-cycle of a system, program, project, activity, or product.[1] HAZOP (hazard and operability study) is one of several techniques available for the identification of hazards.

System approach

A system is defined as a set or group of interacting, interrelated or interdependent elements or parts that are organized and integrated to form a collective unity, or unified whole, to achieve a common objective.[4][5] This definition emphasizes the interactions between the parts of a system and the external environment in performing a specific task or function in the context of an operational environment. The focus on interactions is meant to take a view of the expected or unexpected demands (inputs) that will be placed on the system and to ask whether necessary and sufficient resources are available to process those demands. The demands may take the form of stresses, which can be either expected, as part of normal operations, or unexpected, arising from unforeseen acts or conditions that produce beyond-normal (i.e., abnormal) stresses. This definition of a system therefore includes not only the product or the process but also the influences that the surrounding environment (including human interactions) may have on the product's or process's safety performance. Conversely, system safety also takes into account the effects of the system on its surrounding environment. Thus, the correct definition and management of interfaces becomes very important.[4][5] Broader definitions of a system include hardware, software, human systems integration, procedures and training. System safety, as part of the systems engineering process, should therefore systematically address all of these domains and areas in engineering and operations in a concerted fashion to prevent, eliminate and control hazards.

A "system" therefore has an implicit as well as explicit definition of the boundaries to which the systematic process of hazard identification, hazard analysis and control is applied. The system can range in complexity from a crewed spacecraft to an autonomous machine tool. The system safety concept helps the system designer(s) model, analyse, gain awareness of, understand and eliminate hazards, and apply controls to achieve an acceptable level of safety. Ineffective decision making in safety matters is regarded as the first step in the sequence of hazardous events in the "Swiss cheese" model of accident causation.[6] Communications regarding system risk have an important role to play in correcting risk perceptions by creating, analysing and understanding an information model that shows which factors create and control the hazardous process.[3] For almost any system, product, or service, the most effective means of limiting product liability and accident risks is to implement an organized system safety function, beginning in the conceptual design phase and continuing through development, fabrication, testing, production, use and ultimate disposal. The aim of the system safety concept is to gain assurance that a system and its associated functionality behave safely and are safe to operate. This assurance is necessary because technological advances have produced negative as well as positive effects.[1]

Root cause analysis

A root cause analysis identifies the set of multiple causes that together might create a potential accident. Root cause techniques have been successfully borrowed from other disciplines and adapted to meet the needs of the system safety concept, most notably the tree structure from fault tree analysis, which was originally an engineering technique.[7] Root cause analysis techniques can be categorised into two groups: (a) tree techniques and (b) checklist methods. There are several root cause analysis techniques, e.g. Management Oversight and Risk Tree (MORT) analysis.[2][8][9] Others are Event and Causal Factor Analysis (ECFA), Multilinear Events Sequencing, the Sequentially Timed Events Plotting Procedure, and the Savannah River Plant Root Cause Analysis System.[7]
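The tree techniques mentioned above can be illustrated with a minimal sketch of fault tree evaluation. The gate formulas below assume independent basic events, and the event names and probabilities are hypothetical, chosen only to show how a fault tree combines basic-event probabilities into a top-event probability:

```python
def and_gate(*probs):
    # AND gate: the output event occurs only if all input events occur
    # (assumes the input events are statistically independent).
    p = 1.0
    for x in probs:
        p *= x
    return p

def or_gate(*probs):
    # OR gate: the output event occurs if at least one input event occurs
    # (computed via the complement, again assuming independence).
    q = 1.0
    for x in probs:
        q *= (1.0 - x)
    return 1.0 - q

# Hypothetical basic-event probabilities (per demand).
pump_fails = 0.01
valve_sticks = 0.02
alarm_fails = 0.05
operator_misses_alarm = 0.10

# Hypothetical top event: loss of coolant flow occurs if the pump fails
# or the valve sticks, AND the failure goes unannunciated or unnoticed.
mechanical_failure = or_gate(pump_fails, valve_sticks)
undetected = or_gate(alarm_fails, operator_misses_alarm)
top_event = and_gate(mechanical_failure, undetected)
print(f"P(top event) ~ {top_event:.5f}")
```

Real fault tree tools additionally handle repeated events, minimal cut sets, and dependent failures, which this independence-based sketch does not.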

Use in other fields

Safety engineering

Safety engineering describes some of the methods used in the nuclear and other industries. Traditional safety engineering techniques focus on the consequences of human error and do not investigate the causes or reasons for its occurrence. The system safety concept can be applied to this traditional field to help identify the set of conditions for safe operation of a system. Modern, more complex military and NASA systems with computer applications and controls require functional hazard analyses and a set of detailed specifications at all levels that address the safety attributes to be inherent in the design. The process, following a system safety program plan, preliminary hazard analyses, functional hazard assessments and system safety assessments, is intended to produce evidence-based documentation that will drive safety systems that are certifiable and that will hold up in litigation. The primary focus of any system safety plan, hazard analysis and safety assessment is to implement a comprehensive process to systematically predict or identify the operational behavior of any safety-critical failure condition, fault condition or human error that could lead to a hazard and potential mishap. This is used to influence requirements that drive control strategies and safety attributes, in the form of safety design features or safety devices, to prevent, eliminate and control (mitigate) safety risk. In the distant past, hazards were the focus for very simple systems, but as technology and complexity advanced in the 1970s and 1980s, more modern and effective methods and techniques were developed using holistic approaches. Modern system safety is comprehensive and is risk-based, requirements-based, function-based and criteria-based, with goal-structured objectives to yield engineering evidence verifying that safety functionality is deterministic and that risk is acceptable in the intended operating environment.

Software-intensive systems that command, control and monitor safety-critical functions require extensive software safety analyses to influence detailed design requirements, especially in more autonomous or robotic systems with little or no operator intervention. Systems of systems, such as a modern military aircraft or fighting ship, with multiple integrated subsystems, sensor fusion, networking and interoperable systems, require close partnering and coordination with the multiple suppliers and vendors responsible for ensuring that safety is a vital attribute planned into the overall system.
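Hazard analyses of the kind described above typically assign each identified hazard a severity category and a probability category and combine them into a qualitative risk level, in the style of the hazard risk matrix found in standards such as MIL-STD-882. The sketch below is a simplified, illustrative mapping, not the matrix of any particular standard; the category names, the additive index and the thresholds are assumptions for demonstration only:

```python
# Illustrative hazard risk-index sketch (not any standard's actual matrix).
SEVERITY = {"catastrophic": 1, "critical": 2, "marginal": 3, "negligible": 4}
PROBABILITY = {"frequent": 1, "probable": 2, "occasional": 3,
               "remote": 4, "improbable": 5}

def risk_level(severity: str, probability: str) -> str:
    """Map a hazard's severity and probability categories to a risk level."""
    index = SEVERITY[severity] + PROBABILITY[probability]
    if index <= 3:
        return "high"     # unacceptable; eliminate or redesign
    if index <= 5:
        return "serious"  # mitigate via safety design features or devices
    if index <= 7:
        return "medium"   # control procedurally; track in the hazard log
    return "low"          # acceptable with routine review

# A hypothetical hazard log entry: catastrophic severity, remote probability.
print(risk_level("catastrophic", "remote"))
```

The point of such a matrix is to make risk acceptance decisions explicit and auditable: each hazard's assessed level determines who may accept the residual risk and what evidence of mitigation is required.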

Weapon system safety

Weapon system safety is an important application of the system safety field, due to the potentially destructive effects of a system failure or malfunction. Adopting a healthy, skeptical attitude toward the system at the requirements-definition and drawing-board stage, and conducting functional hazard analyses, helps in learning about the factors that create hazards and the mitigations that control them. A rigorous process is usually formally implemented as part of systems engineering to influence the design and improve the situation before errors and faults weaken the system defences and cause accidents.[1][2][3][4]

Typically, weapon systems pertaining to ships, land vehicles, guided missiles and aircraft differ in their hazards and effects; some hazards are inherent, such as explosives, and some are created by the specific operating environment (as in, for example, aircraft sustaining flight). In the military aircraft industry, safety-critical functions are identified, the overall design architecture of hardware, software and human systems integration is thoroughly analyzed, and explicit safety requirements are derived and specified during a proven hazard analysis process to establish safeguards ensuring that essential functions are not lost or that they function correctly in a predictable manner. Conducting comprehensive hazard analyses and determining the credible faults, failure conditions, contributing influences and causal factors that can contribute to or cause hazards is an essential part of the systems engineering process. Explicit safety requirements must be derived, developed, implemented and verified with objective safety evidence and ample safety documentation showing due diligence. Highly complex software-intensive systems with many complex interactions affecting safety-critical functions require extensive planning, special know-how, analytical tools, accurate models, modern methods and proven techniques. The prevention of mishaps is the objective.

References

  1. ^ Harold E. Roland; Brian Moriarty (1990). System Safety Engineering and Management. John Wiley & Sons. ISBN 0471618160.
  2. ^ Jens Rasmussen; Annelise M. Pejtersen; L. P. Goodstein (1994). Cognitive Systems Engineering. John Wiley & Sons. ISBN 0471011983.
  3. ^ Baruch Fischhoff (1995). Risk Perception and Communication Unplugged: Twenty Years of Process. Risk Analysis, Vol. 15, No. 2.
  4. ^ Alexander Kossiakoff; William N. Sweet (2003). Systems Engineering Principles and Practice. John Wiley & Sons. ISBN 0471234435.
  5. ^ Charles S. Wasson (2006). System Analysis, Design, and Development. John Wiley & Sons. ISBN 0471393339.
  6. ^ James Reason (1990). Human Error. Ashgate. ISBN 1840141042.
  7. ^ UK Health & Safety Executive (2001). Contract Research Report 321: Root Cause Analysis, Literature Review. UK HMSO. ISBN 0717619664.
  8. ^ "The Management Oversight and Risk Tree (MORT)". International Crisis Management Association. Archived from the original on 27 September 2014. Retrieved 1 October 2014.
  9. ^ Entry for MORT on the FAA Human Factors Workbench.
