Extending the application of formal methods to analyse human error and system failure during accident investigations


Recent disasters at Bhopal, Chernobyl, Habsheim and Kegworth illustrate the point that software is rarely the sole cause of major accidents. Operator intervention, hardware faults, even weather conditions and malicious acts combine to create the conditions for failure. The aftermath of these accidents shows how difficult it is for software engineers, systems developers, forensic scientists and interface designers to predict all of the ways in which systems can fail. It is therefore important that we learn as much as possible from those failures that do occur. Unfortunately, it is often difficult to gain a coherent overview from the mass of detail contained in a typical accident report. This makes it hard for readers to identify the ‘catastrophic’ events that produced the necessary conditions for disaster. The paper argues that formal specification techniques can be used to resolve these problems. In particular, the Temporal Logic of Actions is used to build a unified account of the human errors and system failures that contributed to the Three Mile Island accident. This notation provides high-level abstractions that strip away the mass of irrelevant detail that often obscures important events during disasters. Formal proof techniques can then be applied to the model as a means of identifying the causal relationships that must be broken in order to prevent future failures.
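
To make the approach concrete: a TLA model describes each event in an accident sequence as an action, a predicate relating the state before an event (unprimed variables) to the state after it (primed variables). The following is a minimal sketch in the spirit of the paper's Three Mile Island model, not a reproduction of it; the variable and action names are illustrative assumptions, and the account of the stuck-open pilot-operated relief valve (PORV) is deliberately simplified.

\[
\mathit{ValveSticksOpen} \;\triangleq\; \mathit{porv} = \mathrm{open} \;\wedge\; \mathit{porv}' = \mathrm{open} \;\wedge\; \mathit{indicator}' = \mathrm{closed}
\]
\[
\mathit{OperatorShutsCooling} \;\triangleq\; \mathit{indicator} = \mathrm{closed} \;\wedge\; \mathit{cooling}' = \mathrm{off}
\]
\[
\mathit{Safe} \;\triangleq\; \Box\,(\mathit{indicator} = \mathrm{closed} \Rightarrow \mathit{porv} = \mathrm{closed})
\]

The first action captures the system failure (the valve sticks open while its control-room indicator reports it closed); the second captures the resulting human error (the operator, trusting the indicator, shuts off emergency cooling). Any attempt to prove the invariant \(\mathit{Safe}\) fails at \(\mathit{ValveSticksOpen}\), and the failed proof obligation isolates the causal relationship, the decoupling of valve position from its display, that a redesign must break to prevent recurrence.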

http://iet.metastore.ingenta.com/content/journals/10.1049/sej.1996.0046