What is it about?

Organizations, particularly those for which safety and reliability are crucial, develop routines to protect themselves from failure. But even highly reliable organizations are not immune to disaster: prolonged periods of safe operation are punctuated by occasional catastrophes, a phenomenon labelled the “paradox of almost totally safe systems”, the idea that systems that are very safe under normal conditions may be vulnerable under unusual ones. In this paper, we analyse the loss of Air France 447. We show that an initial, relatively minor limit violation (a brief loss of speed indications) set in train a cascade of human and technological limit violations, with catastrophic consequences. Focusing on cockpit automation, we argue that the same measures that make a system safe and predictable also restrict cognition and, over time, inhibit and erode its operators’ capacity to handle disturbances. We also note limits to cognition in design processes, which make it difficult to foresee complex interactions and thereby create system vulnerabilities. We close by discussing the implications of these findings for predictability and control in other contexts and by exploring ways in which such problems might be addressed.

Read the Original

This page is a summary of: Cognition, Technology, and Organizational Limits: Lessons from the Air France 447 Disaster, Organization Science, June 2017, INFORMS, DOI: 10.1287/orsc.2017.1138.
You can read the full text via the DOI above.
