A presentation at Hacktivity in Budapest, Hungary by Kelly Shortridge
PDF link: https://swagitda.com/speaking/To-Err-is-Human-Kelly-Shortridge-Hacktivity-Keynote-2019.pdf
Humans make mistakes. Information security’s mistake is operating as if humans can be forced to never err. The illusion of “human error” being a satisfactory explanation for security failures holds us back, constricting our feedback loops and creating blind spots. We will never achieve our goal of securing complex systems if we do not analyze problems through a systemic lens rather than childishly pointing fingers.
In this talk, we will explore what we mean by “error” as well as the hindsight and outcome biases that constrain our perspective. Then, we will discuss how infosec tends to cope with failure, from blaming humans to implementing rigid procedures. Finally, we will conclude by delving into solutions infosec can implement to help our organizations better cope with failure, including the adoption of a systems perspective, chaos security engineering, and blameless culture.
The following resources were mentioned during the presentation or provide useful additional information.
PDF of the slides
Mahdavi, S., & Rahimian, M. A. (2017, August). Hindsight bias impedes learning. In Imperfect Decision Makers: Admitting Real-World Rationality (pp. 111-127).
Baron, J., & Hershey, J. C. (1988). Outcome bias in decision evaluation. Journal of Personality and Social Psychology, 54(4), 569.
Holden, R. J. (2009). People or systems? To blame is human. The fix is to engineer. Professional Safety, 54(12), 34.
Woods, D. D., Dekker, S., Cook, R., Johannesen, L., & Sarter, N. (2017). Behind human error. CRC Press.
Henriksen, K., Dayton, E., Keyes, M. A., Carayon, P., & Hughes, R. (2008). Understanding adverse events: a human factors framework. In Patient safety and quality: An evidence-based handbook for nurses. Agency for Healthcare Research and Quality (US).
Reason, J. (1990). Human error. Cambridge University Press.
Dekker, S. (2002). Punishing People or Learning from Failure? The choice is ours.
Galison, P. (2000). An accident of history. In Atmospheric flight in the twentieth century (pp. 3-43). Springer, Dordrecht.
Vicente, K. J. (2013). The human factor: Revolutionizing the way people live with technology. Routledge.
Dekker, S. W. (2003). Accidents are normal and human error does not exist: a new look at the creation of occupational safety. International Journal of Occupational Safety and Ergonomics, 9(2), 211-218.
Leveson, N. (2011). Engineering a safer world: Systems thinking applied to safety. MIT Press.
Cook, R., & Rasmussen, J. (2005). “Going solid”: a model of system dynamics and consequences for patient safety. BMJ Quality & Safety, 14(2), 130-134.
Kohlenberg, T. (2017). Red teaming probably isn’t for you.
Thaler, R. H., Sunstein, C. R., & Balz, J. P. (2010). Choice Architecture (SSRN Scholarly Paper No. ID 1583509). Rochester, NY: Social Science Research Network.
Karsh, B. T., Holden, R. J., Alper, S. J., & Or, C. K. L. (2006). A human factors engineering paradigm for patient safety: designing to support the performance of the healthcare professional. BMJ Quality & Safety, 15(suppl 1), i59-i65.
Shortridge, K. (2018). The Red Pill of Resilience.
Forsgren, N., & Shortridge, K. (2019). Controlled Chaos: the Inevitable Marriage of DevOps and Security.
Allspaw, J. (2012). Blameless PostMortems and a Just Culture.
Stratton, M. (2019). Avengers Assemble: The Thanos Incident.
Allspaw, J. (2016). Etsy’s Debriefing Facilitation Guide for Blameless Postmortems.