
Cracking the Code: The Human Factors Behind Organizational Failures with Martin Anderson


LISTEN TO THE EPISODE: 

ABOUT THE EPISODE

You don’t want to miss our latest episode of ‘Cracking the Code: The Human Factors Behind Organizational Failures’ on The Safety Guru. Join us as Martin Anderson, a renowned expert on human factors and performance, shares his valuable insights and examples about the human factors behind organizational failures. Learn how to effectively and constructively embed lessons learned in your organization.

READ THIS EPISODE

Real leaders leave a legacy. They capture the hearts and minds of their teams. Their origin story puts the safety and well-being of their people first. Great companies ubiquitously have safe yet productive operations. For those companies, safety is an investment, not a cost, for the C-suite. It’s a real topic of daily focus. This is The Safety Guru with your host, Eric Michrowski, a globally recognized ops and safety guru, public speaker, and author. Are you ready to leave a safety legacy? Your legacy success story begins now.

Hi, and welcome to The Safety Guru. Today I’m very excited to have with me Martin Anderson, who’s a human factors expert. We’re going to have a really interesting series of topics of conversation today. He’s got a deep background in human factors across oil and gas regulatory environments. His passion is really to understand how people perform in complex systems and also, ultimately, why organizations fail. So, Martin, welcome to the show. Really excited to have you with me. Let’s get started with a bit of an introduction.

Yeah, thank you very much, Eric, and certainly, thank you for having me on the show. It’s a real privilege to be invited here. So, in terms of my background, I started off with a psychology degree, and then I did a master’s in human factors. After a few years of work experience, I followed that up with a master’s in Process Safety and Loss Prevention. I’ve been a human factors specialist for over 30 years now. I’ve worked for a couple of boutique consultancies. I’ve been a regulator, working as a specialist inspector in human factors for the UK Health and Safety Executive. I spent a few years as a human factors manager in an oil and gas company. I spent a lot of time assessing existing installations but also had input into the design of new facilities, working on 40, 50-billion-dollar megaprojects. And over that time, I visited over 150 different oil, gas, and chemical facilities, both onshore and offshore, which gave me quite an insight into how some of these major organizations operate. One of the reasons I created the website humanfactors101.com was to share some of those insights. The other thing I’d like to talk about is going back 30 years, right to the start of my career.

I read a document called Organizing for Safety, published by the UK Health and Safety Executive in 1993. There’s a quote from that document I would like to read out because it had a huge impact on me at that point. It goes like this: different organizations doing similar work are known to have different safety records, and certain specific factors in the organization are related to safety. So, if we unpack that quote, it really contains two statements. First of all, different companies doing the same things have got different safety records. And secondly, perhaps more importantly, there are specific factors that could explain this difference in safety performance. And I thought this was amazing. If these factors could be identified and managed, then safety could be massively improved. And over the next 30 years or so, one disaster at a time, these organizational factors have revealed themselves in major incidents, which I guess we’ll come to in a moment.

I think that’s a great topic to get into. So why do organizations fail? Because I think when we had the original conversations, I was fascinated by some of your connections between multiple different industries and common themes that were across all of them.

Yeah, sure. What might be helpful, first of all, since you introduced me as a human factors specialist, is to briefly define what we mean by human factors, and then we’ll go into looking at some of the organizational incidents, if that’s okay. Sure. For me, human factors is composed of three main things. We’re really looking at, first of all, what people are being asked to do. That’s the work they’re doing. Secondly, who is doing it? This is all about the people. And thirdly, where are they actually working? Which is the organization. So ideally, all three of these aspects need to be considered: the work, the people, and the organization. But my experience is that companies tend to focus on just one or two of these, usually the people one. Within the UK HSE, our team defined human factors as a set of 10 topics, which has become widely known as the top 10, used by industry consultants and regulators worldwide. Because prior to that, we would turn up to do an inspection and say, we’re here to inspect your human factors. And they were like, I don’t know what you mean. How do we prepare for that?

Whom do you want to speak to? What do you want to go and look at? So, after creating that top 10, we were able to say, the agenda for the inspection is that we want to come and look at how you manage fatigue. We want to come and look at your supervision arrangements or your competency assurance system. So, this helped to operationalize human factors. The other route into human factors, really, is that a lot of people come to it through human error. They hear about human error. But if we identify human error, we need to understand how and why it occurred and not simply blame people. Are we setting people up to succeed? Are we setting them up to fail? Are we providing systems, equipment, and an environment that supports people to do the work that we’re asking them to do? And to introduce, as we move towards talking about organizational failures, I’d like to read a quote from Professor James Reason, a psychologist at the University of Manchester. This quote is about 25 years old, but it’s still one of my favorites. Reason said that rather than being the main instigators of an accident, operators tend to be the inheritors of system defects created by poor design, incorrect installation, faulty maintenance, and bad management decisions.

Their part is usually that of adding the final garnish to a lethal brew whose ingredients have already been long in the cooking. And I think that’s a really good introduction to our discussion on organizational failures.

So, let’s go there because we had a really interesting conversation on organizational failures and some of the common themes. So, what are some of the common themes, and why do organizations fail?

Exactly. When you say, why do organizations fail? Let’s just think about a few of those from different industries, because these organizational disasters have occurred to the NASA space shuttles, the Herald of Free Enterprise ferry disaster, Shenandoah, the King’s Cross fire, Piper Alpha, Chernobyl, Texas City, Buncefield, Deepwater Horizon and the Macondo blowout, lots of different rail incidents around the world, several so-called friendly fire events. And there have also been organizational disasters in sectors such as healthcare and finance. In the UK, these include inadequate care during children’s heart surgery at the Bristol Royal Infirmary over a 10-year period. And, of course, most listeners will be familiar with the so-called rogue trader that caused the collapse of Barings Bank. So, there have been so many disasters in so many different industries. And I know when we had a conversation earlier, what we were considering was that, okay, they’re all in different industries, but there are lots of common themes that we could pull out of those, from the space shuttles to Barings Bank, for instance.

So, what are some of the themes? Because the part that really caught my attention is an exercise you’ve done where you took the facts from one event and masked them. Tell me a little bit about that story, in terms of how you masked the facts from an existing event so that people thought it was something different.

Yeah. So, the example there was that… I don’t know if listeners are familiar with the Nimrod disaster. This goes back to 2006. Nimrod was a reconnaissance aircraft on a routine mission over Afghanistan. Shortly after air-to-air refueling, there was a fire which led to the loss of the aircraft and, sadly, of the 14 service personnel. And I was asked to get involved and advise that investigation. As I started to read some of the initial information from that investigation, I started to think, this sounds just like another incident I’m really familiar with, which was one of the shuttle incidents, the Columbia incident. So I put a presentation together, and on one side of the slide, I put the information from the Nimrod incident, and on the right-hand side of the slide, I put information from the Columbia incident. I went through several of the issues that were involved, and I produced this PowerPoint presentation, and I mixed up the left and right sides, and I didn’t say which was which. And when we showed it to the investigation team, they couldn’t determine which information came from the incident they were investigating, the Nimrod incident, and which information came from the shuttle Columbia incident many years previously.

It just showed that in two very different incidents, in different industries, different locations, with different people, the organizational issues were almost identical. That was quite powerful, the fact that people couldn’t tell the difference between the facts from one and the facts from the other, because the causes just overlap so much. When you look at the very detailed technical level, there are differences between these events. But when you really start looking at the deeper, broader organizational issues, there are so many similarities.

What are some of the themes in general that you’ve looked at? You mentioned Barings Bank, which sounds very different from Piper Alpha. What are some of the common themes?

It does. You think, what has the failure of a 100-year-old bank got to do with the failure of an oil refinery or an offshore oil platform or any of the other incidents that we’ve spoken about? People and organizations fail in very similar ways. The findings from these disasters are getting quite repetitive because you’re seeing the same things over and over. When you look at all of these incidents and pull out some of the main themes, what are the things that we’re seeing? Because the important thing is that we can go and look for these in an existing organization. You see things like a lot of outsourcing to contractors without proper oversight. In the nuclear industry, we call that not having intelligent customer capability, because they don’t know what the contractors are doing. They can’t explain what the contractors are doing. Then you’ve got inappropriate targets or priorities or pressures because, in almost all of these cases, there were significant production pressures, whatever production means for your organization. Another key issue that you see almost every time is a failure to manage organizational change. And by that, I mean a failure to consider the impact of that organizational change on safety.

So, a lot of organizations are going through almost a tsunami of changes and not really considering how that impacts how they manage safety, or not considering that each of those separate changes has a cumulative effect which is more powerful than the individual changes. You also see a lot of assumptions that things are safe. So even when there is evidence to the contrary, organizations assume that everything is safe, rather than going and looking for information, rather than challenging, rather than having a questioning attitude. Organizations are pretty bad at looking for bad news, responding to bad news; they don’t want to hear bad news. In almost all of the incidents that we’ve spoken about, it wasn’t a complete surprise to everybody in the organization. There were people in the organization who knew things were going wrong, that they were getting close to the boundaries of safety, but either they couldn’t get that information heard by the right people, or people didn’t react or respond to it. So it’s really interesting: when you read the detailed investigation reports, there are always people who knew that things were going wrong. That information is available in the organization.

And I think that’s a good thing because that means that, hey, this is good. We can proactively do something about this. We can go and look for some of these things. So the things that I mentioned there, and there are a lot more, Eric, that we could talk about. There are lots of organizational issues we could proactively go and look for because these incidents are devastating for the people involved, for the organizations involved, but they’re a free lesson for everybody else. Sure.

If you choose to learn from them and if you choose to see the analogy between a space shuttle, Nimrod, and Barings Bank, and whatever industry you’re in.

Yeah, exactly. Because you have to go looking for those issues, for those factors, in your organization. So, there are two things, or maybe three things, you mentioned there. You need to go looking at other incidents. You need to take the lessons from those. You need to go and look for them in your organization, and you need to act on that. So, this failure to learn from other industries, for me, is perhaps the greatest organizational failure of all. Organizations think, well, it doesn’t apply to me because that was in a children’s hospital, or that was a bank, or that was an offshore platform. What’s that got to do with me in my industry? Failure to learn those lessons is the biggest failure because you can get away from the technical specifics of the incident and just try and look at the deeper organizational issues. But who in organizations is doing this, Eric? Which person, which role, which part of the organization goes looking for these events and draws the lessons and then goes and challenges their own organization? It’s actually quite difficult to do that. It’s like the problem with safety, isn’t it? You can go into a boardroom and pitch a new product to a new market, and people will give you money, and they’ll listen to you.

But going in and pitching that you want to spend money to protect and safeguard the installation against things that may or may not happen in the future is a much harder sell. It’s a problem for safety more generally.

One of the things I know we talked about was what you call an organizational learning disability: people are good at investigating, but not at true learning, not at embedding the change. I’ve seen this many times, where people learn the same lesson over and over.

And that’s it. When we have these large investigations into these disasters, there’s always this proclamation that this must never happen again and we need to learn the lessons. And then something else happens a year or two later in a different industry, but with the same issues. So, you talked about a learning disability. Why do organizations fail to learn, given that there’s this wealth of information out there as to why organizations fail? For me, there are two issues. First, there’s this failure to learn from other industries. All industries think they’re unique. They don’t think they can learn because it’s a totally different industry. It’s nothing to do with them. But they all employ the same kinds of people. There aren’t different people working in different industries. They all employ the same people. They organize themselves in very similar ways, and they have the same targets and priorities and so on. So, first of all, there’s that assumption: it doesn’t apply to me, it’s a different sector. So, failure to learn from other industries, we’ve spoken about, but there’s also failure to learn from your own investigations. And we see this in major incidents, like NASA failing to learn from the previous incidents it had.

So, you have the Mars orbiter and a failure to learn from that. You have Challenger, then Columbia, and so on. What we find is that there’s a lot of sharing but not enough learning. After an incident, a safety bulletin is put together, it goes on the intranet, there might be a bit of a roll-out, and so on. But if you’re not changing something, you’re not learning. Something in the organization has to change for a lesson to be embedded. And you need to go back and confirm that you’ve changed the right thing. You can’t just change something and assume everything will be okay. So if you’re not changing anything structurally in the organization, or in one of the systems or one of the processes, then you’re not embedding the learning. That’s the first thing: this failure to embed the lessons that you come up with. I think the other problem is that investigations are not always of great quality. They’re not identifying the right issues. They may not be getting to the root causes. They might focus on human error. They might focus on blame. Investigations that are done by external bodies generally are starting to look at these organizational issues.

But investigations that are done internally by the organizations themselves, into their own events, rarely confront organizational failures. It’s very challenging for the investigation team to raise issues that suggest there are failures at the leadership level. It’s challenging for the investigation team, and it’s challenging for the leadership to receive that information. So quite often, the recommendations and the actions are all aimed at employees; a bit like a lot of safety initiatives, behavioral safety, safety culture, and so on, they are quite often aimed at the front-line workforce rather than the whole organization. We often see that in investigations as well: they’re not challenging these organizational issues, whether that’s because of a lack of understanding or because it’s not accepted by senior leadership. And the people doing these investigations aren’t always competent. I mean that in the nicest possible way. They don’t have the right experience, or they’re not given enough time, or it’s seen as a development opportunity. So, investigations need to have the right people doing them, asking the right questions, in order to get the right recommendations out of them. Because if the process isn’t right, you’re not going to get the right recommendations coming out of it.

So, what are you going to learn, because you haven’t got to the real issues? So yeah, I think there are two issues there: failure to learn from other industries, but also failure to learn from your own investigations. And we can talk about some tips that might help organizations get to some of those organizational issues when they’re doing investigations. Absolutely. And also, it’d be useful to talk about how you can go and look for some of these organizational issues before you actually have an incident, which is what we want to get to. We want to learn, but we don’t want to have to have incidents in order to learn. So why can’t we learn proactively, without having an incident in the first place?

This episode of The Safety Guru podcast is brought to you by Propulo Consulting, the leading safety and safety culture advisory firm. Whether you are looking to assess your safety culture, develop strategies to level up your safety performance, introduce human performance capabilities, re-energize your BBS program, enhance supervisory safety capabilities, or introduce unique safety leadership training and talent solutions, Propulo has you covered. Visit us at propulo.com.

Let’s start first in terms of how you can identify some of these organizational factors through the investigation process.

Through that investigation process, what you’re really trying to do to get to the organizational issues is to zoom out from the detail, taking a helicopter view. You’re zooming out and looking down, trying to see the bigger picture. So, for example, most people who’ve done an investigation will have put together a timeline: a list of what happened, to whom or to what equipment, and when, and then drawn a timeline and started to map what happened. But the problem is that a lot of those timelines start on the day of the event. What I’d propose is that your timeline goes back weeks, months, or even years before the event occurred. You’re trying to identify what might have changed in the organization in that period, in terms of changes to equipment, processes, people, priorities, the direction the company was going, and so on. Your timeline needs to go way back because of the organizational issues that we see in all of these events. These events didn’t just occur overnight. As Reason said in that quote, there was trouble brewing for weeks, months, and years beforehand. So, there are indications in the organization, and your timeline needs to go back and look for those issues.

That automatically forces you to think not just about the actual incident but more widely about your organization. The other thing you can really do is review previous incidents that have occurred, or other sources of data, maybe looking at audits or regulatory inspections or staff surveys. You’re trying to identify common threads and trends, and you’re trying to identify how long these conditions have existed and how extensive they are across the company. Why did this event surprise us? Because, as I say, the information is normally available in the organization. So why did this come as a surprise? You’re looking not just at individuals; you should be looking at systems. You should be looking at processes, and your mindset as an investigator should be thinking about the organizational conditions. What was the context in the organization that set people up to fail? Going back way before the incident is quite a helpful change of mindset for people, rather than just asking, okay, what happened on this day, and thinking about how you responded to the incident. It’s quite a useful tool to help you think more about organizational issues.

And how broad do you go? Because when you start going back years before, to zoom out across decisions, changes in leadership, changes in investment, you can open up a very big can of worms. And I can see for a Deepwater Horizon or a Piper Alpha that there’s a need to go deeper. But how deep and how wide do you cast the net? Because I think it’s incredibly important, like you said. Otherwise, you just limit it to the person that made a mistake, as opposed to starting to understand what’s changed in the environment, the context. Sure.

It’s a lot easier in those big disasters to do that because they’ll have a huge team of people in these investigations. Some of them have taken five, six, eight years. They have the time and the resources. In an organization, you generally don’t have that much time to do an investigation. Quite often, the people doing it have other jobs, so they want to get back to the day job. That’s one of the reasons why investigations are quite compressed in terms of time: most people are not full-time investigators. So, I think it depends on the incident that you’ve had as to how far you want to go back. But looking at whether those conditions exist in other facilities or workplaces is a useful step that can really help you identify whether this is unique to this scenario or a systemic issue that you have in your organization. I think you should go back and look at what might be key issues. So if you’ve had a merger or an acquisition, a major change in your direction, a new product, or you’ve opened a new facility, those major organizational changes; or if you had a downsizing exercise two years ago and since then there have obviously been issues in terms of staffing and resources, then those are the key things you need to be mapping out.

As you say, you can’t map everything, but you’re looking for key significant changes or events or shifts in priorities or policies that might have occurred in the previous years. And I guess the time and effort that you spend on that partly depends on the consequences, or the potential consequences, of the event that you’re looking at.

But there’s still an element of being able to focus the conversations, like you just said, on the major shifts that have happened, as opposed to unearthing every piece. You’re still rewinding the movie further back. The other part I think is interesting to explore is what you talked about in terms of how we get to know and explore some of these organizational factors before something happens. You mentioned that in all the incidents we talked about, somebody knew something was up beforehand. So how do we identify these themes before a major event?

Yeah, you’re right there, Eric. I think there’s always information available; it’s just maybe not getting to the right people, or people aren’t taking action on it. So, these warning signs, red flags, whatever you want to call them, they’re unnoticed, they’re ignored, or they’re not getting to the right person, because, as we’ve said, these incidents incubate over a long period of time. Those warnings accumulate. And that’s a great thing because it means we have an opportunity to go and look for them and to find them. So, first of all, you should have a means for people to be able to raise those concerns in an independent, confidential way, some reporting system, so that those concerns are coming to you. That’s one mechanism. Some industries are much better than others at having confidential reporting systems where people can safely report a near miss or an error or a challenge or frustration that they’re having. And that gives the organization an opportunity to do something about it. You’ve got to have the right culture for that, of course, because if your previous investigations blamed individuals, then people are not going to come forward, because they’ve seen what’s happened to other people.

So, they’re going to keep quiet, and these things get brushed under the carpet. So, it does depend on the culture that you’ve got. But having an independent, confidential way for people to raise those issues can be quite useful. So that allows issues to come to you. But you also need to go looking for these issues as well.

Yeah, I think.

That’s important. Organizations may have had quite a few events. So, do they investigate them individually, or do they try and join the dots between different incidents? They might appear unrelated, but are they? Are you starting to accept things, either conditions or behaviors, that you wouldn’t have accepted a few years ago? People’s risk acceptance might change over time. Are you contracting more out? And do you really understand the technical work that those contractors are doing? Can you explain it? Can you challenge it if necessary? Are you having lots of budget cuts? If the conversation is always around targets, budget challenges, a focus on efficiencies, productivity initiatives, and so on, that’s a really good red flag. Are you starting to focus more on temporary fixes? Are you patching equipment? Are you stretching the life of equipment rather than investing in permanent solutions? Are you maybe reacting to things rather than predicting and planning ahead? Now, organizations do lots of safety-related activities, and previous podcasts have talked about safety work and the work of safety. But if organizations start to see the completion of safety activities as more important than whether they’re effective, that’s quite often a big warning sign as well.

Companies are doing risk assessments, investigations, audits, and writing a safety case, if that applies to your industry. And if getting that done is more important than using it as a learning exercise, more important than whether it’s effective, that’s also a bit of a trigger for the organization. So, there are things you can go looking for. One of the biggest things for me, because there are lots of questions we could ask, is that if you assume that your assessment of these major risks is incorrect, and go proactively seeking information to continuously revise your assessment, you’re more likely to pick up these issues. Whereas if you assume that everything’s okay until it isn’t, it’s too late at that point. Organizations are maturing in their approach to investigations, but that maturity hasn’t carried over to proactively looking for issues. We’re getting better and better investigations, but we don’t want to have incidents to investigate. There are tools and techniques, ways you can proactively look in your organization to find these issues, but the maturity of investigations just hasn’t translated over to proactively going and looking for things.

There are lots of reasons why that might be the case.

I think it’s an interesting point, because the other element that comes to mind is that if you’ve got an incident that happened, it’s clear who owns the investigation. But who owns this proactive view? Because in some organizations, it could be audit, but audit is not always necessarily equipped to do it. I know that in one organization, the audit function did an audit of safety, and their focus in terms of driving safety improvement was to find ways to get employees back to work faster, which has no impact on safety. But from a financial standpoint, if you don’t have expertise in what safety means, that might sound like a viable solution to reduce a rate, right? It could be your safety organization, but that safety organization needs to have the right visibility. It could be some form of a red team that’s constantly looking for challenging pieces. What have you seen be most effective in terms of where this resides, and the practice around kicking the tires?

I think part of the issue there, as I alluded to earlier on, Eric, is that I just don’t think this is a formal role within organizations. The departments that you mentioned quite often don’t have the expertise, experience, or time to be able to go and look for these issues proactively. The audits, the investigations, they’re all quite constrained in their agenda and so on. So, I don’t think there is a good example that I know of, of a function in an organization that is proactively going and looking at these areas. You do have risk committees and audit committees, whether you’re looking in the financial sector or in oil and gas. I think there are pieces of the puzzle held by different people within an organization that can contribute to the review we’re talking about. But I don’t think there’s really good practice out there of how that’s been pulled together into a cohesive, proactive, challenging go-and-look to see whether we have any of these issues, particularly when you’re trying to learn from other industries. So if there’s been a big incident in one industry and a big report has come out, with lessons and recommendations in it, organizations in that industry might look at that and might go and challenge themselves.

But that’s relatively short-lived, I think. If you ask people in organizations, what were the main failures in Piper Alpha? What were the main failures of Barings Bank? What were the main failures in the shuttle incidents? A lot of people, including safety people, just can’t tell you what those organizational learnings would be. So not only are they not going looking for these things, but quite often, that experience, that understanding, is just not available, Eric. I think it’s a big gap. I think there’s a role for human factors people and systems people to be able to fulfill that role. But it’s very difficult for an organization to fund a position whose role is to go looking for things that may or may not happen, or that might be very unlikely to happen. In these times, it’s quite challenging to resource that position in an organization.

A couple of things come to mind, because I’ve seen some organizations do quite well at learning through case studies of others. So as a senior leadership team, looking at something like the 737 MAX and what transpired around it, looking at Challenger, looking at Texas City, or looking at Deepwater Horizon, and using these as case studies to ask, how could this happen here? And driving that reflection, because then you’re starting to force this learning out of the industry and push that it could potentially happen here. And the other piece I’ve seen, and you talked about the human factors piece, is some organizations that proactively, maybe every few years, run a safety culture assessment, as an example. Now, my challenge with a lot of safety culture assessments is that people will do a survey, which will give you no insights into what you’re talking about. But when I’m thinking about a robust one, you’re surveying and speaking to a lot of employees to explore what could go wrong. And you also do a review of system factors. You look at a lot of the practices, the processes, the changes, the things that have occurred over the past few years.

So essentially, you’re kicking the tires of the organization on a regular basis. But what I’m talking about is closer to really kicking the tires, looking at the system components as well as the survey analysis, because the survey alone won’t be good enough.

I think you’re right. Organizations are doing surveys; they’re running focus groups. Some leaders will be doing walk-arounds, going to facilities and talking to their staff. If they’re prepared for that, in terms of what they should ask, that can work really well. These are all activities, all tools that we have available, but I don’t think they’re typically aimed at trying to pull out these deeper organizational issues. Or maybe the different sources of information are not combined to give that overall view. Occasionally, organizations will get an independent organization in to do that review for them, which can be quite interesting. But again, that takes you back to the issue of having to learn from those recommendations as well. And we have seen quite a few cases where independent contractors who’ve been asked to come in and review an organization quite often temper their findings because they want to get continued employment from that company. We’ve seen that in some of the major financial events. Barings Bank is a good example, where the auditors did not see issues, or when they saw issues, were not communicating them to the board, because they didn’t want to alert the board to some of the issues that were there, which contributed to the demise of the bank.

So, there are lots of barriers and structural issues that might prevent some of the tools you suggested from working really effectively. But there are tools out there that can be used. We’re making general comments about what we’re seeing in industry; it’s not to say that there aren’t some organizations doing this well. I think it’d be really good to unpack those lessons and communicate them more widely, because there are pockets of good practice. I’m not saying no one’s doing anything at all here. There are pockets out there. We need to understand what they are and what is effective, and help to share those more widely for other organizations that maybe are not doing this proactively.

That’s often the tricky part, because once something goes wrong, it makes front-page news. The 737 MAX makes front-page news, multiple investigations, lots of insights, lots of learnings. But does that mean that Airbus, on the other hand, which hasn’t had such a failure, is doing all of this proactively? You don’t necessarily know, because they’re generally quieter about it. So, it could just be pure luck, or it could be good practices. And that’s the tricky part.

It could, but it could also be… If you look at an organization that’s had a few incidents or a couple of disasters, people might think, oh, well, actually X, Y, and Z is a bad company; it’s because of them. It’s the fundamental attribution error. If someone is driving poorly, you think it’s because they’re a bad driver. Whereas if you do something yourself, if you cut someone up and so on, you think, well, there are all these other reasons why I did that. So, we tend to attribute failures to people, seeing it as an issue with them, not thinking about all the contextual factors that influence behavior. So maybe that fundamental attribution error is something that’s important when we’re looking at these disasters, because it’s easy to say, well, they’re just a bad company, and that won’t happen to us. We’re different. We employ different people. We’ve got all these processes and systems, and it won’t happen to us. Risk blindness is an issue for us as well.

To touch briefly on Barings Bank, the same symptoms that appeared at Barings Bank were probably present in many other institutions, because it’s not that hard to have a rogue trader. The difference there was the size of that rogue trader’s losses, but they’re present everywhere. NAB in Australia had three rogue traders on the FX side at roughly the same time. And there are lots of other examples that don’t get reported, or get reported on page one hundred of the newspaper if you really seek them out, because it’s never a success story. But they happen a lot more often than we think.

I think they do. I think you’re right that we pick these examples, and we talk about these big disasters, partly because there’s so much information available on them. And it does become a little bit unfair that we keep going back to the same disasters, but they’re the ones on which we have the most information. They’re the ones that have been investigated to the nth degree. But you’re right, there are lots of other failures going on. Not all of them become so high-profile. We do know that lots of other organizations have similar events, but, like you say, they don’t make the press for whatever reason, and they don’t become case studies on training courses for the next 30 years. You could pick Barings Bank, and there would have been several other banks with the same issues at the same time, because they had the same processes, or lacked the same processes, as Barings Bank, but it just didn’t play out in the same way. As you say, maybe they had a huge loss, but it wasn’t enough to destroy the bank, and therefore it’s less visible to everybody else.

But you’re right, we’re picking a few case studies here because these are the ones we have detail on. But that’s not to say this isn’t occurring much more widely.

So, Martin, thank you very much for joining me. A really interesting series of topics, and the link that a lot of organizations fail for the same reasons. I think the really big takeaway is how do we learn better from investigations, and then how do we learn proactively before anything ever occurs? How do we have that questioning attitude on an ongoing basis? Because it’s too easy to close your eyes to something and think, no, it’s okay, we’re okay. And really, how do you drive that questioning attitude within the business? So, Martin, these are really interesting topics. Obviously, your website, humanfactors101.com, is an excellent source for insights. Is that the best way for somebody to reach you to get more insights?

Yes, certainly. I write quite a lot on that website, so you can go there and have a look. There’s a lot more information on there, or you can follow me on LinkedIn. If you search for Human Factors 101, you’ll find me there on LinkedIn. Please get in touch.

Excellent.

Thank you for listening to The Safety Guru on C-Suite Radio. Leave a legacy, distinguish yourself from the pack, grow your success, capture the hearts and minds of your teams, and elevate your safety. Like every successful athlete, top leaders continuously invest in their safety leadership with an expert coach to boost safety performance. Begin your journey at execsafetycoach.com. Come back in two weeks for the next episode with your host, Eric Michrowski. This podcast is powered by Propulo Consulting.

The Safety Guru with Eric Michrowski

More Episodes: https://thesafetyculture.guru/

C-Suite Radio: https://c-suitenetwork.com/radio/shows/the-safety-guru/

Powered By Propulo Consulting: https://propulo.com/

Eric Michrowski: https://ericmichrowski.com

ABOUT THE GUEST

Martin Anderson has 30 years of experience in addressing human performance issues in complex organizations. Before joining an oil and gas company in Australia as Manager of Human Factors, he played a key role in developing human factors within the UK Health & Safety Executive (HSE), leading interventions on over 150 of the UK’s most complex major hazard facilities, both onshore and offshore. He has particular interests in organizational failures, safety leadership, and investigations. Martin has contributed to the strategic direction of international associations and co-authored international guidance on a range of human factors topics.

For more information: www.humanfactors101.com.

STAY CONNECTED

RELATED EPISODE

EXECUTIVE SAFETY COACHING

Like every successful athlete, top leaders continuously invest in their Safety Leadership with an expert coach to boost safety performance.

Safety Leadership coaching has been limited, expensive, and exclusive for too long.

As part of Propulo Consulting’s subscription-based executive membership, our coaching partnership is tailored for top business executives who are motivated to improve safety leadership and commitment.
Unlock your full potential with the only Executive Safety Coaching for Ops & HSE leaders available on the market.
Explore your journey with Executive Safety Coaching at https://www.execsafetycoach.com.

Diving Deep: Navigating Organizational Learning through Storytelling with Gareth Lock


LISTEN TO THE EPISODE: 

ABOUT THE EPISODE

Dive into another captivating conversation with us as Gareth Lock returns to The Safety Guru! Tune in as Gareth dives deep into navigating organizational learning through storytelling and discusses creating an environment of shared trust to encourage vulnerable and productive structured debriefs. Gareth’s profound insights and compelling examples will unveil the hidden layers of organizational growth. Ensure you don’t miss this insightful episode!

READ THIS EPISODE

Real leaders leave a legacy. They capture the hearts and minds of their teams. Their origin story puts the safety and well-being of their people first. Great companies ubiquitously have safe yet productive operations. For those companies, safety is an investment, not a cost, for the C-suite. It’s a real topic of daily focus. This is The Safety Guru with your host, Eric Michrowski, a globally recognized ops and safety guru, public speaker, and author. Are you ready to leave a safety legacy? Your legacy success story begins now.

Hi, and welcome to The Safety Guru. Today I’m very excited to have back on our show Gareth Lock from The Human Diver. He’s an author, and he brings a lot of experience from his 25 years in the Royal Air Force, to oil and gas, to many different industries, including diving. But my favorite is his branding around Counter-errorism. So, Gareth, welcome back to the show. Tell me a little bit first about Counter-errorism and your journey into safety and diving.

Eric, thanks very much for inviting me back. As we know from last time, there’s just so much to talk about with this stuff, and I’m really quite passionate about sharing my knowledge and that journey. So, the whole piece about Counter-errorism in diving is just recognizing that we’re all fallible. My first idea for the diving business was The Fallible Diver. And people were like, that’s really negative. Well, we know that humans are fallible, so why not make it The Human Diver? It’s like, yeah, okay. And it’s both sides of the bow tie that some of your listeners might know about: the prevention piece and then the mitigation afterward, recognizing that human error is normal, the first principle of human and organizational performance. I’ve got a really broad operational background in aviation, research and development, flight trials, and procurement systems engineering. I left the Air Force in February ’15, set up my own business, and worked in oil and gas and healthcare and with software teams. But my passion is really about trying to bring this stuff into predominantly the sports diving space, though now I’m starting to work with military and scientific and commercial dive teams as well, because people are people.

We’re all wired the same way, and we all behave broadly the same way. So, the knowledge is easily transportable, as long as you can have an open mind and say, you know what, that’s the context and the behaviors that lead to error outcomes; let’s see how we can bridge that into whatever space I’m working in.

Excellent. And today, the topic we’re going to touch on is organizational learning, a very powerful and important concept that is really at the crux of safety, and more specifically the power of storytelling as it pertains to learning. So, tell me a little bit about some of the work you’ve done around learning and listening to stories.

Yeah. So, one of the challenges in any environment is getting lessons to be transferred from one person to another, and there’s the difference as well between lessons identified and lessons learned. People will experience something that’s gone wrong. They then need to take a little bit of time to reflect and unpack what’s just happened. And there’s almost an altruistic need to share that story beyond yourself. Some organizations or domains mandate or regulate reporting. In aviation, there is an obligation that says: you had an event, you are to report. Now, wouldn’t it be nice if we could actually get people to share those stories voluntarily, to get them out there? For that to happen, we’ve got to have both a psychologically safe environment, so we know that we can make those mistakes, and a just culture that recognizes that we’re all fallible. And there is this gray line that sits between acceptable and unacceptable behavior. My real interest in human factors in diving came from 2005, when I had a near miss while diving, a close call. I recovered from the situation. I got back to the UK, and I said, well, how do I report this?

Because that was my military aviation background: I had a near miss, let’s share it. I found it really difficult to do that. So, between 2005 and now, really, it’s been about trying to create an environment where people can share stories and tell stories. I’m doing a master’s degree at Lund University, and one of the things that I’m looking at there is where people share stories. What are the barriers? What are the enablers? Who will they share with? Why won’t they share? And as I’ve gone through the literature, there are a couple of reasons. Organizations would like stories, incident stories, to be shared, because they believe that they, as an organization, can learn and improve. But for that to happen, the person who’s been involved in the story has to get some value from doing that. Now, that value could be internal, so we unpack it. We’ve got a cathartic approach to sit there and go, wow, okay, that was close. What happened? What was the context? What led to that? Because actually, I don’t want that to happen again. But that’s potentially counter to what an organization wants, where they’re looking at much bigger things, or often they’re counting stories and not actually listening to or reading the narratives that are there.

So, there’s a conflict between those two purposes of storytelling following incidents. And the work from Sanne in 2008 looked at frontline railway engineers, operators, and trackside engineers. They tell stories to keep themselves and their buddies safe. My research in the diving space has shown that people share stories in a close, trusted group because they don’t want it to go further. Even though organizations talk about having psychological safety or a just culture in place, there’s often a fear that people will be ridiculed for being stupid. And if we can’t recognize and can’t accept fallibility, then the stories that get shared are not complete. So, it’s a huge opportunity, but we’ve got to create almost a theater to be able to tell those stories.

That is a very interesting point. And when you talk about stories, there was some research I was reading recently from Harvard around retention: we retain stories considerably better than statistics, with the difference in what you remember at the end of the day being to the tune of 33% versus 73%. So, substantial differences. So how do you create that environment? How do you create this setting? Because what you describe in diving, to me, sounds like a group of buddies sharing together, maybe after work. So, it’s more social learning, but it’s not necessarily embedded in the organization.

Absolutely. So how do you do it? You create an environment where people can share, where you have a structure for a debrief. Some of the original work from Gary Klein looked at firefighters: how do they make decisions in uncertain environments, under time pressure, with incomplete information? And what he noticed was that they would finish their shift, clean up their gear, and then go and grab a brew and talk about what they heard, what they smelled, what they felt, what was going through their minds. And that was as a team. So, what was happening is they were sharing and creating shared mental models within their teams. That then helped them make decisions in uncertainty, and it helps pass on tacit knowledge. So, the environment is critical. There has to be a level of trust. And you’ve got to have a norm of doing a debrief. That’s what I’ve been trying to bring into the diving space, having a structure for a debrief, because often people don’t know how to tell a story. And that’s, again, what’s come out of my research: novice divers, especially, are lacking in two things.

One is they don’t know how to tell a learning story to get a point across. And the other thing is they often don’t know what they don’t know. It’s that bit where they don’t know they’ve had a near miss, because they’ve got no real concept of what right, wrong, good, or bad looks like. And as a consequence, they’re not even looking for where things are going wrong. When we get to, I’m going to say, the more mature area of the diving space, we’re talking about instructors. Now we’ve got credibility, we’ve got reputation, we’ve got litigation involved. And in that sense, instructors won’t tell their near-miss stories because there’s this fear of, oh, look, there’s an important instructor; hang on a minute, I’m supposed to be doing some training with him, and he’s talking about mistakes that have happened. It’s like, yeah, they’re human, too. That’s no different from surgeons. Society holds surgeons on a pedestal of excellence. Or police officers operating in dynamic, uncertain environments. It’s really difficult to tell a multi-actor truthful story, because people will be able to play the news clips back or the body-cam footage back and go, hey, look, you missed that, and you missed that, because they don’t understand human fallibility.

So, this bit about how you create an environment: it’s leaders, peers, role models. And you can start in small groups and build shared trust or psychological safety. But for a start, you’ve got to know where something has gone wrong. I recently wrote about near misses: were you lucky, or were you good? Often, near misses are treated as successes rather than failures because we got a good outcome, even though we were really close. And so, we just move on, pat on the back, off you go. It takes a very different mindset to sit there and ask that question: were we lucky, or were we good? Oh, yeah, we were good. All right, what did we do that we can replicate the next time and the time after that? Or, oh, yeah, actually, we were pretty lucky then. All right, so let’s look at what we missed, and build those stories and then share them. And the problem with stories is that they get modified and changed because of the way that our memory works. We embellish certain factors, and we hide other ones, because we don’t have that psychological safety, that security, to show our vulnerabilities.

Very interesting. When you talk about storytelling and debriefing, a scenario that comes to mind is the approach that the US Army has used around after-action reviews, which were originally intended to be essentially storytelling from multiple different perspectives: walking through what we went through, whether the outcome was good or bad, but really trying to look at what we planned and where it differed from what we expected it to look like. Is that similar to what you’re describing?

Yeah, totally. You need to get into the habit of running a debrief. So often, debriefs or after-action reviews are run only when something has gone wrong. Now, if you don’t perceive that something has gone wrong, why are we running this debrief? It then loses its value, and people get out of the habit of doing it. Whereas actually, if we frame the debrief, and we can put something in the show notes, a link to a debriefing guide that I use, it follows the word DEBRIEF. The key learnings in there are the internal learning: what did I do well and why? What do I need to improve on, and how am I going to do it? And the E is the external learning, the team: what did the team do well and why? And what does the team need to improve, and how are they going to do it? The why and the how questions are the most important, because we can make an observation about something that went well or something we need to improve, but it takes a lot of thinking to say why that went well.

Or how are we going to make that improvement? And then the final part, the F of the debriefing framework, is about fix, file, or follow up. So, you’ve done an activity: you’ve planned it, you’ve briefed it, you’ve done it, you’ve debriefed it. Now that you’ve identified some lessons, what are you going to do with them? And that’s the difference between lessons learned and lessons identified. Many organizations have got loads of lessons identified, but far fewer lessons learned. The lessons you’ve learned are where you’ve looked at something, put something in place, and measured its improvement. Or actually, you realized that the intervention didn’t work, and so you’ve learned that it didn’t work. So, the difference between lessons learned and lessons identified is: did a change happen afterward? And that’s a huge piece.

It is because a lot of times, like you said, organizations learn the same thing over and over and over because the change is not embedded. It’s just something on a policy document that says thou shalt do it this way, which may or may not solve the problem or may or may not be operationalized.

Absolutely. And that takes strong leadership. I was recently involved in a major review, and the accountable individual, the duty holder for this, wouldn’t sign off the actions or the recommendations as complete until they’d actually been completed and put in place. Because one of the things the review picked up was that there were recommendations made in previous reviews that never actually got fulfilled. And it was like, hang on a minute, these were not direct contributory factors towards the event, but they did make us recognize that, hang on a minute, we’re not very good at learning here, because we capture this stuff, and we don’t fix those things that are faulty or have failed.

This episode of The Safety Guru podcast is brought to you by Propulo Consulting, the leading safety and safety culture advisory firm. Whether you are looking to assess your safety culture, develop strategies to level up your safety performance, introduce human performance capabilities, reenergize your BBS program, enhance supervisory safety capabilities, or introduce unique safety leadership training and talent solutions, Propulo has you covered. Visit us at propulo.com.

I love your storytelling approach to learning. How do you disseminate that across an organization? The divers who get together can do it casually, but how do you make sure that same insight gets cascaded to groups that can’t be there physically?

So as a direct example, what I put together is a documentary called If Only, which looks at a diving fatality through the lens of human factors and just culture. I’ll send you the link for that: thehumandiver.com/ifonly. I was really fortunate to get involved with the widow of the diver and the three surviving members of the dive team. So, we flew out to Hawaii, we did face-to-camera work, we re-enacted the event, and we shot about five and a half hours of video. That was reduced to 24, 25 minutes, and then I added some other material. And the editor said, Look, you’re going to have to make it shorter than 20 minutes. I’m like, What do I take out? I don’t know. So, I created this 34-minute documentary, which has been downloaded thousands of times. And that then goes out. And I know that people in the non-diving space have looked at this and seen themselves in it, because the failures are multiple within the system. And often it’s about psychological safety, decisions, the inability to speak up, drift, equipment not being set up correctly, which carry across many other domains as well.

So, to me, it’s the ability to share engaging, emotional, sometimes really quite powerful stories that gets the message across. That’s one way. With the blogs that I write, I often start with a story, because when you open it up, you go, What’s going to happen next? You start off with, The diver was doing such and such, and the reader goes, Right, what’s happening next? You’ve got to put a hook in there, and then you’ve got to stitch the theory into the story so that it becomes a learning lesson and people can relate to the individual. There is a really powerful bias of distancing through differencing, and this sits not just at an individual level but at an organizational level as well, where we will look at somebody or some organization and go, They’re different to us; we wouldn’t make that mistake. And you sit there and go, Yes, you would. From the diving side, there’s Under Pressure, the book that I published, and there’s another one called Close Calls, which is a similar idea. Mine has theory woven in and out; Close Calls is just stories from names across the industry.

And people like to read them. The hard part is, does it actually change people’s behavior? Because ultimately, that’s what we want to do: get people to think differently and understand the context they were in. Not to turn around and say, I wouldn’t do that, as an outcome, because by the outcome it’s too late. What we’re trying to do is spot the context developing and sit there and go, Oh, I recognize this, and I can see where the trajectory is heading. But that’s really hard to get across. Even when you’ve got well-known stories. There’s a paper I read recently from Dillon and Tinsley, or it might be just Dillon on their own, about using lessons from Challenger to get the ideas across. What they did was create a scenario of an aircraft that needed to fly some spares to a remote location, but the temperature was low, and the oil seals might leak on the engines. If the oil seals broke, they’d need to shut down the engine, they’d probably ditch, and the crew might not survive the ditching. And what was really interesting was that even though the story was told as if it were Challenger, people didn’t recognize it as Challenger.

And still, about 70-odd percent of people went, Yeah, we’ll launch. Off you go. So even when you’re given the narrative, we often can’t make the connection, because it’s just the way our brains are wired, unfortunately. It has to be really visceral. It has to be, That’s me, and I would do that.

Interesting. And I’ve seen this many times in organizations. When you talk about small groups sharing their mistakes, part of it is that there is camaraderie; people know each other. Is there a way that you’ve seen to extend this so that people don’t say, That won’t happen to me, I wouldn’t make that silly mistake? To really overcome that element, to recognize that, yes, as humans, we’re all bound to make those mistakes?

So, I’d point to the US Forest Service with the Lessons Learned Center that they’ve got. And I think the important bit is to get away from the individual and their erroneous performance and look at the context and the error-producing conditions that are there. That’s what I was referring to earlier: what goes into a good learning story is understanding what sets somebody up for failure in the scenario they’re in. Because, by definition, if we had known what the outcome of the event would be, we would have stopped it.

Sure.

So, there’s this bit about, Right, think about all those bad things that are going to happen. Yeah, well, how am I going to spot them? I don’t know the significance of those. So, what we have to do instead is ask, what can we tell from the situation developing around us? And then sit there and go, This is the system or the situation changing. Okay, that’s a flag. Not, I won’t make that mistake, but, I’m now in a situation where I’m more likely to make a mistake. Can I raise my game? Is this a flag that says, look out?

Interesting. So, move it away from the error itself to the context of the situation people are in, because then you’re more likely to relate, saying, That set of circumstances could happen to me as well.

Yeah, totally. And so, aviation moved from cockpit resource management to crew resource management, and now to threat and error management. There’s an expectation that the aircrew are competent to do what they need to do; we don’t need to train them more and more on that. The threat and error management mindset is: I’m potentially going into a busy airfield. The wind is marginal. Do I set up the opposite runway’s ILS or approach systems, or the other frequencies? The weather forecast has thunderstorms in the area, or whatever it is. It’s a potentially confusing runway. Let’s think about how we set ourselves up for success, not failure, because generally this is about sharing stories where, you know what, the situation got away from people. So, can we get ahead of things and provide that flag that says, Whoa, that’s enough? And in the majority of high-risk industries, we have something called stop work authority. My simplistic view is that often that’s an organization saying, I’m going to give you a card; if you think it’s unsafe, then hold this card up and stop the job. But most people don’t know that it’s all going horribly wrong until it’s gone wrong.

And then the organization says, Why didn’t you stop the job? Because you could see it happening. And there are a whole bunch of sociotechnical reasons why people find it hard to say stop, because there are competing goals around. So, if we can start to say, Let’s look at the conditions around us, then it’s actually easier to raise a flag.

Yeah. And it also helps people understand where they’re entering dangerous territory. Take your example of the confusing runway: there have been runways where more than one flight has almost landed, not on the runway, but on top of another airplane that was taxiing. But you know which airports those are. So, you could be on high alert if you know: okay, I’m approaching San Francisco, which is one of them, I believe, that has come up a few times, so on this approach, here’s what I need to pay extra attention to.

Yes. And we’ve got a limited capacity to pay attention. So, there’s that bit that says: actually, here’s the high-threat situation. I’m now going to, not quite ignore the other things, but point my attention. One of the things I try to get across in my training is that we’ve got a limited capacity to pay attention. So, it’s not that people weren’t paying attention, because often the response is, Pay more attention. We can’t pay more attention. What we can do is focus it somewhere else. So, what we’re trying to do is ask, what’s the threat that we’re encountering? And that comes from understanding the near misses that are out there and the context that’s encountered.

So, a rich topic. To me, organizational learning is probably one of the most challenging parts of safety, the one we keep talking about and the hardest one to do. But I love your angle: sharing stories, trying to learn on a regular, continuous basis, just so that people reflect and think through the stories. And then disseminating those stories through scenarios focused on the context, as opposed to the individual and the error they made. I think these are very powerful concepts that can hopefully help organizations move from learning the same thing over and over to learning and actually embedding the change.

Totally. And what I would say from my experience as well is that people are more likely to share a context-rich story than a closed narrative story focused on the individual. So, get more context, more system, and, if you can, multiple actors in there. There’s a paper out there showing that when an incident report has multiple narratives, people are more likely to look at system causes than with a single narrative, which is a synthesis by the investigator, who will have their own perspective. And often that’s about compliance and noncompliance, so people will look at it and produce recommendations focused on fixing the person. Whereas if you have multiple actors, you can hear the conflict and the different ideas. And when you’ve got six actors involved in an incident, expect six stories. It’s not because they’re lying; it’s because they’ve got different perceptions of what happened. So, if you’ve got the opportunity to share a multi-actor story, that’s the way to go about it.

So soon, we’ll be writing Hollywood scripts through those stories.

Well, we often have multiple actors in a story in a film.

But there’s some truth to the way you share stories, because even in Hollywood, they say there are seven story themes behind every movie that sells; rags to riches is one example. It’s a narrative that we tend to listen to. The personas and everything else get us to associate with it and then remember the story.

Totally. And there’s a paper from Drew Rae which talks about the different types of safety stories and how you share them. Do you tell the outcome first and then build it up through a different narrative? Do you tell one narrative where people jump to conclusions, and then tell the context-rich story, which brings the learning point out? So, this goes back to: what’s the purpose of the story, who’s the audience you’re trying to tell it to, and what’s the learning point you’re trying to get across?

Excellent. Well, Gareth, thank you very much for coming back to our show. I appreciate you sharing some of your thoughts about learning, organizational learning, and storytelling. I think it’s very powerful, with lots of ideas to take forward. Thank you.

Brilliant. Thank you very much, Eric. I loved being on again. Thank you.

Thank you for listening to the Safety Guru on C-suite Radio. Leave a legacy. Distinguish yourself from the pack and grow your success. Capture the hearts and minds of your teams. Elevate your safety. Like every successful athlete, top leaders continuously invest in their safety leadership with an expert coach to boost safety performance. Begin your journey at execsafetycoach.com. Come back in two weeks for the next episode with your host, Eric Michrowski. This podcast is powered by Propulo Consulting.

The Safety Guru with Eric Michrowski

More Episodes: https://thesafetyculture.guru/

C-Suite Radio: https://c-suitenetwork.com/radio/shows/the-safety-guru/

Powered By Propulo Consulting: https://propulo.com/

Eric Michrowski: https://ericmichrowski.com

ABOUT THE GUEST

Gareth Lock is the founder of The Human Diver, an organization set up to deliver education and research into the role and benefit of applying human factors, non-technical skills, psychological safety, and ‘just culture’ in sports, military, and scientific diving. He has published the book ‘Under Pressure’ and produced the documentary ‘If Only…,’ both focused on improving diving safety and performance by looking at incidents through the lens of human factors. While primarily focused on diving, he also works in other high-risk, high-uncertainty domains such as healthcare, oil & gas, maritime, and software. He is currently undertaking an MSc in HF and System Safety at Lund University where he is looking at the power (and limitations) of storytelling to improve learning.

For more information: https://www.thehumandiver.com/

The Debrief Guide: www.thehumandiver.com/debrief

If Only: www.thehumandiver.com/ifonly

Sanne (transcribed as “Santa” in the transcript) – Incident reporting or storytelling? Competing schemes in a safety-critical and hazardous work setting – http://dx.doi.org/10.1016/j.ssci.2007.06.024

Klein and firefighters – Naturalistic Decision Making – http://journals.sagepub.com/doi/10.1518/001872008X288385


EXECUTIVE SAFETY COACHING

Like every successful athlete, top leaders continuously invest in their Safety Leadership with an expert coach to boost safety performance.

Safety Leadership coaching has been limited, expensive, and exclusive for too long.

As part of Propulo Consulting’s subscription-based executive membership, our coaching partnership is tailored for top business executives who are motivated to improve safety leadership and commitment.
Unlock your full potential with the only Executive Safety Coaching for Ops & HSE leaders available on the market.
Explore your journey with Executive Safety Coaching at https://www.execsafetycoach.com.

Deep Dive into Organizational Learning and Safety Culture with Gareth Lock

Deep Dive into Organizational Learning and Safety Culture

LISTEN TO THE EPISODE: 

ABOUT THE EPISODE

“The beauty of human factors is that it’s applicable in every space. It’s just the stories that change.” In this episode, we’re excited to have Gareth Lock take us on a deep dive into organizational learning, decision-making, and safety culture through the lens of human factors. Tune in as Gareth shares practical advice for creating a shared mental model within an organization through prioritizing psychological safety and how to effectively foster a culture of embedded learning and growth.

READ THIS EPISODE

Real leaders leave a legacy. They capture the hearts and minds of their teams. Their origin story puts the safety and wellbeing of their people first. Great companies ubiquitously have safe yet productive operations. For those companies, safety is an investment, not a cost. For the C-suite, it’s a real topic of daily focus. This is The Safety Guru with your host, Eric Michrowski, a globally recognized ops and safety guru, public speaker, and author. Are you ready to leave a safety legacy? Your legacy success story begins now.

Hi, and welcome to The Safety Guru. Today I’m very excited to have with me Gareth Lock, the founder of The Human Diver and an ex-military aviator who has taken his operational experience into diving and safety. Gareth, you have a very exciting and interesting story and background, so why don’t you start us there?

Excellent. Thanks, Eric, for the invite. So yes, it’s quite a diverse background. I spent just over 25 years in the Royal Air Force as a Hercules navigator on transport aircraft, teaching and operating in both low-level and high-level operational environments. I then went into flight trials, then did some research and development work, working for an organisation like DARPA, then into systems engineering and procurement. So, I’ve got a very broad view of how systems work. Then, come 2015, I decided I was going to leave the Air Force and set up my own consultancy, which was about bringing Crew Resource Management and non-technical skills into high-risk environments. Crew Resource Management is just part and parcel of how military aviation operates. I’ve been a diver since 1999, when I was certified, and I got back into it in about 2005. And I’ve been trying to bring this view of safety and operational concepts into the diving world. So, in 2016, I set up The Human Diver. And the goal of that was really to bring crew resource management, non-technical skills, just culture, psychological safety, all the stuff that creates or influences safety, into the diving space.

So since then, I’ve written a book, put a documentary together, trained probably about 500 people face to face around the globe and about two and a half thousand people online through self-paced learning programs. And the interesting thing is, people take the materials that I’ve written, the book Under Pressure, and they’ve gone, This is not a diving book. It’s like, No, I know. And that’s the beauty of human factors: it’s applicable in every space. It’s just the stories that change. Individuals behave broadly the same way; organizations behave broadly the same way. So why can’t you take stuff from aviation or oil and gas or healthcare and move it into other spaces? And the biggest barrier is, That doesn’t apply to me because I’m not in that space, and that’s a known bias.

So, you briefly touched on CRM, which is very common, as you mentioned, in the Air Force and in civil aviation as well. Tell me a little bit more about CRM and how you think it applies to a lot of organizations.

Yeah. So, CRM is now known as Crew Resource Management. It used to be known as Cockpit Resource Management, and it came about from a number of seminal events in aviation, like Tenerife, Kegworth, and Manchester, where the analysis of flight deck recorders revealed that the crew actually knew that things were not quite going right, but they were unable to speak up and challenge what was going on. And it wasn’t until later events that they realized that the back-end crew, the cabin crew, also had a part to play in building this shared mental model. So, it then became Crew Resource Management. And what is that? It started off as communication and assertion skills. Where I’m taking it, personally, and where it should be, is about creating this shared mental model within an operational team. That could be a flight deck crew plus the cabin crew. It could be on an oil rig, where I’ve done CRM work before, where you’ve got the drill crew. In a normal business, even if it’s a high-risk business, you will have different perspectives about what’s going on. You’ve got the senior leadership, the middle management, the front-line supervisors, and the operators.

Each one of them will have a different perspective about what’s going on, and the purpose of CRM is to try and align those views as best it can. They will always be different, because they all have different perspectives. But part of CRM is also the front-line workers recognizing that senior management have got a different set of problems to solve. They don’t understand what we do. Well, that’s not their job. The purpose of CRM is to share these interlinking circles, like a Venn diagram, where there will be a thread that overlaps. So, the purpose of CRM is to increase the overlap so we’ve got shared knowledge, but not make the overlap so big that we end up with groupthink and nobody’s thinking outside the box, or the circle.

Right. So, when you were talking about this, you mentioned the shared mental model. Tell me a little bit more about how that applies to an organization, and how do you build it?

Yeah, so shared mental models: our decision-making is based on these mental models, approximations of how things will operate. And as we build experience and gain knowledge, we start to populate those models. The research shows that the more models we have, the more accurate our decisions can be, because we’ve got better, more realistic patterns to match. Now, in an organization, this happens at multiple levels. You could have something like a small team debrief, an After-Action Review, which is about sharing a very local story about how that last event went, and not just about where things went wrong, which is often where the focus is in debriefs. What went wrong? Nothing. Well, what’s the point of running a debrief? But actually, the After-Action Review is about understanding how things went and how we improve. Then you can start to grow those. The US Forest Service has got some great resources on this, looking at facilitated learning analysis, where you step up to a bigger group, a bigger team, and then you’ve got something as large as a learning review, where you’re bringing in multiple subject matter experts.

And the purpose of those learning reviews and facilitated learning analyses is to bring in multiple perspectives, conflicting perspectives. You’re never going to get a unique line that says, And this is what happened, because... and that’s uncomfortable for businesses, because they want to have one truth. Well, there is no one truth. Each level within the organization will have interactions and relationships which shape how they view the world. So, organizations need to create an environment where the bad news can be shared, where we can have constructive dissent, where we can undertake these intelligent failures, as Amy Edmondson talks about: we go out there and innovate and accept that failure is okay as long as it’s not catastrophic, because catastrophic basically means that we didn’t pick up a whole bunch of other minor failures and we were hiding those.

So, when you mention the shared mental model, you bring up a lot of examples about organizational learning, which presupposes that we’ve had some events to learn from, which any organization does. But is there something that can be done at the front end, as you’re starting to implement something, to define a shared mental model within the organization?

Well, I’ll start off by saying, look, we are a learning organization. That means that we’re going to make mistakes.

Sure.

And you know, Timothy Clark talks about the four stages of psychological safety: inclusion safety, learner safety, contributor safety, and challenger safety. Organizations want to have this challenger safety, where people speak up when things aren’t going right. So, you don’t have to have an accident, but you want to have people challenge what’s going on. But unless you feel included and you feel that you can actually make a mistake, you’re never going to get to the challenger space. So how do leaders create that environment? It’s about talking about the issues they face. It’s about opening themselves up and saying, You know what, I don’t have the answers, and here are some mistakes that I’ve made. They are going to model that vulnerability so that people are able to speak up, and there are a whole bunch of things that people can do. So, if you talk about the mental model as being a cultural frame for understanding how this works? Absolutely. You can create a learning culture within an organization, and when people bring ideas to you, awesome. Explore them. It might be that they don’t work; that’s fine, but go back to them and say, It doesn’t work because of X, Y, and Z. Or, Yes, let’s give it a go, and if we fail, we fail.

It’s not a problem, other than it might take some resources, but at the same time you might find some amazing stuff in the heads of your people. And that links to something that triggered a thought when you mentioned organizational learning. Organizations don’t learn. Organizations have memories that are created by individuals within the organization. So, it’s about how you get the knowledge out of those individuals and share it. And there’s some great work by Dave Snowden on the challenges of doing that. Because if you have a common understanding, a common vocabulary, a shared mental model of what stuff looks like, then you don’t have to spend quite so long explaining something to somebody else. But if you go to somebody who’s got no idea about what’s going on, you’ve got to spend time building a framework on which you can start hanging ideas. Because if you give somebody a whole bunch of ideas and they’re not able to abstract them or convert them into their own mindset or experiences, it’ll just go whistling past and won’t make sense. So, it often depends on the audience you’re talking to and what they know.

And it might be that you’ve got to tell a whole bunch of different stories and analogies, and bring those metaphors in so people can make that bridge. So, it’s not an easy thing to do. I get that it requires investment, and that’s often the bit that organizations don’t follow through on, because they don’t see the value in the learning.

Right. So, what are some of the ways that you’ve helped instill organizational learning? As you said, it’s really the collective memories. You talked about after-action reviews and learning reviews, which are highly interactive, team-based reflections on what I was setting out to do, what occurred, and what we can take away from it, both positive and negative. If you won a battle, you want to know what you did well, and also what didn’t go as well. So, it’s not just a post-mortem, as some people call them, that lists everything that went wrong; having been part of many of them, it’s very much constructive. So, tell me about some of the other tactics an organization that wants to embrace deeper learning can take.

This episode of The Safety Guru podcast is brought to you by Propulo Consulting, the leading safety and safety culture advisory firm. Whether you are looking to assess your safety culture, develop strategies to level up your safety performance, introduce human performance capabilities, reenergize your BBS program, enhance supervisory safety capabilities, or introduce unique safety leadership training and talent solutions, Propulo has you covered. Visit us at propulo.com.

So, one of the first things that I often do is run through the concepts of non-technical skills: how you create this shared mental model, and the fact that it’s made up of situation awareness, decision-making, communications, leadership, teamwork, and performance-shaping factors, and that these are interdependent skills. And I’ll run some workshops. I use computer-based simulations. I get people to fail in a way that isn’t professionally jeopardizing. The simulations are about flying prototype spacecraft; nobody can bring any prior knowledge, and we can mess around with team dynamics. So, people who are normally following will now lead, and the leaders are now following. And it’s often a great way of showing leaders what it’s like not to have a voice, because there might be some equipment failure which means they can’t talk. And they’re now sat there, and you can see them being really frustrated because they can see a train wreck arriving in front of them, but they can’t say anything. And so, you say, What do you think it’s like to be a follower, then, when you don’t have a voice? That’s what it’s like. So, it’s making it as experiential as possible, making it as unthreatening in a professional context as possible, digging into the details, and using a structured debrief format, which is transportable across any sort of domain.

But it’s about creating psychological safety. It’s about learning from what went well and why, and what we need to improve and how. And out of those four questions, the why and the how are the most important; observations are easy. Oh yeah, we saw that, we did this, blah, blah, blah. Okay, so why did it go well? I’ve got to think about this. And how are we going to make the improvement? It’s not enough to say, Yeah, yeah, we won’t do that. Okay, do you understand why you failed when that happened, or the improvement that’s needed? And do you know how you’re going to address it? Because if you don’t, all you’ve done is create a lesson identified. You haven’t done a lesson learned. And that’s the bigger piece as well: lessons are not learned until you have identified the thing, put something in place, and measured its effect, because otherwise it’s just a lesson identified. And so, you go into organizations and you say, Have you got a lessons learned book? Oh, yeah, we’ve got one of those. We’ll get one at the end of the project. We’ll do a sort of post-mortem.

Who looks at it before you run a project? Oh, nobody looks at it. Right. So, what you’re doing is collecting a whole bunch of data that nobody’s using, and you’re not actually feeding it forward into the next program or project to see whether or not it changes anything. It might not; the intervention might not work. Well, that’s a lesson learned too: that intervention didn’t work in that space. Okay, why? Let’s look at these things. So, learning is a continual process that requires you to take stuff from the past, match it with what you’ve got, project it into the future, and have a look: did that work? Right, we learned something, and then move on. It’s not just collecting stuff at the end of a project in a wash-up and saying, Right, stick it in the register book.

So, an analogy I often use in the safety space: I talk about learning and then embedding the learning. It’s essentially the same thing, because at the end of the day, you haven’t learned anything if you haven’t actually embedded it. There are a lot of great learnings that come from events; they get communicated and shared, and then people forget about them, and the same event continues to happen. And so, the embedding part is about change management: making sure that we check. One check is validation: is this the right correction? But in some cases, it could be that the correction isn’t being adopted or followed, which is the embedding piece. Because if you want a thousand pilots to do the same thing tomorrow, a bulletin won’t necessarily change the behavior.

Absolutely. And the other thing to bear in mind is the number of stories that happen at the sharp end, and why those stories are told. There’s a piece that I’ve just finished reading as part of my studies looking at why those stories don’t get told higher up. It’s often because the front-line operators don’t understand the organizational influences on accidents. So, when they report something, an incident, they look at the very proximal social bits at the sharp end, and they don’t understand that the genesis is often further up. So, they don’t see the value in sharing. And if they do share, they don’t necessarily draw out the analysis, and the investigation process often just focuses on fixing the worker, when workers are inheriting failures that are within the system. And it’s about how you best prepare those workers to finish the design, because those workers always finish the design of the paperwork. The paperwork is never complete, and it can never be complete. So, it’s this bit of, how do we close those gaps?

So, let’s touch on another area you mentioned. When you talked about CRM, you talked about decision-making and communication. A big part of CRM is, how do I make the decisions? And I know you do a lot of work around organizational decision-making. Can you enlighten us with some thoughts and insights on that space?

Yeah, organizational decision-making is really going to be influenced by whatever the drivers and the goals and the culture within the organization are. So, this bit about, Safety is our number one priority? Rubbish. It’s about making profit. So, if you want to create change in terms of safety decisions, how does it align with the bigger picture that’s out there? And there are some tools for this, and I’ll make a big shout-out to the guys at Red Team Thinking for the way that they manage a structured, constructive dissent program: looking at the assumptions and formally validating them. You’ve got a strategy document that says, This is how we’re going to do something, or, This is what we’re going to do going forward. That document will have lots and lots of assumptions in it. Some of them are explicit, and some of them are implied. So, go through those and say, Right, what are those assumptions? How do we know that we can validate them? And what happens if those validations are false? There are a bunch of tools to do that, but most of our decisions, even at the organizational level, will be made through emotional processes rather than logical ones.

We talk about decision-making tools like T-DODAR, which came from British Airways: Time, Diagnose, Options, Decide, Assign, Review. That’s a System 2 thinking process. Very rarely do people go through that and understand the biases they’re under, because they know what the goal is: Right, we’re going to do that. And they’ll look for evidence to reinforce their thought process and their path, rather than looking for disconfirming evidence and saying, Why is this a rubbish idea? What can go wrong? And one of those tools is a pre-mortem. That’s a great way of talking about failure as if it has already happened, and it digs into the emotion: people are happy to share stories of failure as long as they’re in the past, but they’re not quite so happy to share stories of things that might fail on them. So, a facilitator creates an environment and tells a story that says the failure has happened: you’ve now got two minutes to write down all the reasons why that thing failed. And because you compress time, people just throw stuff on the paper, and then you can go around in a structured way to explore those ideas and say, Have we got this on our risk register?

No. Okay. And it’s a great way of dealing with the emotions we have and exploiting them in a positive way.

Makes sense. So, we touched on a lot of very rich topics: CRM, organizational learning, decision-making. If somebody wants to get in touch with you, Gareth, and get more insights on all of these very rich topics, how can they go about doing it?

So, my website is thehumandiver.com. Now, it is primarily diving-focused, but as I said right at the start, this applies to anything that’s out there. Or [email protected] is the best email address for me. And you can find me on LinkedIn as well, posting pretty much every day with a whole bunch of useful stuff.

And as you said, this is not just about diving. This is about leadership. This is about being safe, and about organizational decision-making.

Absolutely.

Thank you so much for joining us.

Thank you, Eric. I really appreciate the invite.

Definitely. Thank you.

Thank you for listening to the Safety Guru on C-suite Radio. Leave a legacy. Distinguish yourself from the pack. Grow your success. Capture the hearts and minds of your teams. Elevate your safety. Like every successful athlete, top leaders continuously invest in their safety leadership with an expert coach to boost safety performance. Begin your journey at execsafetycoach.com. Come back in two weeks for the next episode with your host, Eric Michrowski. This podcast is powered by Propulo Consulting.

The Safety Guru with Eric Michrowski

More Episodes: https://thesafetyculture.guru/

C-Suite Radio: https://c-suitenetwork.com/radio/shows/the-safety-guru/

Powered By Propulo Consulting: https://propulo.com/

Eric Michrowski: https://ericmichrowski.com

ABOUT THE GUEST

Gareth Lock is the founder of The Human Diver, an organisation set up to deliver education and research into the role and benefit of applying human factors, non-technical skills, psychological safety, and ‘just culture’ in sports, military, and scientific diving. He has published the book ‘Under Pressure’ and produced the documentary ‘If Only…,’ both focused on improving diving safety and performance by looking at incidents through the lens of human factors. While primarily focused on diving, he also works in other high-risk, high-uncertainty domains such as healthcare, oil & gas, maritime, and software. He is currently undertaking an MSc in HF and System Safety at Lund University where he is looking at the power (and limitations) of storytelling to improve learning.

For more information: https://www.thehumandiver.com/

Book: www.thehumandiver.com/underpressure

Documentary (including workshop guide): www.thehumandiver.com/ifonly
