LISTEN TO THE EPISODE:
ABOUT THE EPISODE
“The beauty of human factors is that it’s applicable in every space. It’s just the stories that change.” In this episode, we’re excited to have Gareth Lock take us on a deep dive into organizational learning, decision-making, and safety culture through the lens of human factors. Tune in as Gareth shares practical advice for creating a shared mental model within an organization by prioritizing psychological safety, and for effectively fostering a culture of embedded learning and growth.
READ THIS EPISODE
Real leaders leave a legacy. They capture the hearts and minds of their teams. Their origin story puts the safety and wellbeing of their people first. Great companies ubiquitously have safe yet productive operations. For those companies, safety is an investment, not a cost. For the C-suite, it's a real topic of daily focus. This is The Safety Guru with your host, Eric Michrowski, a globally recognized Ops safety guru, public speaker, and author. Are you ready to leave a safety legacy? Your legacy success story begins now.
Hi, and welcome to The Safety Guru. Today I'm very excited to have with me Gareth Lock, the founder of The Human Diver and an ex-military aviator who's taken his operational experience into diving and safety. Gareth, you have a very exciting and interesting story and background, so why don't you start us there?
Excellent. Thanks, Eric, for the invite. So yes, it's quite a diverse background. I spent just over 25 years in the Royal Air Force as a Hercules navigator on transport aircraft, teaching and operating in both low-level and high-level operational environments. I then went into flight trials, then did some research and development work, working for an organisation like DARPA, then into systems engineering and procurement. So, I've got a very broad view of how systems work. Then, come 2015, I decided I was going to leave the Air Force and set up my own consultancy, which was about bringing Crew Resource Management non-technical skills into high-risk environments. Crew Resource Management is just part and parcel of how military aviation operates. And I've been a diver since 1999, when I was certified, and then got back into it in about 2005. And I've been trying to bring this view of safety and operational concepts into the diving world. So, in 2016, I set up The Human Diver. And the goal of that was really to bring Crew Resource Management, non-technical skills, just culture, psychological safety, all the stuff that creates safety or influences safety, into the diving space.
So since then, I've written a book, put a documentary together, trained probably about 500 people face to face around the globe, and about two and a half thousand people through online self-paced learning programs. And the interesting thing is that people take the materials that I've written, the book that I've written, Under Pressure, and they've gone, "This is not a diving book." It's like, "No, I know." And that's the beauty of human factors: it's applicable in every space. It's just the stories that change. Individuals behave broadly the same way; organizations behave broadly the same way. So why can't you take general lessons from aviation or oil and gas or healthcare and move them into other spaces? And the biggest barrier is "that doesn't apply to me because I'm not in that space," and that's a known bias.
So, you briefly touched on CRM, which is very common, as you mentioned, in the Air Force and in civil aviation as well. Tell me a little bit more about CRM and how you think it applies to a lot of organizations.
Yeah. So, CRM is now known as Crew Resource Management. It used to be known as Cockpit Resource Management, and it came about from a number of seminal events in aviation, like Tenerife, Kegworth, and Manchester, where the analysis of flight deck recorders recognized that the crew actually knew that there were things not quite going right, but they were unable to speak up and challenge what was going on. And it wasn't until later events that they realized that the back-end crew, the cabin crew, also had a part to play in building this shared mental model. So, it then became Crew Resource Management. And what is that? It started off as communication and assertion skills. Where I'm taking it personally, and where it should be, is about creating this shared mental model within an operational team. So that could be a flight deck crew plus the cabin crew. It could be on an oil rig, where I've done CRM work before, where you've got the drill crew. In a normal business, even if it's a high-risk business, you will have different perspectives about what's going on. You've got the senior leadership, the middle management, the front-line supervisors, and the operators.
Each one of them will have a different perspective about what's going on. And the purpose of CRM is to try and align those views as best they can. They will always be different, because they all have different perspectives. But that's also part of CRM: the fact that the front-line workers recognize that the senior management have got a different set of problems to solve. "They don't understand what we do." Well, that's not their job. The purpose of CRM is to see these as interlinking circles, like a Venn diagram, where there will be an area that overlaps. And so, the purpose of CRM is to increase the overlap, so we've got shared knowledge, but not to make the overlap so large that we end up with groupthink and nobody's thinking outside the box, or the circle.
Right. So, when you were talking about this, you talked about a shared mental model. Tell me a little bit more about how that applies to an organization and how you build it.
Yeah, so shared mental models. Our decision making is based on these mental models, approximations of how things will operate. And as we build experience, we gain knowledge, and we start to populate that model. And the research shows that the more models we have, the more accurate our decisions can be, because we've got better, more realistic patterns to match. Now, how that happens in an organization is that it's done at multiple levels. So, you could have something like a small team debrief, an After-Action Review, which is about sharing a very local story about how that last event worked, and not just about where things went wrong, which is often where the focus is on debriefs. What went wrong? Nothing. Well, what's the point of running a debrief? But actually, the After-Action Review is about understanding how things went and how we improve. Then you can start to grow those. I mean, the US Forest Service has got some great resources on this, looking at Facilitated Learning Analysis, where you start stepping up to a bigger group, a bigger team, and then you've got something as large as a learning review, where you're bringing in multiple subject matter experts.
And the purpose of those learning reviews and facilitated learning analyses is to bring multiple perspectives, conflicting perspectives. And you're never going to get a single line that says, "and this is what happened, because...", and that's uncomfortable for businesses, because they want to have one truth. Well, there is no one truth. Each level within the organization will have some interactions and relationships which shape how they view the world. So, organizations need to create an environment where the bad news can be shared, where we can have constructive dissent, where we can undertake these intelligent failures, as Amy Edmondson talks about, where we go out there and innovate and accept that, okay, failure is okay as long as it's not catastrophic, because catastrophic basically means that we didn't pick up a whole bunch of other minor failures and we were hiding those.
So, when you mention a shared mental model, you bring a lot of examples about organizational learning, which presupposes that we've had some events that we're learning from, which any organization does. But is there something that can be done at the front end, as you're starting to implement something, to define a shared mental model within the organization?
Well, I'll start off by saying, look, we are a learning organization. That means that we're going to make mistakes.
Sure.
And you know, Timothy Clark talks about the four stages of psychological safety: inclusion safety, learner safety, contributor safety, and challenger safety. And organizations want to have this challenger safety, where people speak up when things aren't going right. So, you don't have to have an accident, but you want to have people challenge what's going on. But unless you feel included, and you feel that you can actually make a mistake, then you're never going to get to the challenger space. So how do leaders create that environment? That's about talking about the issues they face. It's about opening themselves up and saying, you know what, I don't have the answers, and here are some mistakes that I've made. And they are going to model that vulnerability so that people are able to speak up, and there are a whole bunch of things that people can do. So, if you talk about a mental model as being a cultural frame for understanding how this works? Absolutely. You can have a learning culture created within an organization, and when people bring ideas to you, awesome. Explore them. It might be that they don't work, that's fine, but go back to them and say it doesn't work because of X, Y and Z, or yes, let's give it a go, and if we fail, we fail.
It's not a problem, other than there might be some resource cost, but at the same time you might find some amazing stuff in the heads of the people. And that links me to something that triggered a thought when you mentioned organizational learning. Organizations don't learn. Organizations have memories that are created by individuals within the organization. So, it's about how you get the knowledge out of those individuals and share it. And there's some great work by Dave Snowden talking about the challenges of doing that. Because if you have a common understanding, a common vocabulary set, a shared mental model of what stuff looks like, then actually you don't have to spend quite so long explaining something to somebody else. But if you go to somebody who's got no idea about what's going on, you've got to spend time building a framework on which you can start hanging ideas. Because if you give somebody a whole bunch of ideas and they're not able to abstract them or convert them into their own mindset or experiences, it'll just go whistling past and it won't make sense. So, it often does depend on the audience that you're talking to and what they already know.
And it might be that you've got to tell a whole bunch of different stories and analogies, bring those metaphors in, so people can make that bridge. So, it's not an easy thing to do. I get that it requires investment, and that's often the bit that organizations don't follow through on, because they don't see the value in the learning.
Right. So, what are some of the ways that you've helped instill organizational learning? As you said, it's really the collective memories. You talked about After-Action Reviews, you talked about learning reviews, which are very much highly interactive, team-based reflections on what I was setting out to do, what occurred, and what we can take away from it, both positive and negative. If you won a battle, you want to know what you did well, and also what didn't go as well. So, it's not just a post-mortem, as some people call them, that basically lists everything that went wrong. It's very much constructive, having been part of many of them. So, tell me about some of the other tactics an organization that wants to embrace deeper learning can take.
This episode of The Safety Guru podcast is brought to you by Propulo Consulting, the leading safety and safety culture advisory firm. Whether you are looking to assess your safety culture, develop strategies to level up your safety performance, introduce human performance capabilities, reenergize your BBS program, enhance supervisory safety capabilities, or introduce unique safety leadership training and talent solutions. Propulo has you covered. Visit us at propulo.com.
So, one of the first things that I often do is run through the concepts of non-technical skills, about how you create this shared mental model, and the fact that it's made up of situation awareness, decision making, communication, leadership, teamwork, and performance-shaping factors, and that these are interdependent skills. And I'll go through some workshops. I use computer-based simulations. I get people to fail in a way that isn't professionally jeopardizing. So, the simulations are about flying prototype spacecraft. Nobody can bring any prior knowledge. We can mess around with team dynamics. And so, people who are normally following will now lead, and the leaders are now following. And often it's a great way of showing leaders what it's like not to have a voice, because there might be some equipment failure which means they can't talk. And they're now sat there, and you can see them being really frustrated because they can see a train wreck arriving in front of them, but they can't say anything. And so, you say, what do you think it's like to be a follower, then, when you don't have a voice? So that's what it's like. So, making it as experiential as possible, making it as unthreatening in a professional context as possible, digging into the details, and using a structured debrief format, which is transportable across any sort of domain.
But it's about creating psychological safety. It's about learning from what went well and why, and what we need to improve and how. And out of those four questions, the why and the how are the most important. Observations are easy. Oh yeah, we saw that, we did this, blah, blah, blah. Okay, so why did it go well? I've got to think about that. And how are we going to make the improvement? It's not enough to say, yeah, yeah, we won't do that again. Okay, do you understand why you failed when that happened, or the improvement that's needed? And do you know how you're going to address it? Because if you don't, all you've done is create a lesson identified. You haven't done a lesson learned. And that's a bigger piece as well: lessons are not learned until you have identified the thing, put something in place, and measured its effect, because otherwise it's just a lesson identified. And so, you go into organizations and you ask, have you got a lessons learned book? Oh, yeah, we've got one of those. We'll do one at the end of the project. We'll do a sort of post-mortem.
Who looks at it before you run a project? Oh, nobody looks at it. Right. So, what you're doing is collecting a whole bunch of data that nobody's using, and you're not actually feeding it forward into the next program, project, or whatever to see whether or not the change works. It might not work. Well, that's a lesson learnt too: that intervention didn't work in that space. Okay, why? Let's look at these things. So learning is a continual process that requires you to take stuff from the past, match it with what you've got, project it into the future, and have a look: did that work? Right, we learnt something, and then move on. It's not just collecting stuff at the end of a project in a wash-up and saying, right, stick it in the register book.
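To make the distinction between a lesson identified and a lesson learned concrete, here is a minimal, hypothetical sketch in Python. The class and field names are illustrative only, not something Gareth or any particular lessons-register tool prescribes; the point is simply that a lesson only counts as learned once an intervention has been put in place and its effect has been measured.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Lesson:
    """Illustrative entry in a lessons register (hypothetical structure)."""
    description: str                       # what was observed
    intervention: Optional[str] = None     # what was put in place
    measured_effect: Optional[str] = None  # evidence the intervention worked (or didn't)

    @property
    def status(self) -> str:
        # Only "lesson learned" once something has been implemented AND its
        # effect has been measured; until then it is merely "identified".
        if self.intervention and self.measured_effect:
            return "lesson learned"
        if self.intervention:
            return "implemented, not yet measured"
        return "lesson identified"

# Most register entries stall at "lesson identified".
lesson = Lesson("Nobody reviews the lessons register before starting a project")
print(lesson.status)  # -> lesson identified

lesson.intervention = "Add a register review step to the project kick-off checklist"
lesson.measured_effect = "The next two projects reviewed the register at kick-off"
print(lesson.status)  # -> lesson learned
```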
So, an analogy I use often in the safety space: I talk about learning and then embedding of the learning. It's essentially the same thing, because at the end of the day, you haven't learned anything if you haven't actually embedded it. There are a lot of great learnings that come in from events; they get communicated and shared, and then people forget about them and the same event continues to happen. And so, the embedding part is about change management: making sure that we check, first, that it's validated, is this the right correction? But in some cases, it could be that the correction isn't being adopted or followed, and that's the embedding piece. Because if you want a thousand pilots to do the same thing tomorrow, a bulletin won't necessarily change the behavior.
Absolutely. And the other thing to bear in mind is the number of stories that happen at the sharp end, and why those stories are, or aren't, told. And there's a piece that I've just finished reading as part of my studies, looking at why those stories don't get told higher up. And it's often because the front-line operators don't understand the organizational influences on accidents. So, when they report something, an incident, they look at the very proximal, social bits at the sharp end, and they don't understand that the genesis is often further up. So they don't see the value in sharing. And if they do share, they don't necessarily draw out the analysis, and the investigation process often just focuses on fixing the worker when they're inheriting failures that are within the system. And it's about how you best prepare those workers to finish the design. Because those workers always finish the design of the paperwork. The paperwork is never complete, and it can never be complete. So, it's this bit of how do we close those gaps?
So, let me touch on another area. When you talked about CRM, you talked about decision making, you talked about communication. There's a big part of CRM which is how I make decisions. And I know you do a lot of work around organizational decision making. Can you enlighten us with some thoughts and insights on that space?
Yeah, organizational decision making is really going to be influenced by whatever the drivers and the goals and the culture within the organization are. So this bit about "safety is our number one priority"? Rubbish. It's about making profit. So, if you want to create that change in terms of safety decisions, how does it align with the bigger picture that's out there? And there are some tools out there, and I'll make a big shout-out to the guys at Red Team Thinking for the way that they manage a structured constructive dissent program. So, looking at the assumptions and formally validating those processes: you've got a strategy document that says, this is how we're going to do something, or this is what we're going to do going forward. That document will have lots and lots of assumptions in it. Some of them are explicit and some of them are implied. So, going through those and saying, right, what are those assumptions? How do we know that we can validate them? And what happens if those validations are false? And there are a bunch of tools that let you do that, but most of our decisions, even at the organizational level, will be made through emotional processes rather than logical ones.
We talk about decision-making tools like T-DODAR, which came from British Airways: Time, Diagnose, Options, Decide, Assign, Review. That's a System 2 thinking process. Very rarely do people go through that and understand the biases they're operating under, because they know what the goal is: right, we're going to do that. And they'll look for so much evidence to reinforce their thought process and their path, rather than looking for disconfirming evidence and saying, why is this a rubbish idea? What can go wrong? And one of those tools is a pre-mortem. And that's a great way of talking about failure as if it has already happened. And you dig into the emotion: people are happy to share stories of failure as long as it's in the past, but they're not quite so happy to share stories about things that might fail on them. So, a facilitator creates an environment and tells a story that says the failure has happened; you've now got two minutes to write down all of the reasons why that thing failed. And because you compress time, people just throw stuff on the paper, and then you can go around in a structured way to explore those ideas and say, have we got this on our risk register?
No. Okay. And it’s a great way of dealing with the emotions we have and exploiting them in a positive way.
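That last check, have we got this on our risk register, is essentially a gap analysis between what the pre-mortem surfaced and what the register already holds. A minimal sketch, using invented example entries (none of these items come from the episode):

```python
# Invented entries purely for illustration; a real register would be far richer.
risk_register = {
    "key supplier fails to deliver",
    "critical staff unavailable during commissioning",
}

premortem_failure_modes = {
    "key supplier fails to deliver",
    "regulatory approval assumption turns out to be wrong",
    "shift handover loses critical information",
}

# Failure modes the pre-mortem surfaced that the register never captured.
gaps = premortem_failure_modes - risk_register
for gap in sorted(gaps):
    print(f"Not on the risk register: {gap}")
```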
Makes sense. So, a lot of very rich topics: we touched on CRM, we talked about organizational learning, we talked about decision making. If somebody wants to get in touch with you, Gareth, and get more insights on all of these very rich topics, how can they go about doing it?
So, my website is thehumandiver.com. It is primarily diving focused, but as I said right at the start, this applies to anything that's out there. Or [email protected] is the best email address for me. And you can find me on LinkedIn as well, posting pretty much every day with a whole bunch of useful stuff.
And as you said, this is not just about diving. This is about leadership. This is about being safe and organizational decision making.
Absolutely.
Thank you so much for joining us.
Thank you, Eric. I really appreciate the invite.
Definitely. Thank you.
Thank you for listening to the Safety Guru on C-suite Radio. Leave a legacy. Distinguish yourself from the pack. Grow your success. Capture the hearts and minds of your teams. Elevate your safety. Like every successful athlete, top leaders continuously invest in their safety leadership with an expert coach to boost safety performance. Begin your journey at execsafetycoach.com. Come back in two weeks for the next episode with your host, Eric Michrowski. This podcast is powered by Propulo consulting.
The Safety Guru with Eric Michrowski
More Episodes: https://thesafetyculture.guru/
C-Suite Radio: https://c-suitenetwork.com/radio/shows/the-safety-guru/
Powered By Propulo Consulting: https://propulo.com/
Eric Michrowski: https://ericmichrowski.com
ABOUT THE GUEST
Gareth Lock is the founder of The Human Diver, an organisation set up to deliver education and research into the role and benefit of applying human factors, non-technical skills, psychological safety, and ‘just culture’ in sports, military, and scientific diving. He has published the book ‘Under Pressure’ and produced the documentary ‘If only…,’ both focused on improving diving safety and performance by looking at incidents through the lens of human factors. While primarily focused on diving, he also works in other high-risk, high-uncertainty domains such as healthcare, oil & gas, maritime, and software. He is currently undertaking an MSc in HF and System Safety at Lund University, where he is looking at the power (and limitations) of storytelling to improve learning.
For more information: https://www.thehumandiver.com/
Book: www.thehumandiver.com/underpressure
Documentary (including workshop guide): www.thehumandiver.com/ifonly
EXECUTIVE SAFETY COACHING
Like every successful athlete, top leaders continuously invest in their Safety Leadership with an expert coach to boost safety performance.
Safety Leadership coaching has been limited, expensive, and exclusive for too long.
Explore your journey with Executive Safety Coaching at https://www.execsafetycoach.com.