
Safety Is All About Learning with David East


LISTEN TO THE EPISODE: 

ABOUT THE EPISODE

Join us to explore the profound lessons and our understanding of safety in high-risk fields with our special guest, David East. In this episode, he brings his deep expertise in Human & Organizational Performance to discuss critical risks, learning from incidents, and the interconnected factors behind them. Drawing on examples from aviation and his experience in the Royal Australian Air Force, David shares his insights on transitioning to a learning zone, emphasizing that safety is all about learning. Tune in to gain valuable insights!

READ THE EPISODE

Real leaders leave a legacy. They capture the hearts and minds of their teams. Their origin story puts the safety and well-being of their people first. Great companies ubiquitously have safe, yet productive operations. For those companies, safety is an investment, not a cost for the C-suite. It’s a real topic of daily focus. This is the Safety Guru with your host, Eric Michrowski, a globally recognized ops and safety guru, public speaker and author. Are you ready to leave a safety legacy? Your legacy success story begins now. 

Hi, and welcome to the Safety Guru. Today, I’m very excited to have with me David East. He’s an Air Force veteran from the Royal Australian Air Force, as you’re going to hear from his accent very soon, has made a career out of it, and is also a huge thought leader in the human factors, human performance, and HOP space. So, David, welcome to the show. Very excited to have you with me.

 Good morning, Eric. Thanks for having me, mate. 

Great. Let’s start with your passion for safety. How did it start? Where did your journey begin?

Yeah, my background. I joined the Air Force in the mid ’90s as an aircraft technician. I’ve been working around planes my entire career. I worked for 10 years fixing planes. Then I changed over to crew as a flight engineer and a loadmaster on C-130s. Now, my safety journey has been an interesting one, because when I was young, in my early 20s, safety was the furthest thing from my mind. I don’t know how I didn’t hurt myself more than I did. I got through it pretty well, injury-free. But as a young fellow, I didn’t really care too much about safety and just did what I needed to do to comply. It wasn’t until I became an expert that I really started thinking, all right, now I’m in the back of this plane a lot. I have a lot to do with the outcome of this flight, of what we’re doing. I’d better start paying some attention. But the critical thing that happened to me was I became a human factors instructor, well, facilitator. I was put through the human factors course. Back then it was called CRM, Crew Resource Management. Now we call it NTS, Non-Technical Skills.

That really got me on the road to safety. It’s all about education. Now, how have I gotten to the point where I’m super passionate about safety? Just realizing some gaps within the workforce. They’re not big gaps. It wasn’t because people didn’t care. I was able to influence people in the safety space and the human factors space, and I enjoyed it. I started my side hustle, and I still do a lot of work within the Air Force around safety. I just found a passion, and it works.

We’ve had several guests come on the show with backgrounds in aviation and the Air Force. One of the things I noticed when we first spoke is you really advocate for a more proactive approach to safety, which I think is so critical. Tell me a little bit more about how we learn from events, even before an incident happens.

Learning from events, that’s really important, isn’t it? The fourth principle, learning and improving is vital. Within defense, everywhere that you go pretty well has a good safety suite on their software system. Defense is really good. We have a very good reporting culture; we report everything from the very serious incidents right down to the minor ones. That safety suite is fantastic. It’s got a lot of data in there. But how do we learn from that? From time to time, I have an issue with how we learn from it. Because often what happens is the safety representative within your workplace will just send an email: here are the critical safety events that have occurred this month, this quarter, this year. We are supposed to then just read that email and glean the lessons learned from it. I don’t find that a very good way to learn at all. I think to learn in the safety space, it really is about face-to-face. You’ve got to get face-to-face with people, whether it’s a facilitated session or just having conversations in the crew room with other workers, other people, and just start talking about incidents and see where the conversation goes.

That’s where the real learning happens. That, I often find, also doesn’t happen as well as it used to, especially around the juniors. In defense, we’re all called aviators, and with the junior aviators, that crew room discussion on safety topics just doesn’t happen. We’re generally just learning by osmosis through one or two courses where we learn one or two case studies. And that’s about it. I like to keep the risk conversation alive. Matt Confer from Corner Industries in the US coined the term ‘sticky’, which stands for stuff that can kill you. And that’s fantastic. If you go to junior aviators and say, all right, let’s talk about the stuff that can kill you while you’re doing this job, while you’re out there on a Hercules doing an engine change or whatever job it is that you’re doing, what are the risks? What can kill you? A lot of young people don’t know. They’re like, oh, I don’t know. You’ve really got to drive that conversation, haven’t you? But once you get them used to those conversations, let’s have a sticky conversation, they understand the risks and they get to own the risks.

 That helps them learn a whole lot more than an email of the aviation safety accidents that have occurred in the last period. 

I think you’ve got a good point that learning has to be very immersive. It can’t just be an email. One of the things I’ve seen, at least when it is communication, this was in aviation, is that it’s very targeted to who receives it. If you’re flying a certain type of aircraft and the issue has to do with that type of aircraft, those are the people who receive it. Whereas in business, often it’s these mass broadcasts, so you get cluttered with a lot of things that don’t relate to your job, as an example. But the other element is learning from things like the near misses you talked about, some of the little things. In business, I was talking to one audience not too long ago, and we were translating it: in aviation, a crash is the equivalent of a serious event, just like you would consider a SIF event on the ground. At ground level, you’re thinking about serious injuries or fatalities. That’s the big thing that could go horribly wrong. But too often people are learning from the cuts, the scrapes, the bruised fingernails. They’re looking at trends that have no correlation with serious events, and they’re not opening the door to all the other things that may be going on.

 So, they’re not fixing the real issues. 

It all comes down to, I think, if you want to learn from those minor events, you need to understand the controls that are in place and find out if those controls are effective, don’t you? Also, to your point, in aviation you might get something specific to a specific aircraft type, whereas in business, you just get a broad shotgun approach. That’s really interesting, because the question gets asked: how does this relate to me, and how can I relate it to me? That’s a very difficult thing to do. I think we have a very strong reliance on rules and procedures, admin controls. We think that they are the critical controls that are going to save us from a serious incident. I think, also, a lot of organizations, like the Air Force, have a very strong just culture.

 Yeah, that’s a key component, right? 

It’s predominantly no blame, and we try to learn from it, of course, and that’s important. But we still don’t… I was talking about those sticky conversations. We still don’t understand the critical risks and the critical risk controls. We don’t understand why they’re there. Engineering controls especially: sometimes they’re obvious, but sometimes they’re not. If you’re a brand-new aviator on the flight line and you see a brand-new piece of equipment, you don’t understand why it is built or designed in a particular way. Often that’s because the lessons learned have been built into that equipment, hopefully, so the critical controls on it work and things are simplified. From the executive level, top-down management is important, of course it’s important. Top-down, they want us to make sure that we’re following rules, which, of course, is important. Those controls are important. The guys on the floor are going to follow a procedure, and therefore follow the rules, of course. But it’s not until they understand the critical controls and the critical risks that, when something goes wrong, they’re the ones with the ideas that can fix it, and they can understand it better.

Because if something goes wrong, it’s going to be the worker who’s out there doing the job who gets hurt, isn’t it? We don’t want them to carry that extra risk. We want them to understand it so they can work with it, work around it, and hopefully fix that risk, those critical controls. So that next time someone has an accident, hopefully the outcome isn’t a horrible SIF event or an aircraft crash; hopefully it’s just something minor, where we get the plane on the ground or someone just has a minor injury, and we talk about all these minor events and don’t have a lot of big things to talk about. But that’s a utopian world, isn’t it?

Yeah, but it’s something we can strive for. I do believe you can eliminate serious events, and aviation shows it. If we look at aviation, I was pulling up some stats in the US from several years back. In the 1960s and 1970s, in commercial aviation, there was roughly one person dying every second or third day. Compare that to the last 15 years, where only three people in commercial aviation lost their lives. That’s a substantial difference in outcome. If you focus on the right things, you can drive the right outcomes, essentially.

I completely agree. You know, the aircraft that I’ve predominantly worked on, the Hercules in the Australian Air Force, we have never had a fatal accident with them. We’ve been flying them since the late ’50s, early ’60s, I think.

It’s an old plane. You’ve updated the aircraft, though.

 Of course. I would hope so. Never had a fatal accident with them, which is a fantastic record. 

There is a lot to it… Obviously, our publication suite is a living document. It’s developed over the years. It brings in live rules. The training is fantastic. We trust it. Another big thing is we trust our aviators to fly them. You might have a junior, a guy or girl in their early 20s, flying that plane. We put a lot of trust in them. That’s the same in all aviation, isn’t it? You train them, and you put a lot of trust in them to do the right thing. Aviation is just one of those industries where it’s done really, really well.

 Absolutely. You touched briefly on the topic of just culture. Tell me more, because that’s a very key component to get to the near miss reporting so that you can build the learning culture. It’s something that I think a lot of businesses still struggle with because it’s this element of, I need to learn, I understand that, but is there an element of accountability and how do I balance this through just culture? 

It’s just a word, isn’t it? Just culture. But it means a lot. It means that if something does go wrong, there is effectively no blame around it, but there also needs to be accountability, and that’s super, super important.

 That’s the key thing. 

It is. Blame fixes nothing. Error is normal and blame fixes nothing are the first two principles. Everyone understands that, and everyone can look at them and go, yeah, that makes sense. But it’s not until you really get into them that you try to understand what happens. So, I’m very big on the brain-body contract: what happens to a person in their mind and physiologically when something goes wrong and someone points the finger at them. Because that hurts. It really does. It cuts deep. And that’s a core memory that you will keep forever, because it brings out an emotion, doesn’t it? A lot of people think that because it’s a no-blame culture, they’re not going to get in trouble, and that’s as far as it goes. But what you get then is a lower level of accountability. You get people who are just cruising along. They’re in a happy zone, and they don’t really mind what happens. Sure, you’ve got the no blame, but there needs to be some motivation: are you motivated to do your job? And guess what?

If something does happen, there’s going to be some accountability. Definitely, if you’ve done something wrong, if there’s a violation, 100% there’s going to be some accountability. But we need to be accountable for what happens when something does go wrong. And if there’s too much accountability and blame, you have people living in fear: I’m going to work, and I don’t want to get in trouble for doing something. But if there’s a healthier level of psychological safety and emotional intelligence around it, and a healthier level of motivation and accountability, then you get yourself up to the top level, where you’re really in that learning space. I’m happy to take the hit. I’m happy to put my hand up and say, hey, look, I actually made a mistake there. That was my fault. And this is why I made that mistake: I didn’t follow the procedure, the control was wrong, whatever led to that mistake happening or whatever the outcome of that mistake was. If it was a bad one, let’s learn from it. But get yourself out of the cruisy zone, get yourself out of the fear zone, get yourself above all those, and get yourself into that learning zone where, yeah, I’ll take full responsibility. Or not necessarily full responsibility.

I’m just a worker. Error is normal. But what was the outcome of that error, and what can we learn from it? So that when that error happens again, and it will, probably by me, how do we make sure that no one actually gets hurt by it? Once we’re in that learning zone, then you’ve got a true just culture. It’s important that it’s done correctly.

I think, the way somebody explained it in a prior episode, just culture, sure, is about removing blame, but there’s still accountability. Because with the two pilots, there’s very strong accountability on each other that’s built. You touched on CRM. CRM is really about building that accountability, the communication between each other. There’s peer-to-peer accountability that sets in. But the point of the choice of words is that there’s still a consequence. The consequence doesn’t necessarily mean it’s negative. It may just be: we discovered there’s a gap in our training, or maybe you need more training in this particular scenario, so we’re going to put you in a simulator with similar scenarios. You get to walk away, and there’s no blame conversation, essentially. We just want to learn from anything that can happen, because when you’re in the air, it’s highly unforgiving if something happens. We learn from it, but there could still be a consequence, just not a negative punishment in the way we normally talk about it. It’s not that we’re avoiding termination or things like that; it’s much more about the learning that happens there.

 This episode of the Safety Guru podcast is brought to you by Propulo Consulting, the leading safety and safety culture advisory firm. Whether you are looking to assess your safety culture, develop strategies to level up your safety performance, introduce human performance capabilities, re-energize your BBS program, enhance supervisory safety capabilities, or introduce unique safety leadership training and talent solutions, Propulo has you covered. Visit us at propulo.com. 

I’ll tell you one thing I got to do, which was fantastic. I’ve been aircrew for many years, but for the last three or four years, I’ve been on the ground doing ground-based roles. One of those roles was working with the safety management system within the support systems of an airfield. We’re talking about the people who load aircraft, aviation firefighters, aviation refuelers, and the people who run the fuel farm and all the fuel systems throughout the base. We got to introduce some human factors and some safety training courses to that workforce, which was fantastic. It was the best job I’ve had. It was cool. But there was one crew that worked at the fuel farm, which is where we store all the jet fuel. The fuel farm at this base is fairly old. It was built in the 1970s when we bought F-111s, because we needed a significant amount of fuel to run those things. They’re awesome jets. It started to show its age. Still works, of course, but it was starting to show its age. What I mean by started to show its age is, a weld would start to leak.

 It might just be a pinprick leak. They’d have to fix that, for example. The guys that worked there, they lived in fear. Their accountability was really high, and the blame was really high as well. 

Whenever something went wrong- Which is not good.

These guys, who had only worked there for a year or two, were getting caned for it. I rolled in with a couple of my colleagues, and we made a learning team, effectively, which was: this is not an investigation. There is no blame. There are no punitive measures to be taken here. We just want to understand what is happening in your workplace. We put it on a whiteboard, and I said, this learning team is simple: what are your issues? We’re not going to talk about solutions. I only want to hear the problems. And of course, I let them have a pretty good whinge. But the outcome was we got a really good list on the whiteboard of some of the real issues that this place was facing. Some of them were admin-based, some of them were engineering, etc. And from that, I was able to go to the commanding officer with this list and say, hey, sir, look, this is what is happening at your fuel farm. He goes, okay, I didn’t fully understand it. I knew we had a problem, but I never fully understood it. From there… Six or eight weeks later, we had the DG Log, the Director General of Logistics, come out.

The guy in charge of logistics for the Air Force. The equivalent in America would be a two-star general for logistics. He came out, and they actually let one of the younger fellows there explain some of the issues to him, because he was the one who understood them. He was really nervous, but he did a great job and explained what was going wrong. The general went, cool, now I understand it. He helped with funding to fix it, and they repaired it. A lot of it, actually, they haven’t just repaired; they’ve put in a plan to rebuild the whole thing anyway, because they need more capacity. That’s a classic example of going into a workforce and saying, we’re not here to punish you. We’re here to learn and understand. The outcome of that was absolutely fantastic. I saw one of those guys a couple of months ago at Amberley when I was up there, and he said, look, what we started back then was fantastic. I’ve been working around this industry for 10 years, and that day when you came in and started talking to us in a learning fashion was the turning point.

That’s what I put out to everyone. It’s just having a conversation and trying to understand the controls, what’s failing, and how you can fix them.

 So simple yet so powerful. 

 It’s just human interaction face-to-face. It is powerful, isn’t it? 

 How do you shift leaders’ mindsets around this? You talked about this environment that was previously very high accountability, very high blame. How do you shift mindsets around this? 

That is a good question, Eric. I don’t know. To be honest, I will be completely honest, I haven’t yet had 100% success in shifting a leader’s mindset to an enduring level. I will be honest about that. Maybe it’s my personality. But I’ve been through the HOP principles with a bunch of leaders, and they resonate with them. But it is so easy, it’s human nature, to go back. We are hardwired to blame people. We are. You just watch any movie; throughout all of our history, we are very hardwired to blame people and to hurt people, unfortunately. I don’t know why that is.

But it’s also easier, right? Because the last point of failure is a person, normally. Even if you go to the precondition right before that failure, you’ll get to something like maybe they were fatigued, they were stressed, which again, you could easily pin on them: well, you were the one who was supposed to sleep. Whereas really understanding the chain of causality, how the system factors came in, all the various things that caused this person to be fatigued and make this error, or the gaps in the training, that’s a lot harder.

It is. There will always be a human at the end of it, won’t there? As you said, we come up with these systems in our minds. We invent them, we design them, we manage them, we operate them. We are heavily involved in everything. That’s human nature. We are hardwired to blame, hardwired to want to find someone to pin it on so it doesn’t come back to us, or the C-suite, or the executive suite. I guess that’s why I like the HOP principles, because so many times I’ve been able to go to my management and say, hang on, your response to failure really matters. We had a failure this morning, and the way you responded… you probably need some coaching on how to handle that response better, because you didn’t handle it really well. The people who were involved in that incident or accident really had nothing to do with why it happened; they were just the last ones to touch it. And you actually made them feel pretty bad about that. But having said that, if you’ve got someone…

My current boss is actually quite good, a very intelligent person. He’s super intelligent. A lot of the time, you’ll find that people who are super smart haven’t got very good social skills. Well, this guy has found a way to have pretty good social skills, so he leads his team pretty well. He sets us up with a level of trust, and that’s what it comes down to. He trusts the worker. I work in an office environment, but we plan missions for all of the AMG aircraft, the air mobility aircraft, all the transport aircraft. We plan all the missions. There are so many things that can go wrong with a mission, even in the planning phase. He trusts. There might be a brand-new operations officer who’s straight out of training. They do a couple of weeks of training, and they’ve got someone beside them, a little wingman to help them out. But he gives them a level of trust that a lot of young people would never have in executing that mission to make sure it goes off. They can make decisions at 10:00 PM at night, when a C-17 breaks down in America, that other 21-year-olds might completely freak out about.

 But if something were to go wrong with that, the next morning when he finds out about it, he’s like, Yeah, okay, that happens. That’s completely okay. What can we learn from it? 

Which is a great response.

It’s perfect. But that has been… I introduced the HOP principles to him a number of years ago, and I drive them with him. I hold him accountable to those principles: hey, you said you like these things, and your response to failure matters. And now his response to failure is quite good. It’s excellent, because no one feels punitive measures from it, which is fantastic. So, the question was, how do you get leaders to really look after their teams, to not be the blame and punishment people? It’s about having a champion who gets in their face all the time and holds them accountable, because that is super important. So once again, it comes back to face-to-face communication and relationships, doesn’t it? Often hard to do, because often you don’t have access to all of the executive team, do you? So, you’ve got to be the right person, in the right place, using the right equipment. It really, really helps. And someone who understands the big picture. And who can be held accountable when something goes wrong. That’s important as well.

But I think it starts with a level of ownership that’s very strong. To be able to do this, you have to take very strong ownership of the role and responsibilities, and also the flaws. I think, as well, the element that I’ve seen, unfortunately, in some organizations is they will take something like HOP and its blame fixes nothing, and use it as a transference of blame. Instead of blaming the employee, now I’m blaming the senior leaders, because the senior leaders created the system that allowed it. But it’s not about shifting blame, it’s about learning. Blame fixes nothing is about understanding the full system and all the interdependencies that are connected.

I know. It is so funny how you said they don’t blame that person, but they blame the next level up. We love to find someone to blame.

We love to, yes. Unfortunately, it’s convenient. It seems to solve a lot of issues when you simply blame. But I think the other element is it’s not about absence of accountability, because that’s one area where I’ve seen a lot of leaders struggle: do I fire people or not? I remember one person I talked to, and he says, yeah, we have a great safety program. We call it an at-will safety program. You make a mistake, I fire you. That’s the extreme level of blame. You’re not going to hear about a single thing that went wrong. You’re just going to have calamity after calamity if you take that approach. But you still have accountability. I would argue that the level of safety ownership I’ve seen in aviation is much higher than anywhere else. That comes from a high level of personal accountability, and team accountability between the pilots as well.

I’ve got a friend who works for a construction company. I’ve done a little bit of work with a few companies outside of my own, with my own business as well, and you see this all the time: blame and punishment is just everywhere. It’s a thing. It is very challenging to change that mindset from the top down. From the bottom up, they’re like, yeah, we love this mindset. This is great. We don’t want to get punished. But it needs to come from the top down, I guess, doesn’t it? It probably needs to be both. Anyway, my friend has been able to convince the executives of her construction company of the HOP principles, learning teams, the 4Ds, and having sticky conversations. They said, yeah, cool. We really love what you’re talking about. It resonates with us. Here’s the red carpet. They basically rolled out this massive, long red carpet for her program and said, make that happen. We will support you 100%, which is fantastic. Imagine that. If every HOP professional had that treatment, they’d be very, very happy. I talk to her quite often and say, hey, how is it going?

Is it actually working? Because, yeah, the HOP principles, are they just a philosophy? Or are they something- They have to be more than that to work. To be meaningful.

They absolutely do, Eric. Are they just a philosophy, just a set of values, or is there some real meat in them that is actually beneficial? She says, no, it actually does work. The workers love the approach, where they know that when the safety person comes in, they’re not going to get in trouble. They’re not going to be talked at about all the things that went wrong. She talks about all the things that go right, what is working well, and about the critical controls, which she has helped the workers understand. And they love that approach, because there’s no, hey, look, this happened, so don’t do that; make sure you wear your proper PPE. They don’t have those conversations, because they don’t have to, because they really trust the workers. And that trust works both ways. And the company seems to be doing really well as a result, which is fantastic.

 That’s cool. David, thank you very much for your thoughts and insights on HOP and some of your experience from aviation in the military. Any closing thoughts for us? 

I’ll just reiterate the point. If you want to be a good safety leader, just get out face-to-face, talk to people, understand them, understand the critical controls, understand the critical risks, and help your workers do the same. You’ve got a whole system of safety behind you. You can use that whenever you need to. But if you have that approach, I think you can do better in the safety world. So good luck in doing it. Eric, thanks for the opportunity to do this today. It’s been good fun.

 Good. And if somebody wants to get in touch with you, how can they do that? 

LinkedIn is best, David East. Because there are a thousand David Easts in the world, mine’s got a little hand emoji beside it, so I stick out. So, feel free to get in touch. I love having these conversations with like-minded folk, and also non-like-minded folk who are happy to challenge me and challenge some of these principles, because those conversations are always interesting as well. I have drunk some of the Kool-Aid, but it’s not the only way. There are many ways to do safety. This is just the way that works for me. Happy to have those conversations. It’s good fun.

 Thank you. Safety is about learning. I think that’s the big takeaway. 

 That’s what makes it sexy, Eric. Actually, safety can’t be sexy. Anyway, thanks, buddy. 

Awesome. Thank you so much, David. Really appreciate it. 

 Thank you for listening to The Safety Guru on C-suite Radio. Leave a legacy. Distinguish yourself from the past. Grow your success. Capture the hearts and minds of your teams. Elevate your safety. Like every successful athlete, top leaders continuously invest in their safety leadership with an expert coach to boost safety performance. Begin your journey at execsafetycoach.com. Come back in two weeks for the next episode with your host, Eric Michrowski. This podcast is powered by Propulo Consulting.   

The Safety Guru with Eric Michrowski

More Episodes: https://thesafetyculture.guru/

C-Suite Radio: https://c-suitenetwork.com/radio/shows/the-safety-guru/

Powered By Propulo Consulting: https://propulo.com/

Eric Michrowski: https://ericmichrowski.com

ABOUT THE GUEST

David East is an accomplished Human Performance, Safety Management System & Leadership Consultant with a passion for enhancing workplace cultures, safety systems and optimising human potential. With extensive experience in multiple industries, he operates CrewFusion, a consultancy offering comprehensive training, facilitation, and advice to organisations seeking to improve cultural, safety & leadership practices across any workplace setting.

David’s expertise lies in developing and maintaining robust Safety Management Systems, and he is particularly specialised in the aviation sector. However, his diverse background also includes successful engagements in construction, emergency services, logistics, and healthcare safety systems. His broad industry knowledge allows him to adapt his skills and insights to meet the unique challenges and requirements of any sector.

While running a successful consultancy business, David continues his career in the Royal Australian Air Force (RAAF). Beginning as an Aircraft Technician, he worked hard to rise through the ranks to become Airman Aircrew, serving as a Caribou Flight Engineer and later as a C-130J-30 Hercules Loadmaster. This extensive operational experience provided him with a firsthand understanding of the critical importance of safety and human factors in high-pressure environments.

For more information: https://crewfusionteambuilding.com.au/



EXECUTIVE SAFETY COACHING

Like every successful athlete, top leaders continuously invest in their Safety Leadership with an expert coach to boost safety performance.

Safety Leadership coaching has been limited, expensive, and exclusive for too long.

As part of Propulo Consulting’s subscription-based executive membership, our coaching partnership is tailored for top business executives who are motivated to improve safety leadership and commitment.
Unlock your full potential with the only Executive Safety Coaching for Ops & HSE leaders available on the market.

Explore your journey with Executive Safety Coaching at https://www.execsafetycoach.com.

The Power of Organizational Learning with Dr. Nippin Anand


LISTEN TO THE EPISODE: 

ABOUT THE EPISODE

Are we truly learning from accidents? In this compelling episode, Dr. Nippin Anand shares a different perspective on the Costa Concordia disaster, enriched with his deep insights and research, alongside an exclusive interview with Captain Schettino. He delves into a profound understanding of risk and safety, emphasizing the impacts of culture and shared responsibility. Tune in to uncover valuable lessons about the power of organizational learning and how it can help us make meaningful changes.

READ THE EPISODE

Real leaders leave a legacy. They capture the hearts and minds of their teams. Their origin story puts the safety and well-being of their people first. Great companies ubiquitously have safe, yet productive operations. For those companies, safety is an investment, not a cost for the C-suite. It’s a real topic of daily focus. This is the Safety Guru with your host, Eric Michrowski, a globally recognized ops and safety guru, public speaker and author. Are you ready to leave a safety legacy? Your legacy success story begins now.

Hi, and welcome to the Safety Guru. Today, I’m very excited to have with me Dr. Nippin Anand. He caught my attention because he’s got a deep background in organizational learning. He’s the host of a podcast, Embracing Differences, and he has a new book that just came out, Are We Learning from Accidents? Phenomenal background, phenomenal stories. Looking forward to our conversation. Nippin, why don’t we start with how you got into this space? Because you have a phenomenal story there.

You mean the title of the book? Yes, it’s very tied up with my life stories. Not just one story; it’s tied up with many stories. I write about that in my book as the opening, Eric, starting off with the idea that failure was never an option for me. I come from a middle-class family in India, a very religious family. I was going to board my first ship as a cadet. This is going back to 1995. The night before I was leaving home, my mom came to me and said, I’m aware that life at sea is very hard and you’re going away for 18 months. Do us one favor, at least: don’t come back home before you finish your contract, the 18 months. That put an immense burden on my shoulders. It was a feeling that failure was not an option. I worked for almost 11 years in a job which I never found appealing from the very first day. And this is very common in many cultures, that you end up doing something because of societal expectations, because of family expectations. Until I had a near accident, and this was a near collision.

It wasn’t a collision; it was a near collision at sea. We were approaching a port first thing in the morning in Japan, and my ship just about touched another ship that was also approaching the port. It was a very tense situation. And I thought on that night, after I finished my watch, that because the accident didn’t really happen, there would be no real fuss about it. But the next morning when I woke up, Eric, the whole ship was seeing me very differently. From an expert doing a job for almost 10, 11 years, dedicatedly fulfilling all my responsibilities as a professional, I came to be seen as the idiot of the town. From the next morning, everyone, including the seaman who was on lookout with me that night, started to doubt my competence. And at that point, I decided that it was time to… Well, I didn’t at that time, but it took me almost a year, after losing all my confidence, to come to the conclusion that this was probably not the job for me. And interestingly, I stayed in that negative spiral for maybe a good 8 to 10 years, until I came to the UK.

I did a PhD. And after doing the PhD in anthropology, I discovered the power of narrative and how the same accident could be narrated in so many different ways. And if we truly want to learn, then I think we should be open to all the narratives. Now, Eric, there are very different narratives that we tell in different cultures. I study mythology, I study religion, I study anthropology. One of the things that you cannot escape is the power of myth, which is belief. Not myth in the way that the Western world understands it; myth as in, what is it that we believe in? What stories do we believe in? Look at the Christian world, which is where I live right now; I live in Aberdeen. In the Christian world, God created the world in six or seven days. On the seventh day, he put the human beings there. It was a very stable world. The moment you put human beings there, you put your children there, and you told them not to do something and they still did it, that’s when human beings became corrupted and could not be trusted.

So, the whole purpose of life is to follow the word of God, which is to follow the process; in safety we call it the safety management system. That’s one way of looking at the world. So, the narrative of an accident investigation is: something went wrong because you did not follow the process. That’s the Christian myth. Then there is the Greek myth. The Greek myth is not about compliance; the Greek myth is the opposite. The Greek myth is about defiance. There is the oppressor who is out there to oppress the world, and the oppressed people. To become a hero, maybe, to attain heights, to attain success in life, your job in the Greek myth is to go against the odds. There is chaos out there, and you need to create order out of it. That’s how you become recognized as a hero. So that’s the myth of the heroes and the anti-heroes: people who go outside the fringes or the boundaries of the society and do something exceptional, whether it’s good or bad. That’s where you have people like Captain Sully and Captain Francesco Schettino. One becomes the hero because he saves the world.

The other becomes the anti-hero, or the villain, who causes havoc to the world. That’s the Greek myth at play. That’s how you tell the narrative of an accident or a near miss or whatever you want to call it. Then you have the Indian myth at play, which is not about compliance or defiance. It’s about self-realization: the world is not stable, and the world is not chaotic. The world is what it is. The important thing is, what do you realize or what do you learn about the world as you go through this journey we call life? Sure. In this journey, it’s not important to understand whether the people who were present in an accident complied with the rules or went against the rules to save the world. The important thing to understand is, when you investigate an accident as an investigator, what do you learn from it? That’s the world of self-realization. One of the things I realized is that there is no objective reality out there. It’s how the investigators present the story. It’s how they collect the data, what questions they ask, how they go about processing and analyzing the data, and how they go about presenting the information.

Until the investigator becomes cognizant, or aware, of their own biases, nothing changes. This applies not just to investigators; it applies to everyone: a leader who goes to the site to engage with people, an inspector who goes to audit the site or the workplace. The important thing is that we don’t realize we have our own biases. Biases are not a bad thing. Biases are what make us human. The important thing is, do we realize that we have certain assumptions, certain narratives, certain experiences, certain qualifications, a certain family background, certain aspirations, certain motivations that push us to look at things in a certain way? A lot of learning opportunities are lost because we are so busy creating objective narratives and making that separation between the narrator and the narrative, which does not exist. So, until we become aware that we are biased, and that we need to appreciate other narratives, other stories from other people, other points of view, it’s very, very hard to learn anything from accidents and non-events, both. That was my biggest realization.

Phenomenal. A really interesting story. I’d love to pivot to incidents and investigations. You had a very unique chance to speak to the captain of the Costa Concordia. I’d love to hear some of the themes, the story, the conversations you had with him along the path we just explored.

Yes. The interesting thing is that I had a background in seafaring, and I had, at that time, already started blogging, writing blogs. Well, I met him five years after the accident. My blogs, the way I used to write, were very helpful in establishing that initial connection with him. I just had to approach him through another person and say, I would like to interview you, and this is my background. Very quickly, he responded with a yes: yes, it would be nice to meet up. I document in my book how we went about it. It was quite easy. I think he was also very appreciative that in those five years, between the time the accident happened, in 2012, and the time when I met him, in 2017, nobody from the industry had come to speak with him. Nobody. Really? I was the first person from the industry, apart from lawyers and solicitors, who approached him genuinely wanting to understand his point of view. So, he was very appreciative of that. From my point of view, the story resonated because what I saw in the press was an extreme form of how I was treated, Eric, after my own accident at sea.

And there you have an initial resonance already between him and me. Building that rapport, approaching him, and making contact with him was easy. The next thing was, I flew to Sorrento, his hometown in Italy, where he was under house arrest, and I spent four days with him. We had a great conversation. I was trying to understand his perspective, and not just his perspective, but who he is: his relationship with his community, with his people, with his family; why he chose a career such as seafaring; and then how he moved through his career. And then the accident itself. So, a genuine interest in understanding the person even before you dip into the accident case. There’s a reason why I’m saying all this, Eric. Today, a lot of times, we go into accident investigations and audits and inspections, and there is little interest in understanding people. Because there is little interest in understanding people, there is little reciprocation from the other side. If you consider that the origin of all decision-making is the unconscious mind, which is the unaware mind, the non-rational mind, people will tell you nothing from that non-rational, unconscious mind until they see you making a genuine connection with them.

So, without relationship, there is no learning at all. Absolutely none. Getting to know somebody not just as a victim of an accident, but also as a father, as a brother, as a community member, as a husband, as a sibling, is so important before we get to understand why they did what they did on the day of the accident. Because then they can tell you things that you were probably not even expecting. And that’s the beauty of learning: learning is a discovery. Discoveries can only happen when we find something that we were totally unaware of. And that can only come when we make a genuine connection with people and listen to the unconscious mind of the person. It’s very important. It’s something we consistently miss in our accident models, whether old ones or contemporary ones; it doesn’t matter.

Because you jump straight to the decisions, the occurrences, and you’re trying to track back to a very linear cause and effect. But what did you gain from that? Because four days is a lot of time. You’re really trying to understand the person, what was going through his mind, indirectly his mental model, which touches the biases. How does that lead to something different in terms of understanding what transpired?

The biggest learning from this accident was that people involved in accidents are traumatized, both by the experience of the accident and by being investigated. It is a very traumatic experience: not just experiencing an accident, but also going through an investigation. The first thing I would say is that if you have not been trained in trauma, if you have not been trained in how to handle a distressed person, never go into an investigation. Because that’s precisely what happened to me, and it took me nine years to come out of that cycle. The first question that you ask the person usually sets the tone for the rest of the investigation. If you don’t take the time to connect with the person, if you don’t take the time to understand the person, if you don’t see that learning and healing should come together, then all you do is come back with data, with some extracted information, which is nothing but a story, a very carefully crafted story of a rational mind, a very logical story that people want to tell you because they know that’s what you want to hear.

And so, there is no learning. A lot of times, we miss that very crucial point: go with an open mind, connect with the person, listen to them, try to understand their stories. Try to understand what it is that shocks you the most when they’re telling the story. And there you have a very rich story from the person. And do not interrupt at all as the person is speaking. When they are giving you something from the unconscious mind, just sit there and absorb as much as you can. There is no need to interrupt. There is no need to feel apprehensive about the silences. There’s no need to feel uncomfortable about things that go against your values, against your culture. Just sit there, be in the moment, and listen as much as you can, with open questions. Things like: what would you like to share? Where would you like to begin? Walk me through the steps. What have you learned? These are very, very open questions, avoiding any probing, any prompting, even when the person goes silent. That’s a very important thing. But Eric, underneath all this is a methodology, and it is very important to understand that human beings are fallible.

People will make mistakes.

Of course.

You must embrace the fallible. When you’ve seen another person, an imperfect person just like you, you make the connection there and then. But if you’re so busy trying to fix this person, trying to find a solution, those reactions stop you from listening at all. They say something that you don’t like, and you make a face, you make a gesture, and it just disconnects you from that person completely. It’s very important to go with a philosophy, with a methodology, that accepts another person’s imperfection and fallibility. From there onwards, the flow begins, and you start to listen to the full story. So, your question was, what did I learn from this story? Many things, but I deliberately chose to focus on four important things. The first one was the question: why did he choose to navigate so close to the land?

Sure.

That was the first thing. As I spoke to him, Eric, what intrigued me was that here we are, sitting so far away, so distanced from the reality, and this person does not see that as a risk at all. For him, this is a normal practice. This is what a cruise line captain does each day, every day. He balances the competing goals between customer satisfaction, which is to go close to the land, and safety of navigation, which is to keep a distance. It is important that we pay attention when somebody says that it’s quite a normal practice to do such a thing. Now, words like normal practices have become very fashionable these days. Everyone wants to study normal practices, normal work, whatsoever. What we often forget is that your normal is not my normal. You are sitting in the ivory tower. For you, what is normal is completely different from what I see as normal. It’s very important that when we are speaking to people, when we are engaging with them, we try to understand what their normal is. So on occasions, people might say things like, oh, it’s okay.

It’s how we do it here. Oh, it’s quite normal. It’s usual. What they are telling you is: that’s their culture. In that moment, we get very excited: how can this person go so close to the land when he should be 20 miles away, or whatever is documented in the safety management system? What the industry struggles with is the idea of subjectivity and risk tolerance, which is very individual, very subjective to each person, each culture, for that matter. If I show people in the West a video of how cars are driven in India, they get shocked. But equally, people in India get shocked when they see how cars are driven in the Western world. The point being that every culture has its own normal. Absolutely. If you don’t understand the power of a worldview, a culture, then you are so far away from what is normal and what is normal work. It’s fundamental to understand. Today, when we talk about the idea of work, we don’t understand that aspect of what is normal. Normality often comes from the idea that this is my belief, this is my myth, this is my paradigm.

When I say it’s okay, it means that it’s consistent with my worldview. It’s consistent with my culture. It does not surprise me, because this is something that has become embodied; it’s become part of my body. So, this is important. And that’s something that I highlight in the book. It’s a major part of the book: to understand culture through the lens of what we consider as normal. And every culture has its own normal. We then go into the idea of why people don’t speak up. That’s another topic, and I devoted a lot of time to it. Eric, we talked about the idea of myth: the Greek myth, the Christian myth, and the Indian myth. What happens is that people don’t realize that this whole idea of why people don’t speak up is very much the Greek myth at play. What do I mean by that? You have this #MeToo campaign in your country every now and then. There are people in positions of power, and they do injustice to people, and somebody must rise to the occasion, stop the oppressor from doing wrong things, and speak up.

What most organizations today try to do is create what we call psychological safety, so that we can empower the oppressed to speak against power, which becomes abusive over a period of time. This is the Greek myth playing out: the oppressor and the oppressed. You have to defy the oppressor. This goes back to the narrative of defiance. This is not compliance; this is defiance. And what’s interesting in this narrative is that we pay too much attention to the individual. So, if we can empower people, if we can give them psychological safety, then they should somehow speak up. What became very apparent, very clear, in the Costa Concordia study, and I’ve studied many, many accidents after that in aviation, in health care, in other areas… Take the example of aviation. If you take the Ethiopian Airlines crash, which happened a few years ago, and then followed by the… Sorry, I can’t remember the second one.

Lion Air was first, yes.

Lion Air was first, yeah.

In both instances… In the Ethiopian Airlines case, for example, you have a pilot with more than 10,000 hours of experience sitting next to a co-pilot who had 200 hours of flying experience. Now, you cannot challenge this, because from a certification point of view, both are certified. They are both- Correct. The co-pilot is certified for what he does, and the pilot is also certified, of course. The trouble is that we are not talking about hierarchy here, which has always existed in aviation and the maritime world and many other industries. What we are talking about is hierarchy blown completely out of proportion. You have an expert who is far too powerful in this game against a novice who has just entered the profession. This is a systemic problem in the industry, and it cannot be solved through the lens of psychological safety. You cannot send somebody on a three-day or five-day course in psychological safety or crew resource management, whichever you like, and empower them to speak up, because they just… And we go back to the idea of normal, because these two people belong to two different subcultures, or cultures, and they see things very differently.

For a captain who’s navigating the ship so close to the land each day, every day, it’s his everyday work, right? That’s what he does every day. He doesn’t see it as a risk. The novice, because he sees the captain doing it every day, does not dare to challenge it, because he knows that if things go wrong, he cannot handle it. What will he say, and what will he do? So, what we are dealing with here is not a problem of speaking up or speaking out or listening in. What we are talking about is two different subcultures that see the world differently. And until we create that awareness, the problem is not people not speaking up. The problem is that people don’t know when to speak, what to speak, and how to speak, and there’s nothing we can do about that. So that was another theme that came up in the book: why don’t people speak up? And I explain that through the book. Sure. We then look at the idea of emergency plans and processes, and why they don’t work in practice when an accident happens. In this instance, one of the big things that came out was that sense-making in an accident, which is trying to move forward with limited information, sometimes conflicting information, time pressure, language difficulties, the reality of life, is so distanced from documented plans and processes.

And to be able to live through that trauma of an accident… The interesting bit is not just the comparison between the documented plans and procedures and how an emergency is handled in practice. The painful bit is that information and that behavior being taken to a court of law, compared against what is documented in the process, and then a case for culpability and crime being established based on those processes. It is absolutely fascinating. To understand that, one has to come to terms with the huge difference between how people make sense of a crisis, what it means to be a human being in a crisis, let’s put it this way, and how you are judged or misjudged based on your behavior as part of the court proceedings. That part of the investigation is something to think about, Eric. Yes.

This episode of the Safety Guru podcast is brought to you by Propulo Consulting, the leading safety and safety culture advisory firm. Whether you are looking to assess your safety culture, develop strategies to level up your safety performance, introduce human performance capabilities, re-energize your BBS program, enhance supervisory safety capabilities, or introduce unique safety leadership training and talent solutions, Propulo has you covered. Visit us at propulo.com.

That was the third aspect of it. The final aspect, which really worries me, Eric, is this slogan that you can either blame or you can learn. One of the things that comes out of this accident, and again, many other investigations that I've done, is that there is no escape from blame. This is a more than 3,000-year-old ritual that started in Israel, which we call scapegoating. You have to acknowledge that in an accident, in order to give meaning to human suffering, somebody will be blamed. You have to accept that. And once you accept that, you know that there are better areas to allocate your resources, and not to spend too much time on accidents of this nature, because they will never create the desired results that you want. So, you'd be better off putting your resources into other meaningful areas, because you cannot avoid blame, you cannot avoid scapegoating. It's inevitable in an accident. I use the example of the Costa Concordia, but you could easily use any example. And I think this idea that blame fixes nothing needs to be challenged, because if we had not blamed the captain on the night, the whole cruise industry would have come to a complete halt.

Somebody had to be blamed. When the ship capsizes, somebody has to be blamed. That person may or may not have anything to do with the accident, but that's how the idea, the ritual, of scapegoating works. So, these were the four main areas that I concentrated upon. I'm very happy to take your questions, but just to end: in the end, I actually provide a method for how we can learn from accidents. And the basic idea is: stop looking first for improvements in processes. Of course, those things are important, but they are a by-product of many other things. Do not, as a default, start to look for processes, systems, fail-safe systems, technologies to make the world a safer place. And do not turn first to the people who have been involved in the accident. Rather, turn the question around to your own self and say, okay, what sense do I make of this accident, of this misfortune, or whatever has happened? Until we take that question to our own selves, nothing will change. The trouble today in most organizations is that we talk about organizational learning as if somebody else has to learn.

That question has to be flipped around to our own selves: okay, I've been on this site, I've investigated this accident, I've been on this audit, I've done this inspection. What has changed in my world after this? Until we address this question, we are back to the idea of self-realization that I talked about at the start. So, it's not about compliance, it's not about defiance, it's about my own world. What has changed in my own world after being through this experience? That's what the book talks about. I talk about my own self, basically, this journey of discovery and learning, and how it has changed me as a person. To me, that's the most important bit, and we don't talk about it often. Interesting.

I think one of my main takeaways is the conversations you had with the captain in terms of getting to know the person, not just trying to connect dots to get to a pretty report on the back end. I think that is a very key component, particularly with the comment that most of our decisions are subconscious. If it's subconscious, you need to understand the person to get there. I think that makes sense. Where I do disagree is around the idea that blame fixes nothing. What I mean by this is, particularly in the aviation space, a lot of the removing of blame is so that I hear about the near misses, the things that almost happened that I wouldn't have known about without that information. If we think about the near misses, there were very few self-reported near misses prior to removing blame in aviation. I think that's an interesting and very important point: to hear about the things that didn't actually cause an incident, but that we could address. If I think in aviation, we fell asleep, we both fell asleep, things of that nature, which I've shared before, I wouldn't self-report that if the blame would turn on me.

There's generally no consequence if I'm on autopilot and I fell asleep, assuming I'm still on course and it's not a long, extended period of time. But it then allows me to understand what we are doing in the system that allows fatigue to build up or creep up.

The trouble with taking blame as the focus is that invariably, whether you call it a system or an individual, there is something to fix. So, it may not be the frontline person, as you said, not the person who fell asleep. It could be a monitoring technology that failed to give us the desired data. Or maybe this problem could be fixed using a technology or a system or a process or a protocol.

Or training or other pieces. 

Or training, yes. I have no problems with that. Well, I'll leave you with a couple of things to think about. One is that what we are doing is externalizing learning. Instead of taking the person who fell asleep as the victim, now there is somewhere else in the system that needs to be fixed. We are still externalizing the learning. It's a process or a system or a technology or a barrier that failed and needs to be fixed. Now, the trouble is that you can fix this, you can fix that, or you can put another barrier in, and you never know what new problems you might have created as a result.

Yeah, fair.

When you're fixing one problem, you might have created another one. And who knows when it might show up. Potentially, yeah. The other thing is that because you have now allocated the problem to somewhere else in the system, you think that that's the end of it. That's what we call event learning. We learn because we have sorted out the event. And I think that's dangerous, because we still haven't asked the question: how has it changed my view about the accident, about the person? If your view does not change, if your attitude towards failures or towards the fallible person does not change, nothing really moves. I'm not against the idea of fixing things, but quite often what happens in blaming or fixing things is that we end up externalizing the problem, and we never actually take the time to reflect upon how this has changed us as a person. And learning, in the true sense, relates to change. Change, not in the outside world, but in the inside world. And unless that connection between learning and change is clear, nothing changes. So, I'll give you an example. Sure.

If you look at the back cover of my book, there are four monkeys. These four monkeys are basically the result of what happened when I finished the Costa Concordia study. I conducted several workshops around the world. At one point, I even did some work with Todd Conklin, and we did some workshops together. One of the themes that started to emerge from those discussions was that people would respond either by making a joke of the captain during the workshops, or they would finger-point at him for doing something he shouldn't have done, or they would sympathize with him and apologize for his situation, or they would try to suggest some fixing: maybe he should have done this, maybe the company should have done that, and this wouldn't have happened. This is interesting because, as I was doing my research, one quote from the Dutch philosopher Baruch Spinoza struck me, which is: not to laugh, not to lament, not to curse, but to understand. And to me, until you move away from the idea of spending too much time thinking about whether we should blame this person or not, whether we should fix this person or not, laugh at this person or not, and take it back to yourself to say, what have I actually learned from this?

Nothing changes.

But you can combine the two. In the scenario of the pilots that fall asleep, there can be an internal learning, because there could be some element of: I made some choices, I didn't get a good night's sleep, things of that nature that I chose from a career standpoint. That's internalized: I'm not blaming, but there's a self-realization of what my part, my contribution, was. Then there's also the element from a system standpoint: how often is this happening? What measures do we need to have to counteract it? I'm hearing about near misses that otherwise we would never hear about. I can be addressing it at the individual level, the cultural level, and the system level, which I think then addresses the learning piece.

Absolutely, yes. You said something powerful here: if that realization comes from the person himself as part of the interview, or as part of reflection after the interview, that's immense learning. That's liberation. That's healing, actually. When this person comes to tell me, we had a wonderful interview together and you asked some good questions, and as you were asking those questions, this is what I came to realize, that's liberating. You have moved one person. Of course, you have learned something, and that person has learned something. But again, going back to the idea: if you're not asking open-ended questions, if all you have in mind is to fix this person and determine how many processes he breached, or how we can fix the problem through the social context or the technological context, whatever, then there is very little learning, and very little use, if any.

And I think where I would agree is that if you're transferring the blame from the individual to the system, which means I'm transferring it to somebody else, I'm still blaming. Absolutely. And I think that's the one challenge I have with that element. I fully agree: don't blame the individual. But there's an element of learning. Shifting the blame to say, okay, now it's all the senior executives that made the wrong decision, is still blaming. It's just a different person, because system indirectly means other people. I think there needs to be a balance: it needs to be system learning, because nothing happens because of one person, but also learning at the individual level in terms of that realization and how I change.

What I'm suggesting, and I'm not suggesting these things are not important, is that as a default, the first question to ask, from an investigator's point of view, is: what have I learned after conducting this investigation? I think if we don't begin there, nothing changes. Because at the end of the day, the report is someone's view. With your bias, yes. It has to be. There is nothing objective in a report. You can have all the micro details, an enormous amount of data supported by timelines and facts and evidence, I don't take anything away from that. And still, it is somebody's view, somebody's worldview. So that is something important. Unless that person shifts his or her worldview, nothing really changes. If they still see in that accident… And this is interesting, Eric, because when organizations are struck by failures one after another, I think it's time to slow down and ask: are we stuck with our questions? It's so important.

That's a good point. I think that's a key element because, like you said, our decisions are mostly subconscious-driven. So, unless I understand the context, I'm trying to find clues. And as an investigator, I have a bias in what I'm doing, and the organization has a bias in what they're doing. Being able to pull back from that, I think, is really key.

Yes, very important. And the other important thing in this journey is that you cannot escape your belief system. You just cannot. This morning… no, a couple of days ago, I was interviewing somebody involved in an accident. He's a Muslim guy, and he was a frontline worker, though I don't like that word. He was involved in an accident, and he kept saying the same thing. Towards the end, I asked, what have you learned from this experience? And he said, well, it's God's will. What can I do? I can't do anything about it. And here you are, pushing him to follow a permit-to-work system, a job hazard analysis, a toolbox talk. He will do all of that. But does he really believe in it? I think this is something we consistently ignore. In some cultures, the response to an accident is a quest to find a root cause and put in a corrective action. In other cultures, it's God's will; it happened because of God's will. When you employ a multinational crew, you have people from around the world. It is very, very important for investigators to become comfortable with the idea that different cultures see misfortune or accidents differently.

Differently. Very differently. Yeah.

And it’s important to appreciate that. 

I think that’s an interesting point as well. Nippin, fantastic conversation. If somebody wants to get in touch with you, pick up your book, how can they do that?

Eric, my book is available on Amazon and, I believe, in Barnes & Noble. It's called Are We Learning from Accidents: A Quandary, a Question, and a Way Forward, because there is a method at the end that shows how to investigate accidents. Apart from that, I would say I'm active on LinkedIn. I have my own website, nippinanand.com, and my company's website, novellus.solutions.

Excellent. Well, I really enjoyed having you on the show today, and I look forward to maybe continuing the conversation another time.

Sure. It’s a pleasure for me as well. Thank you for reaching out and making the connection.

Thank you for listening to The Safety Guru on C-suite Radio. Leave a legacy. Distinguish yourself from the past. Grow your success. Capture the hearts and minds of your teams. Elevate your safety. Like every successful athlete, top leaders continuously invest in their safety leadership with an expert coach to boost safety performance. Begin your journey at execsafetycoach.com. Come back in two weeks for the next episode with your host, Eric Michrowski. This podcast is powered by Propulo Consulting.  

The Safety Guru with Eric Michrowski

More Episodes: https://thesafetyculture.guru/

C-Suite Radio: https://c-suitenetwork.com/radio/shows/the-safety-guru/

Powered By Propulo Consulting: https://propulo.com/

Eric Michrowski: https://ericmichrowski.com

ABOUT THE GUEST

Dr. Nippin Anand is a former master mariner with a master’s degree in economics, a PhD in Social Sciences and Anthropology and a desire for life-long learning in the wider disciplines of humanities, social psychology and philosophy. After a near collision at sea, he took up a passion for investigating accidents and helping leaders understand the importance of perspective in human failures. As a former subject matter expert at DNV, Nippin also developed an interest in making compliance meaningful for achieving business goals. He is the host of the podcast Embracing Differences, blogs regularly and is recognised both in the research community and across safety critical industries for his ability to make research accessible to businesses and people at work.

For more information: https://novellus.solutions/about-novellus/


RELATED EPISODE

STAY CONNECTED

EXECUTIVE SAFETY COACHING

Like every successful athlete, top leaders continuously invest in their Safety Leadership with an expert coach to boost safety performance.

Safety Leadership coaching has been limited, expensive, and exclusive for too long.

As part of Propulo Consulting’s subscription-based executive membership, our coaching partnership is tailored for top business executives that are motivated to improve safety leadership and commitment.
Unlock your full potential with the only Executive Safety Coaching for Ops & HSE leaders available on the market.
Explore your journey with Executive Safety Coaching at https://www.execsafetycoach.com.

Bringing Human Factors to Life with Marty Ohme


LISTEN TO THE EPISODE: 

ABOUT THE EPISODE

There’s a safety decision behind every chain of events. We invite you to join us for a captivating episode of The Safety Guru featuring Marty Ohme, a former helicopter pilot in the U.S. Navy and current System Safety Engineer. Don’t miss this opportunity to gain from Marty’s extensive expertise and insights on system factors, organizational learning and safety culture, and effective risk management to mitigate future risks. Learn from the best practices of the U.S. Navy, as Marty brings human factors to life with real-world examples that can make a difference in your organization.

READ THIS EPISODE

Real leaders leave a legacy. They capture the hearts and minds of their teams. Their origin story puts the safety and well-being of their people first. Great companies ubiquitously have safe, yet productive operations. For those companies, safety is an investment, not a cost for the C-suite. It’s a real topic of daily focus. This is the Safety Guru with your host, Eric Michrowski, a globally recognized ops and safety guru, public speaker and author. Are you ready to leave a safety legacy? Your legacy success story begins now.

Hi, and welcome to the Safety Guru. Today, I’m very excited to have with me, Marty Ohme. He’s a retired naval aviator, also a system safety engineer. He’s got some great stories he’s going to share with us today around human factors, organizational learning. Let’s get into it. Marty, welcome to the show.

Thank you. I appreciate the opportunity to spend some time with you and share some interesting stuff with your audience.

Yeah. Let’s start maybe with your background and your story in the Navy.

Sure. I graduated from the United States Naval Academy with a bachelor's in aerospace engineering. I've been fascinated with flight and things that fly since a very young age, so that lined up nicely. I went on to fly the H-46 Delta and the MH-60 Sierra. To give your audience an idea of what that looks like, the H-46 was flown for many, many years by the Marine Corps and the Navy. It looks like a small Chinook, the tandem rotor helicopter. Then the MH-60 Sierra is basically a Black Hawk painted gray. There are some other differences, but both aircraft were used for missions primarily for logistics and search and rescue. Then we did a little bit of special operations support. There's a lot more of that going on now, since I retired, than I personally did. Then I also had time as a flight instructor at our helicopter flight school down in Florida.

After my time as an instructor, I went on to be an Air Boss on one of our smaller amphibious ships. Most people think of the Air Boss on the big aircraft carrier. This is a couple of steps down from that, but it's a specialty for helicopter pilots as part of our career. Later on, I went to Embry-Riddle Aeronautical University, which likes to call itself the Harvard of the Skies, to get a master's in aviation safety and aviation management. That was a prelude for me to go to what is now the Naval Safety Command, where I wrapped up my Navy career. I served as an operational risk management program manager and supported a program called the Culture Workshop, where we went out to individual commands and talked to them about risk management and the culture they had in their commands.

Since retiring from the Navy, I work as a system safety engineer at A-P-T Research. We do system, software, and explosive safety. If you want to understand what that means, the easiest way to look at it is that we're at the very top of the hierarchy of controls, at the design level. We sit with the engineers, and we work with them to design the hazards out or minimize the risk within a design. You can do that with hardware, you can do that with software. Explosives is a side to that; I don't personally work in the explosives division, but we have a lot of work that goes on there.

That’s Marty in a nutshell.

Well, glad to have you on the show. Tell me a little bit about organizational culture. We’re going to get into Swiss cheese and some of the learning components, but culture is a key component of learning.

Absolutely. So military services, whatever country, whatever environment, they’re all high-risk environments.

Absolutely. Specific to the Navy, my background, if somebody's hurt far out at sea, it could be days to reach high-level care. It's obviously improved over time with the capabilities of helicopters and other aircraft, but you may be stuck on that ship for an awfully long time before you can get to a high level of care. That in and of itself breeds a culture of safety. You don't want people getting hurt out at sea because of the consequences of that. When I say culture of safety, a lot of people hear culture, and they think about language, like English or Spanish or French, or whatever the case may be; what food people eat, what clothes they wear, those kinds of things. Here, what we mean is how things get done around here. There are processes and procedures, how people approach things, and the general attitude. In fact, the US Navy is in the middle of launching a campaign called What Right Looks Like in order to focus people on making sure they're doing the right kinds of things. Something that's been around the Navy for a long time and is specific to safety is using the word mishap instead of accident.

Sure. Because in just general conversation, most people will think, well, accidents happen. Really, we want a culture where we think of things as mishaps, and that mishaps are preventable. We really want to focus people on thinking about how to avoid the mishap to begin with and reduce the risk produced by all the hazards in that high-risk environment.

In an environment like the Navy, it's incredibly important to get this right. You talked about what right looks like. But you've got a lot of very young people, joining at a very young age, who can make very critical decisions at the other end of the world without necessarily having the ability to ring the President for advice and guidance on every call that happens. Tough decisions can happen at any given point in time. Tell me a little bit about how that gets instilled.

Sure. Organizations have to learn, and they have to learn from mistakes. In these high-risk environments, when something goes wrong, because it will, you need to ask yourself what went wrong and why. That's what leads to a mishap investigation. Then, in order to do that learning, you have to really learn: you've got to apply the lessons that came out of those investigations. That means you have to have good records of those mishaps. I mentioned the Naval Safety Command. Part of the responsibility of the Naval Safety Command is to keep those records and make them useful to the fleet.

Sure. We've just touched a little bit on building a culture of learning and how the Navy does it. Let's talk a little bit about Swiss cheese. We've touched on Swiss cheese a few times on the podcast, so most listeners are probably familiar with it, but I think it's worthwhile to have a good refresher on it.

Absolutely. As I mentioned about having good records, if the records aren't organized well or structured in a way that makes them effective, then it's going to be very difficult to apply those lessons. As an example, take a vehicular mishap, commonly referred to as a car accident, but we're going to use the mishap terminology here. If you have three police officers write a report on a single-vehicle mishap, they're all probably going to come out different. One of them might say the road was wet, one of them might say there was a loss of traction, the third one might say that the driver was going too fast. It's a lot more difficult to analyze the aggregated mishap data if every investigator uses different terms and a different approach. This is where Swiss cheese comes into play, along with the follow-on work. Dr. James Reason provided a construct that you can use to organize mishap reporting with the Swiss cheese model. In his model, the slices of cheese represent barriers to mishaps. He also identified that there are holes in the cheese that represent the holes in your barriers. Then he labeled them as latent or active failures.

Latent failures are existing, maybe persistent, conditions in the environment, and active failures are usually something that's done by a person, typically at the end. His model has four layers of cheese, three with latent failures and one with active failures. No barrier is perfect. If we look at our vehicle mishap in that way, starting at the bottom, let's say it's a delivery driver. They've committed an unsafe act by speeding.

Sure.

Why did they do that? Well, in our scenario, he needs a delivery performance bonus to pay hospital bills because he has a newborn baby. He's got this existing precondition to an unsafe act. Sure. Well, prior to him going out for the day, his supervisor looked at his delivery plan, but he didn't really do a good job reviewing it and didn't see that it was unrealistic. Sure. The thing is that the supervisor sees unrealistic delivery plans every day. It's ingrained in him that this is normal. All these people are trying to execute unreasonable plans because the company pay is generally low, and they give bonuses for meeting targets for the number of deliveries per day. The company, as an organization, has set a condition that encourages people to have unrealistic plans, which the supervisor sees every day and just passes off as everybody does it. Then we roll down and we have this precondition of, I need a bonus because I have bills to pay. This is the way the Swiss cheese model is constructed. A little bit later on, Dr. Shappell and Dr. Wiegmann developed the Human Factors Analysis and Classification System, or HFACS.

They did that by taking Reason's slices of cheese and naming the holes in the cheese, the holes in the barriers, after studying mishap reports from naval aviation.

Tell me about some of those labels that they identified.

Some specific ones that they came up with are things like a lack of discipline, so an extreme violation due to lack of discipline. Sure. That would be at the act level. A precondition might be that someone was distracted, for example. Sure. A supervisory hole would be that there was not adequate training provided to the individual who was involved in the mishap. Then at the overall organizational culture level, it might just be that there's an attitude that allows unsafe tasks to be done. That sets everything up through all the barriers and sets our individuals up for failure and the mishap. You see that in our delivery driver example, where at every level there's a human decision made. There's a policy decision. There's a decision made to accept all these unreasonable plans. There was a decision that, okay, I must have this bonus. Now, you could argue that one back and forth, but there was also a decision made to violate the speed limit, and that's your active one down at the bottom. Yeah.

These essentially built a taxonomy, so that there is more standardization, if I'm hearing you correctly, in terms of incident investigations and classification of learnings.

That's correct. The decisions in this stack and the Swiss cheese come together. As you're alluding to, there's a taxonomy. Shappell and Wiegmann studied, I think it was 80 mishaps in naval aviation, and were able to assign standardized labels. Those labels became the names for the holes in the cheese. Once they put it in that taxonomy, they found 80% of the mishaps involved a human factor of some sort. I personally argue that there's a human factor at every level. Even if you go back and look at something like United Flight 232, which crashed in Sioux City, Iowa, it all rolled back to a flaw in the raw metal that was used to machine the turbine blade that ultimately failed. Sure. Did they make a decision not to do a certain inspection on that block of metal? And then it just keeps going down the line. There's a decision in every chain of events.

Also, no redundancy in terms of the hydraulics, from what I remember in that incident.

Right. A design decision.

A design decision, exactly. That's a great one. I like to use that as an example for many things, but we won't pull that thread too hard today. But all these human factors, all these decisions, are why the US Department of Defense uses HFACS as a construct for mishap reporting; it aids in organizing the mishap reporting and the data so we can learn from our mistakes. It makes actionable data. There are other systems that also have taxonomies. Maritime Cyprus collects data; I ran across it when I was preparing for something else. Their number one near-miss category shows situational awareness as a factor in those events.

Situational awareness is a tough one to change and to drive.

It is. It's a lot of training and a lot of tools and those kinds of things. I bought a new vehicle recently, and it likes to tell me to put the brakes on because it thinks I'm going to hit something, because it thinks it's more aware than I am. It did it to me this morning, as a matter of fact. But it can be an interesting challenge.

Yes. Okay. Let's go through some examples. I know when we talked, you had a couple of really interesting ones: Avianca, Aero Peru. Maybe let's go through some of those examples of human factors at play and how they translate into an incident from an aviation standpoint.

Sure. Avianca Flight 52 was in January of 1990. The aircraft was flying up to JFK out of Medellín, Colombia. The air crew received their information from dispatch about weather and other conditions as they were getting ready to go out on their flight. The problem was that dispatch gave them weather information that was 9 to 10 hours old. They also did not have the information that showed there was a widespread storm causing bad conditions up and down a lot of the East Coast. The other part was that dispatch had a standard alternate they had built for JFK, which was Boston Logan. Boston Logan had conditions just as bad as JFK. They weren't going to be able to use it as an alternate, but they didn't check. Then the air crew didn't check either. They didn't confirm how old the forecast was. They didn't do any of those things. They launched on their flight with the fuel that was calculated to be necessary for that flight. For those who are not in the aviation world, when you're calculating your fuel for a flight, you've got to have what you think you need to get to your destination, plus what you're going to need to get from there to your alternate in case you can't get into your destination.

Then there's a buffer put on top of that. Depending on what rule you're using, it could be time, it could be a percentage. It just depends on what rules you're operating under and what aircraft you're in. So, they have X amount of fuel. They launch out on their flight with 158 people on board. They get up there, and because of the weather, things are backed up at JFK and all the way up the East Coast as well. They get put in a hold near Virginia for quite some time. Then they get put in another hold when they get closer to JFK. They tried to get into JFK, and they had a missed approach. They couldn't see the runway when they did the approach, and they had to go around and go back into holding. The captain, understandably, is starting to become concerned about their fuel state. Sure. He's asking the co-pilot if he has communicated their fuel situation to air traffic control. The co-pilot says, yes, I have. Well, the nuance here is that the international language of aviation is English, and the captain didn't speak English. The co-pilot did, and that met the requirement for one of them to be able to speak English to communicate with air traffic control, but the captain didn't know exactly what the co-pilot was telling air traffic control.

Well, that becomes a problem when the co-pilot is not using standard language. He was saying things like, hey, we're getting low on fuel. That's not the standard language that needs to be used. Correct. You have two phrases. You have minimum fuel, which indicates to air traffic control that you can accept no unnecessary delays. He never said minimum fuel. When they got even lower on fuel, he never used the word emergency. So, air traffic control did not know how dire the situation was. They did offer them an opportunity to go to their alternate at some point, but by then they were so low on fuel that they couldn't even make it to their alternate, and the weather at Boston was too low for them to get in anyway. Ultimately, they had another missed approach. They were coming around to try one more time, and they actually ran out of fuel. They ran the fuel tanks nearly dry on approach, and they crashed the aircraft in Cove Neck, New York.

Wow.

Here we have an aircraft, and you would think that there would be… There's almost no reason for an aircraft to run out of fuel in flight, especially an airliner. But with the conditions that were set, they did. Just as an aside, there were 85 survivors out of the 158, and a lot of that had to do with the fact that there was no fire.

Because there’s no fuel to burn.

Because there's no fuel to burn. I understand it had a positive impact on what materials were used in aircraft later on, specifically cushions and things like that that don't produce toxic fumes when they burn, because they could show that people could survive the impact; it was the fire and the fumes that killed. That's just an aside. That's the overview. If we back up a little bit and talk about what human factors rolled into play here: dispatch had this culture, an organizational culture. Sure. They had a general policy of using Boston Logan as the alternate for JFK. That was just the standard. They didn't even check. They may or may not have been trained properly on how to check the weather and make sure it was adequate for the aircraft to get into its primary destination or its alternate, because the forecast clearly showed that the conditions were too poor for the aircraft to shoot those approaches. That's an organizational-level failure, and you can look at that as one slice of cheese. If we go a little bit further down, without trying to look at every aspect of it, and look at what the pilots did: they didn't check the weather.

They just depended on dispatch and assumed it was correct. Then, once they started getting into the situation they were in, there was communication in the cockpit. That was good, except it was inadequate. More importantly, the co-pilot was the only one in the cockpit who could speak English, so the captain didn't have full situational awareness, which we mentioned a moment ago. Then he failed to use the proper terminology. That was a specific failure on his part. We can't say if that was because he didn't want to declare an emergency because he was embarrassed, which is possible, or because he didn't want to have to answer to the captain, perhaps. Or, if you declare an emergency, ATC comes back and asks later: why did you declare an emergency? Why didn't you just tell us this stuff earlier? We don't have those answers. Unfortunately, those two gentlemen didn't survive the crash. But these are all things that can roll into that. When you break it down into HFACS, these are preconditions: maybe he was embarrassed, maybe he felt there was a power dynamic in the cockpit such that he couldn't admit making a mistake to the captain.

Then he had the active failure of not using the correct language with ATC, the standard air traffic control language.

It feels as if some CRM elements, some psychological safety, were probably at play, because you would expect the co-pilot to at least ask, do you want me to declare an emergency, or something along those lines. Or to seek clarity if you're unsure.

Absolutely. That’s a really interesting one to me. I use it as an example with some regularity when I’m talking about these kinds of things.

This episode of the Safety Guru podcast is brought to you by Propulo Consulting, the leading safety and safety culture advisory firm. Whether you are looking to assess your safety culture, develop strategies to level up your safety performance, introduce human performance capabilities, re-energize your BBS program, enhance supervisory safety capabilities, or introduce unique safety leadership training and talent solutions, Propulo has you covered. Visit us at propulo.com.

How about Aero Peru? Because I think the Avianca one is a phenomenal, really interesting one; actually, one I haven't touched on much before, and a great example of multiple levels of failure. So, how about Aero Peru?

Aero Peru is another one that's really interesting. It had a unique problem. The short version, just to give an overview of the flight like we did with Avianca: Aero Peru was flying from Miami, ultimately headed to Chile, but with stopovers in Ecuador and Peru. During one of those stopovers, they landed during the day, and the plane was scheduled to take off at night. During that interim time, the ground crew washed and polished the aircraft. Then the aircraft launched. They got up a couple of hundred feet off the runway, and the air crew noticed there was a problem with the airspeed and altimeter; they weren't reading correctly. Well, they were already in the air. You can't really get back on the ground at that point. They flew out over the Pacific, and they got up into the cloud. Now they're flying on instruments, so you don't have any outside reference out there. Even if it was clear, flying over the water at night is a very dark place. They got out there, flying on instruments.

Their attitude indication is correct, but they know their altimeter is not reading right and the airspeed is not reading right. There's another instrument in the cockpit called the vertical speed indicator. It also operates off air pressure, just like your altimeter and your airspeed indicator.

Sure.

They're very confused. To their credit, they are aviating. In the aviation world, we say: aviate, navigate, communicate. Because if you stop aviating, stop flying the aircraft, you're going to crash. To their credit, they aviated, they navigated, and they stayed out over the water to make sure that they wouldn't hit anything, because they just didn't know how high they were. Then they started talking to air traffic control. They're very confused by everything that's going on. There is at least one video on YouTube where you can listen to the cockpit recording while they show you what else is going on in the cockpit. We don't have video of the cockpit, but they represent it electronically so you can see it. It's interesting to listen to the actual audio, because you can then hear the confusion and the attempts to make decisions and determine what's going on. Ultimately, they get out over the water. They know these things are not right. They are asking air traffic control, hey, can you tell us our altitude? Because our instruments are not right. The problem with that is that the altimeter feeds a box in the aircraft called the transponder. I sometimes call it the Marco Polo box when I explain it to people, because the radar from air traffic control sends out a ping, like a Marco, and then the box comes back with a Polo.

But the Polo is a number that's been assigned, so they know which aircraft it is on radar and its altitude. Well, the altimeter feeds the altitude to the transponder, so air traffic control can only tell the aircraft what the altimeter already says. But that didn't occur to anybody, and they're under high stress, and this is a unique one. Just as an aside, my only real criticism of the air crew is that you have a general idea of what power settings and what attitude you need for things, and they didn't really seem to stick to that. But we all have to remember that when we're looking at these, we're Monday-morning quarterbacking them, so I don't ding them too hard. At any rate, long story short, they're trying to figure out how to get turned around and go back. They're trying to figure out what's going on. Ultimately, they start getting overspeed warnings from the aircraft telling them they're going too fast, and they're getting stall warnings from the aircraft.

At the same time?

At the same time. They don't know if they're going too fast or too slow. Overspeed is based on air pressure, and obviously all their air pressure instruments are not working properly. But stall warning is a totally separate instrument. It looks like a weathervane. If you walk out to an aircraft across the ramp at the airport today, you may see the little weathervane-looking thing up near the nose. That's what's there for stall warning. They actually were stalling, because they were trying to figure out how to get down and slow down, since they were getting altitude and speed indications that were higher and faster than they wanted. Their radar altimeter, which does not work on air pressure (it actually sends a radar signal down), was telling them they were low. They were getting: I'm high, I'm low, I'm slow, I'm fast. All this information coming at them.

That would be horribly confusing at the same time.

Horribly confusing, and there are alarms going off in the cockpit that are going to overwhelm your senses. There was a lot going on in the cockpit. Ultimately, they flew the aircraft into the water, and there were no survivors. So, what happened here? When they were washing the aircraft, in order to keep water and polish out of the ports called static ports, which measure the air pressure at the altitude where the aircraft is at that time, the ports had been covered with duct tape. Then the maintenance person failed to take the duct tape off. They forgot. When the supervisor came through, they didn't see the duct tape either, because that part of the aircraft looks like bare metal, so it's silver, and against the silver, they didn't see the gray or silver duct tape. The pilots did not see it when they preflighted the aircraft. So, when the aircraft took off, those ports were sealed, and the aircraft was not able to get correct air pressure sensing. Now we have to ask, how in the world did this happen? Sure. Right. If you want to put it in a stack and start looking at slices of cheese, we have to ask these questions.

Why were they using duct tape? Was it because they didn't have the proper plug, which would have had a remove-before-flight banner on it? Did they not have it, or was it just too much trouble to go get it because they'd have to check it out and check it back in? Was this normal? Did they do this all the time? Did the supervisor know that and either not care or think, hey, this is how we get it done around here? That's a cultural piece. Sure.

At least use duct tape that’s flashing red or something.

Something. When you start looking at it in those terms, you have to ask: is there a culture? Was there a lack of resources? Was there not adequate training, so they didn't know they shouldn't use duct tape and it just seemed like the thing to do? Then the supervisor: did he know they were using duct tape? If he did, and it was for one of these other reasons, like resources or whatever the case may be, why didn't he look carefully to make sure the duct tape wasn't there, since he knew they were using it? Did the air crew know that that's how they were covering the static ports? Then, when you get into the stuff with the air crew, they tried to do the right things. As we talked about, it was a very confusing set of circumstances. Like I said, standard attitudes and power settings would have been helpful. This is how these things stack up and how those holes line up in the cheese to give you that straight path for a mishap to occur. It's a pretty interesting example of it.

And multiple points of failure that had to align.

Absolutely.

Because assuming the duct tape was not used just that one time, there were probably many times where it was used before and didn't cause an issue because they removed it prior.

Correct. Correct.

Fascinating example. So, the last one I think you're going to touch on moves from aviation into maritime: the Costa Concordia.

Correct. This was in 2012. A lot of people probably remember the images of the Costa Concordia rolled over on its side, heavily listing, run aground off an island in Italy. This one is truly human from beginning to end. No equipment failed. There was nothing wrong with the ship, nothing along those lines. That's part of the reason it's such a good example here. The captain, or the ship's master, depending on the terminology you want to use, got underway with passengers on board and decided he wanted to do what's called a sail-by, where he would sail close by an island, specifically a town on the island, so that he could show off for his friends and wave at them as he went by.

Always a great idea.

Yeah. The most dangerous words in aviation: watch this. He decided he was going to do this, and he had done it before at the same place. But there were some differences. One, the previous time, it had been planned. He briefed his bridge crew on what was going to happen. They checked all the weather conditions, et cetera, et cetera. It was during the day when he did it the first time. This time it was at night, and he just decided on a whim, as they were on their way out, that he was going to do this. As they're sailing in there, they hit an outcropping as they were approaching the town. It ripped a big old gash down the side of the ship. I think it was about 150 or 170 feet long, if I recall correctly, or about 50 meters. That caused flooding in the ship and a power loss. They ended up as you saw in the photos, and 32 people lost their lives. That's a real brief overview. But what I want to do here is talk a little bit more about what led into it. We've talked very generally about slices of cheese and holes.

Sure. For this one, I'm going to go into a little bit more detail and use some actual HFACS codes: names for the holes and names for the slices of cheese. When you look at the cruise company itself, the attitude there seemed to be that this captain was getting the job done. When that happens in an organization, somebody who gets the job done is regarded a little more highly than people who don't necessarily get the job done. The problem comes when that individual is doing it in an unsafe manner. Maybe they're hiding some things about how they're doing it. They're doing things that are unsafe, but they're getting away with it. You have to watch out for those things in an organization, and for how people may be getting things done. At that level, he was accomplishing things. So organizationally, you have that. Then, in the next slice of cheese, you can call it organizational or supervision, depending on how you want to look at it: they probably didn't provide adequate training. In the aviation world, we use simulators a lot. They're using simulators a lot more in the maritime world now as well, and they can put an entire bridge crew in a simulator together to practice scenarios and practice their coordination.

Well, they hadn't had that with this crew. They failed to provide that training. This captain also had an incident pulling into another port, where he was accused of coming in too fast. If you do any boating at all, or if you go by a lake or whatever, you might see buoys that say no wake zone. Well, the belief is that he pulled into this port too fast, created a wake, and damaged equipment or ships. There weren't any real serious consequences for him on that. So, they may have failed to identify or correct risky or unsafe practices. Sure. And if they didn't identify it, then they didn't retrain him. Now they've failed to provide adequate training for him and failed to provide adequate training for the bridge crew as a whole. So we've hit organizational with the culture, and we've hit supervision with the training on safe practices. Now we go into the preconditions at the next level. Complacency: he decided on a whim, essentially, that he was going to do this sail-by, so he didn't check the conditions, those kinds of things. He didn't consider the fact that it was…

We'll get back to that one in just a second. Let's see. Partly, maybe, because the crew didn't have the training in one of these bridge simulators, there was a lack of assertiveness from the crew members toward him. That may have been because he was known to be very intimidating. He would yell at people when he didn't like the information they gave him or when he thought they weren't correct. Rank and position intimidation is one of our holes. Lack of assertion is a hole. Complacency: he didn't think this was a big deal. And distraction, and this one's very interesting to me personally. One, he's on the bridge wing. If you look at a ship, you usually have the enclosed bridge, and outside of that you've got a weather deck, where you can see further out, those kinds of things. He's standing on the bridge wing, on the weather deck, talking on his phone to one of his friends ashore: hey, look at us, we're coming by, just get ready, here we come. Then part of the distraction was that there were ship's guests on the bridge wing with him, which was a violation of policy when they were in close proximity to shore.

And he had his girlfriend. Excuse me, his mistress. He was married, he was having an affair, and he had his mistress on the ship with him, in violation of policy. So, he had all this distraction going on, in addition to the fact that he just thought of this as no big deal. Now we've covered three slices of cheese, so let's get to the last one: the acts. We have an extreme violation, lack of discipline, where we talked about all these preconditions, and those are examples of lack of discipline as well: he failed to focus on what he was doing, allowed these distractions on the bridge, et cetera. And inadequate real-time risk assessment: day versus night, I checked the weather, I didn't check the weather, et cetera. In this case, we've taken the codes, the names of those holes in the cheese, and applied them to this specific case. There's a whole lot of stuff with this one. There's a reason that mishap reports are hundreds of pages long. But this one comes down to these examples of codes where he violated all these things. And that was just before they actually had a problem.

It got worse after that, if you all are familiar with that case. Yeah.

Well, a phenomenal story, but very applicable to other industries, because there are a lot of other industries where somebody is known for getting it done and might be doing some risky things in getting it done; there just hasn't been an event or a mishap, and people are not paying attention to how you actually got the job done. Or in the case of the delivery driver you were talking about: maybe he historically got it done by cutting corners, and they just decided not to look at that corner-cutting.

Right.

Right. Fascinating. So, a really good illustration, I think, in terms of culture, learning, and then Swiss cheese in terms of how different layers come together. Swiss cheese is not cheddar cheese. It has holes in it. It's just that those holes can line up at any given point in time. They're already there.

Right. That's where the latent versus active conditions come in. In the case of the DOD and HFACS, you have the organizational, supervision, and preconditions layers. Those are all your latent layers, and then your active layer is that last one. In this case, that's where the extreme violations and the inadequate real-time risk assessment occurred.

I think the part I also like about Swiss cheese is that it forces people to look beyond the aviator, beyond the ship's captain, beyond the team member in an organization who makes a mistake, to the latent conditions that are linked to decisions the organization has made over time. People in finance, people in HR, people in a corporate office are making decisions without necessarily connecting them to how they impact somebody in the field. We don't know about Aero Peru, but maybe it was even somebody in procurement who forgot to buy the proper tools, and you use what you have because you've got to get the job done. A lot of conditions impact other people in the organization. I think that's another reflection in Swiss cheese for me.

Absolutely.

Great. Any closing thoughts that you’d like to add?

Sure. Just a couple of things. Aviators are, on the whole, willing to admit their mistakes. It's because we know that it's a very unforgiving environment. The ocean and aviation are very unforgiving environments. As an attitude, as a culture, we want to share with others so they either don't make the same mistake we did, or they understand how we got out of a situation. If you look at Aero Peru, I mean, seriously, has anybody else ever had that problem, where there's duct tape over the static ports? I don't know. Never heard of one. Yeah. By sharing this story, we have the ability to help others avoid that situation in the future. That's really the way we do it. The second thing that's big in aviation is that the way we really made big improvements in our safety and mishap record is by planning and talking about these things. Somewhere later, somebody came along and named this the PBED process: planning, briefing, executing, and debriefing. But we've been doing it for decades. You actually have a flight plan. You may not execute to that plan specifically, but at least you have a plan to deviate from, I like to say.

Sure. Then you brief it so that everybody understands what's going on. Then obviously you go and execute it, and you may have to make changes along the way. That's fine. When you come back, you debrief it. Hey, we had this mission. Did we accomplish it? Did we have any problems? What did we do well? What did we not do well? So that we can improve later. That really helps in a lot of ways, in a lot of industries or situations, if you just talk about what you're going to do, plan it out, and make sure everybody understands. When you plan it, if you have the right people involved, they can come up with solutions to problems that you see in planning. They may identify a problem that you can avoid in the planning stage instead of running across it in the execution stage. So that planning, briefing, executing, debriefing is a really useful thing to have. Something that can be transposed into any other industry as well in terms of really thinking through the planning.

I think your point around voluntary reporting is huge, because having been in aviation, you hear about things that people would rather not talk about: I fell asleep, things of that nature. But if you don't know about it, you can't do anything about it, because unless the plane crashed, you would have no knowledge that both pilots fell asleep unless they went off course dramatically. Chances are nothing's going to happen, because they're going to be on autopilot and it's pre-programmed and all good. But if you know something's happening, you can start understanding what conditions could be driving it.

Right. Absolutely.

Excellent. Well, Marty, thank you so much for joining me today and for sharing your story. A rich, interesting, and thought-provoking story with really good examples. Thank you.

Happy to be here.

Thank you for listening to The Safety Guru on C-suite Radio. Leave a legacy. Distinguish yourself from the pack. Grow your success. Capture the hearts and minds of your teams. Elevate your safety. Like every successful athlete, top leaders continuously invest in their safety leadership with an expert coach to boost safety performance. Begin your journey at execsafetycoach.com. Come back in two weeks for the next episode with your host, Eric Michrowski. This podcast is powered by Propulo Consulting.

The Safety Guru with Eric Michrowski

More Episodes: https://thesafetyculture.guru/

C-Suite Radio: https://c-suitenetwork.com/radio/shows/the-safety-guru/

Powered By Propulo Consulting: https://propulo.com/

Eric Michrowski: https://ericmichrowski.com

ABOUT THE GUEST

Marty Ohme is an employee-owner at A-P-T Research, where he works as a System Safety Engineer. This follows a U.S. Navy career as a helicopter pilot, Air Boss aboard USS TRENTON, and program manager at what is now Naval Safety Command, among other assignments. He uses his uncommon perspective as both engineer and operator to support the development of aerospace systems and mentor young engineers. Marty holds a Bachelor of Science from the United States Naval Academy and a Master of Aeronautical Science from Embry-Riddle Aeronautical University. He may be reached through LinkedIn.

For more information: https://www.apt-research.com/

RELATED EPISODE

STAY CONNECTED

EXECUTIVE SAFETY COACHING

Like every successful athlete, top leaders continuously invest in their Safety Leadership with an expert coach to boost safety performance.

Safety Leadership coaching has been limited, expensive, and exclusive for too long.

As part of Propulo Consulting’s subscription-based executive membership, our coaching partnership is tailored for top business executives that are motivated to improve safety leadership and commitment.
Unlock your full potential with the only Executive Safety Coaching for Ops & HSE leaders available on the market.

Explore your journey with Executive Safety Coaching at https://www.execsafetycoach.com.
Executive Safety Coaching_Propulo

Cracking the Code: The Human Factors Behind Organizational Failures with Martin Anderson

Cracking the Code The Human Factors Behind Organizational Failures

LISTEN TO THE EPISODE: 

ABOUT THE EPISODE

You don’t want to miss our latest episode of ‘Cracking the Code: The Human Factors Behind Organizational Failures’ on The Safety Guru. Join us as Martin Anderson, a renowned expert on human factors and performance, shares his valuable insights and examples about the human factors behind organizational failures. Learn how to effectively and constructively embed lessons learned in your organization.

READ THIS EPISODE

Real leaders leave a legacy. They capture the hearts and minds of their teams. Their origin story puts the safety and well-being of their people first. Great companies ubiquitously have safe yet productive operations. For those companies, safety is an investment, not a cost, for the C-suite. It’s a real topic of daily focus. This is The Safety Guru with your host, Eric Michrowski, a globally recognized ops and safety guru, public speaker, and author. Are you ready to leave a safety legacy? Your legacy success story begins now.

Hi, and welcome to The Safety Guru. Today I’m very excited to have with me Martin Anderson, who’s a human factors expert. We’re going to have a really interesting series of topics of conversation today. He’s got a deep background in human factors across oil and gas regulatory environments. His passion is really to understand how people perform in complex systems and also, ultimately, why organizations fail. So, Martin, welcome to the show. Really excited to have you with me. Let’s get started with a bit of an introduction.

Yeah, thank you very much, Eric, and certainly, thank you for having me on the show. It’s a real privilege to be invited here. Yeah, so in terms of my background, I started off with a psychology degree, and then I did a master’s in human factors. And after a few years of work experience, I followed that up with a Master’s in Process Safety and Loss Prevention. I’ve been a human factors specialist for over 30 years now. I’ve worked for a couple of boutique consultancies. I’ve been a regulator working as a specialist inspector in human factors for the UK Health and Safety Executive. I spent a few years as a human factors manager in an oil and gas company. I spent a lot of time assessing existing installations but also had input into the design of new facilities, working on 40, 50-billion-dollar mega projects. And over that time, I visited over 150 different oil, gas, and chemical facilities, both onshore and offshore, which gave me quite an insight into how some of these major organizations operate. And one of the reasons I created the website humanfactors101.com, was to share some of those insights. The other thing I’d like to talk about is going back 30 years, right to the start of my career.

I read a document called Organizing for Safety, published by the UK Health and Safety Executive in 1993. There’s a quote from that document I would like to read out because it had a huge impact on me at that point. It goes like this: different organizations doing similar work are known to have different safety records, and certain specific factors in the organization are related to safety. So, if we unpack that quote, it really contains two statements. First of all, different companies doing the same things have different safety records. And secondly, perhaps more importantly, there are specific factors that could explain this difference in safety performance. And I thought this was amazing. I thought if these factors could be identified and managed, then safety could be massively improved. And over the next 30 years or so, one disaster at a time, these organizational factors have revealed themselves in major incidents, which I guess we’ll come to in a moment.

I think that’s a great topic to get into. So why do organizations fail? Because I think when we had the original conversations, I was fascinated by some of your connections between multiple different industries and common themes that were across all of them.

Yeah, sure. What might be helpful, first of all, because you introduced me as a human factors specialist, is to briefly define what we mean by human factors, and then we’ll go into looking at some of the organizational incidents, if that’s okay. Sure. For me, human factors is composed of three main things. We’re really looking at, first of all, what people are being asked to do. That’s the work they’re doing. Secondly, who is doing it? This is all about the people. And thirdly, where are they actually working? Which is the organization. So ideally, all three of these aspects need to be considered: the work, the people, and the organization. But my experience is that companies tend to focus on just one or two of these, usually the people one. Within the UK HSE, our team defined human factors as a set of 10 topics, which has become widely known as the top 10 and is used by industry, consultants, and regulators worldwide. Because prior to that, we would turn up to do an inspection and say, we’re here to inspect your human factors. And they were like, I don’t know what you mean. How do we prepare for that?

Whom do you want to speak to? What do you want to go and look at? So, after creating that top 10, we were able to say: the agenda for the inspection is that we want to come and look at how you manage fatigue, we want to come and look at your supervision arrangements or your competency assurance system. So, this helped to operationalize human factors. The other way into human factors, really, is through human error. A lot of people come to human factors through human error; they hear about human error. But if we identify human error, we need to understand how and why it occurred and not simply blame people. Are we setting people up to succeed? Are we setting them up to fail? Are we providing systems, equipment, and an environment that support people to do the work that we’re asking them to do? And to introduce, as we move towards talking about organizational failures, I’d like to read a quote from Professor James Reason, a psychologist at the University of Manchester. This quote is about 25 years old, but it’s still one of my favorites. Reason said that rather than being the main instigators of an accident, operators tend to be the inheritors of system defects created by poor design, incorrect installation, faulty maintenance, and bad management decisions.

Their part is usually that of adding the final garnish to a lethal brew whose ingredients have already been long in the cooking. And I think that’s a really good introduction to our discussion on organizational failures.

So, let’s go there because we had a really interesting conversation on organizational failures and some of the common themes. So, what are some of the common themes, and why do organizations fail?

Exactly. When you say, why do organizations fail, let’s just think about a few of those from different industries, because these organizational disasters have occurred to the NASA space shuttles, the Herald of Free Enterprise ferry disaster at Zeebrugge, the King’s Cross fire, Piper Alpha, Chernobyl, Texas City, Buncefield, Deepwater Horizon and the Macondo well, lots of different rail incidents around the world, and several so-called friendly fire events. And there have also been organizational disasters in sectors such as healthcare and finance. In the UK, these include inadequate care during children’s heart surgery at the Bristol Royal Infirmary over a 10-year period. And, of course, most listeners will be familiar with the so-called rogue trader who caused the collapse of Barings Bank. So, there have been so many disasters in so many different industries. And I know when we had a conversation earlier, what we were considering was that, okay, they’re all in different industries, but there are lots of common themes that we could pull out of those, from the space shuttles to Barings Bank, for instance.

So, what are some of the themes? Because I think the part that really caught my attention is the activity you’ve done where you took the facts from a different event, mixed them up with another, and people thought it was something different. Tell me a little bit about that story.

Yeah. So, the example there was that… I don’t know if listeners are familiar with the Nimrod disaster. This goes back to 2006. Nimrod was a reconnaissance aircraft on a routine mission over Afghanistan. Shortly after air-to-air refueling, there was a fire, which led to the loss of the aircraft and, sadly, the 14 service personnel on board. I was asked to get involved and advise that investigation. And as I started to read some of the initial information from that investigation, I started to think, this sounds just like another incident I’m really familiar with, which was one of the shuttle incidents, the Columbia incident. So I put a presentation together, and on one side of the slide, I put the information from the Nimrod incident, and on the right-hand side of the slide, I put information from the Columbia incident. Then I went through several of the issues that were involved, and I produced this PowerPoint presentation, and I mixed up the left and right sides, and I didn’t say which was which. And when we showed it to the investigation team, they couldn’t determine which information came from the incident they were investigating, the Nimrod incident, and which came from the shuttle Columbia incident many years previously.

It just showed that with two very different incidents, in different industries, in different locations, with different people, the organizational issues were almost identical. That was quite powerful, the fact that people couldn’t tell the difference between the facts from one and the facts from the other, because the causes overlap so much. When you look at the very detailed technical level, there are differences between these events. But when you really start looking at the deeper, broader organizational issues, there are so many similarities.

What are some of the themes in general that you’ve looked at? You mentioned Barings Bank, which sounds very different from Piper Alpha. What are some of the common themes?

It does. You think, what has the failure of a 100-year-old bank got to do with the failure of an oil refinery or an offshore oil platform or any of the other incidents that we’ve spoken about? People and organizations fail in very similar ways. The findings from these disasters are getting quite repetitive because you’re seeing the same things over and over. When you look at all of these incidents and pull out some of the main themes, what are the things that we’re seeing? Because the important thing is that we can go and look for these in an existing organization. You see things like a lot of outsourcing to contractors without proper oversight. In the nuclear industry, we call that not having intelligent customer capability, because they don’t know what the contractors are doing; they can’t explain what the contractors are doing. Then you’ve got inappropriate targets or priorities or pressures, because in almost all of these cases there were significant production pressures, whatever production means for your organization. Another key issue that you see almost every time is a failure to manage organizational change. And by that, I mean a failure to consider the impact of that organizational change on safety.

So, a lot of organizations are going through almost a tsunami of changes without really considering how that impacts how they manage safety, or without considering that each of those separate changes has a cumulative effect which is more powerful than the individual changes. You also see a lot of assumptions that things are safe. Even with evidence to the contrary, organizations assume that everything is safe rather than going and looking for information, rather than challenging, rather than having a questioning attitude. Organizations are pretty bad at looking for bad news and responding to bad news; they don’t want to hear bad news. So in almost all of the incidents that we’ve spoken about, it wasn’t a complete surprise to everybody in the organization. There were people in the organization who knew things were going wrong, that they were getting close to the boundaries of safety, but they either couldn’t get that information heard by the right people, or people didn’t react or respond to it. So it’s really interesting: when you read the detailed investigation reports, there are always people who knew that things were going wrong. That information is available in the organization.

And I think that’s a good thing because that means that, hey, this is good. We can proactively do something about this. We can go and look for some of these things. So the things that I mentioned there, and there are a lot more, Eric, that we could talk about. There are lots of organizational issues we could proactively go and look for because these incidents are devastating for the people involved, for the organizations involved, but they’re a free lesson for everybody else. Sure.

If you choose to learn from them, and if you choose to see the analogy between a space shuttle, Nimrod, and Barings Bank, and whatever industry you’re in.

Yeah, exactly. Because you have to go looking for those issues, for those factors, in your organization. So, there are two or maybe three things you mentioned there. You need to go looking at other incidents. You need to take the lessons from those. You need to go and look for them in your organization, and you need to act on that. This failure to learn from other industries, for me, is perhaps the greatest organizational failure of all. Organizations think, well, it doesn’t apply to me, because that was in a children’s hospital, or that was a bank, or that was an offshore platform. What’s that got to do with me in my industry? Failure to learn those lessons is the biggest failure, because you can get away from the technical specifics of the incident and just try to look at the deeper organizational issues. But who in organizations is doing this, Eric? Which person, which role, which part of the organization goes looking for these events, draws the lessons, and then goes and challenges their own organization? It’s actually quite difficult to do that. It’s like the problem with safety more generally, isn’t it? You can go into a boardroom and pitch a new product to a new market, and people will give you money and listen to you.

But going in and pitching that you want to spend money to protect and safeguard the installation against things that may or may not happen in the future is a much harder sell. It’s a problem for safety more generally.

One of the things I know we talked about was what you call an organizational learning disability: people are good at investigating, but not at truly learning, at embedding the change. I’ve seen this many times, where people learn the same lesson over and over.

And that’s it. When we have these large investigations into these disasters, there’s always this proclamation that this must never happen again, and we need to learn the lessons. And then something else happens a year or two later in a different industry, but the same issues. So, you talked about a learning disability. Why do organizations fail to learn? Given that, there’s this wealth of information out there available as to why organizations fail. For me, I think there are two issues. I think there’s this failure to learn from other industries. All industries think they’re unique. They don’t think that they can learn because it’s a totally different industry. It’s nothing to do with them. But they all employ the same kinds of people. There aren’t different people working in different industries. They all employ the same people. They organize themselves in very similar ways, and they have the same targets and priorities and so on. So, first of all, that assumption doesn’t apply to me. It’s a different sector. So, failure to learn from other industries, we’ve spoken about, but failure to learn from your own investigations. And we see this in major incidents like NASA failing to learn from the previous incidents it had.

So, you have the Mars orbiter and the failure to learn from that. You have Challenger, then Columbia, and so on. What we find is that there’s a lot of sharing but not enough learning. After an incident, a safety bulletin is put together, it goes on the intranet, there might be a bit of a rollout, and so on. But if you’re not changing something, you’re not learning. Something in the organization has to change for a lesson to be embedded. And you need to go back and confirm that you’ve changed the right thing. You can’t just change something and assume everything will be okay. So if you’re not changing anything structurally in the organization, or in one of the systems or one of the processes, then you’re not embedding the learning. That’s the first thing: this failure to embed the lessons that you come up with. I think the other problem is that investigations are not always of great quality. They’re not identifying the right issues. They may not be getting to the root causes. They might focus on human error. They might focus on blame. Investigations that are done by external bodies are generally starting to look at these organizational issues.

But investigations done internally by organizations into their own events rarely confront organizational failures. It’s very challenging for the investigation team to raise issues that suggest there are failures at the leadership level. It’s challenging for the investigation team, and it’s challenging for the leadership to receive that information. So quite often the recommendations and the actions are all aimed at employees, a bit like a lot of safety initiatives, behavioral safety, safety culture, and so on, which are quite often aimed at the front-line workforce rather than the whole organization. We often see that in investigations as well: they’re not challenging these organizational issues, whether that’s because of a lack of understanding or because it’s not accepted by senior leadership. And the people doing these investigations aren’t always competent. I mean that in the nicest possible way. They don’t have the right experience, or they’re not given enough time, or it’s seen as a development opportunity. So, investigations need to have the right people doing them, asking the right questions, in order to get the right recommendations out of them. Because if the process isn’t right, you’re not going to get the right recommendations coming out of it.

So, what are you going to learn if you haven’t got to the real issues? So yeah, I think there are two issues there: failure to learn from other industries, but also failure to learn from your own investigations. And we can talk about some tips that could maybe help organizations get to some of those organizational issues when they’re doing investigations. Absolutely. And it would also be useful to talk about how you can go and look for some of these organizational issues before you actually have an incident, which is what we want to get to. We want to learn, but we don’t want to have to have incidents in order to learn. So why can’t we learn proactively, without having an incident in the first place?
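Martin’s test for embedded learning (something must change, and the change must be verified) can be modeled as a small status machine. Below is a minimal Python sketch; the statuses and class names are editorial assumptions, not terminology from the episode.

```python
from enum import Enum


class Status(Enum):
    IDENTIFIED = "identified"  # lesson written down after an investigation
    CHANGED = "changed"        # something structural was actually changed
    VERIFIED = "verified"      # the change was confirmed to be the right one


class Lesson:
    def __init__(self, description: str):
        self.description = description
        self.status = Status.IDENTIFIED
        self.what_changed: str | None = None
        self.evidence: str | None = None

    def record_change(self, what_changed: str) -> None:
        self.what_changed = what_changed
        self.status = Status.CHANGED

    def verify(self, evidence: str) -> None:
        # A lesson only counts as learned once the change is confirmed to work.
        if self.status is not Status.CHANGED:
            raise ValueError("Nothing has been changed yet, so nothing to verify")
        self.evidence = evidence
        self.status = Status.VERIFIED


# Hypothetical example of one lesson moving through the loop.
lesson = Lesson("Contractor work not reviewed by a competent in-house engineer")
lesson.record_change("Added an intelligent-customer review step at contract close-out")
lesson.verify("Last three contracts show the review was done and findings actioned")
print(lesson.description, "->", lesson.status.value)
```

The design choice mirrors the point in the conversation: a lesson that never leaves the "identified" state is sharing, not learning.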

This episode of The Safety Guru podcast is brought to you by Propulo Consulting, the leading safety and safety culture advisory firm. Whether you are looking to assess your safety culture, develop strategies to level up your safety performance, introduce human performance capabilities, re-energize your BBS program, enhance supervisory safety capabilities, or introduce unique safety leadership training and talent solutions, Propulo has you covered. Visit us at propulo.com.

Let’s start first in terms of how you can identify some of these organizational factors through the investigation process.

Through that investigation process, what you’re really trying to do to get to the organizational issues is to zoom out from the detail, taking a helicopter view. You’re zooming out and looking down, trying to see the bigger picture. For example, most people who’ve done an investigation will have put together a timeline: a list of what happened, to whom or to what equipment, and when, and then drawn a timeline to start mapping what happened. But the problem is that a lot of those timelines start on the day of the event. What I’d propose is that your timeline goes back weeks, months, or even years before the event occurred. You’re trying to identify what might have changed in the organization in that period in terms of equipment, processes, people, priorities, the direction the company was going, and so on. Your timeline needs to go way back, because of the organizational issues that we see in all of these events. These events didn’t just occur overnight. As Reason said in that quote, there was trouble brewing for weeks, months, and years beforehand. So there are indications in the organization, and your timeline needs to go back and look for those issues.

That automatically forces you to think not just about the actual incident but more widely about your organization. The other thing you can really do is review previous incidents that have occurred, or other sources of data, maybe audits or regulatory inspections or staff surveys. You’re trying to identify common threads and trends, and you’re trying to identify how long these conditions have existed and how extensive they are across the company. Why did this event surprise us? Because, as I say, the information is normally available in the organization. So why did this come as a surprise? You’re looking not just at individuals; you should be looking at systems and processes, and your mindset as an investigator should be thinking about the organizational conditions: what was the context in the organization that set people up to fail? Going back way before the incident is quite a helpful change of mindset for people, rather than just asking, okay, what happened on this day, and thinking about how you responded to the incident. It’s quite a useful tool to help you think more about organizational issues.
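As a concrete illustration of that extended timeline, the sketch below (Python; the entry kinds and sample records are hypothetical) simply merges incident-day events with organizational changes from months or years earlier and sorts them by date, so the investigation view no longer starts on the day of the event.

```python
from datetime import date
from typing import NamedTuple


class TimelineEntry(NamedTuple):
    when: date
    kind: str  # e.g. "org-change", "warning-sign", "incident-day"
    note: str


def build_timeline(entries: list[TimelineEntry]) -> list[TimelineEntry]:
    """Order everything by date so pre-incident conditions sit in plain view."""
    return sorted(entries, key=lambda entry: entry.when)


# Illustrative entries only; real ones would come from audits,
# management-of-change records, staff surveys, and previous incident reports.
timeline = build_timeline([
    TimelineEntry(date(2023, 1, 9), "incident-day", "Loss of containment during start-up"),
    TimelineEntry(date(2021, 3, 1), "org-change", "Downsizing removed two maintenance roles"),
    TimelineEntry(date(2022, 6, 15), "warning-sign", "Audit flagged overdue equipment repairs"),
])
for entry in timeline:
    print(entry.when.isoformat(), f"[{entry.kind}]", entry.note)
```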

And how broad do you go? Because when you start zooming out to years before, to decisions, changes in leadership, changes in investment, you can open up a very big can of worms. And I can see, if it’s Deepwater Horizon or Piper Alpha, that there’s a need to go deeper. But how deep and how wide do you cast the net? Because I think it’s incredibly important, like you said. Otherwise, you just limit it to the person who made a mistake, as opposed to starting to understand what’s changed in the environment, the context.

It’s a lot easier in those big disasters to do that, because they’ll have a huge team of people on those investigations. Some of them have taken five, six, eight years. They have the time and the resources. In an organization, you generally don’t have that much time to do an investigation. Quite often, the people doing it have other jobs, so they want to get back to the day job. That’s one of the reasons why investigations are quite compressed in time: most people are not full-time investigators. So, how far you want to go back depends on the incident that you’ve had. But looking at whether those conditions exist in other facilities or workplaces is a useful step that can really help you identify whether this is unique to this scenario or a systemic issue that you have in your organization. And I think you should go back and look at what might be key issues: if you’ve had a merger or an acquisition, a major change in your direction, a new product, or you’ve opened a new facility, those major organizational changes; or if you had a downsizing exercise two years ago and since then there have obviously been issues in terms of staffing and resources, then those are the key things you need to be mapping out.

As you say, you can’t map everything, but you’re looking for key significant changes or events or shifts in priorities or policies that might have occurred in the previous years. And I guess the time and effort that you spend in that partly depends on the consequences or the potential consequences of the event that you’re looking at.

But there’s still an element of being able to focus the conversations, like you just said, on the major shifts that happened, as opposed to unearthing every piece. You’re still rewinding the movie further back. The other part I think is interesting to explore is what you talked about in terms of how we identify and explore some of these organizational factors before something happens. You mentioned that in all the incidents we’ve talked about, somebody knew something was up beforehand. So how do we identify these themes before a major event?

Yeah, you’re right there, Eric. I think there’s always information available; it’s just maybe not getting to the right people, or people aren’t taking action on it. These warning signs, red flags, whatever you want to call them, go unnoticed, are ignored, or don’t get to the right person, because, as we’ve said, these incidents incubate over a long period of time. Those warnings accumulate. And that’s a great thing, because it means we have an opportunity to go and look for them and to find them. So, first of all, you should have a means for people to raise concerns in an independent, confidential way, some reporting system, so that those concerns are coming to you. That’s one mechanism. Some industries are much better than others at having confidential reporting systems where people can safely report a near miss, an error, or a challenge or frustration they’re having. And that gives the organization an opportunity to do something about it. You’ve got to have the right culture for that, of course, because if your previous investigations blamed individuals, then people are not going to come forward, because they’ve seen what’s happened to other people.

So, they’re going to keep quiet, and these things get brushed under the carpet. So, it does depend on the culture that you’ve got. But having an independent, confidential way for people to raise those issues can be quite useful. So that allows issues to come to you. But you also need to go looking for these issues as well.
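One way to picture the independent, confidential channel Martin describes is an intake step that never stores who reported, only what happened and the conditions around it. Here is a minimal Python sketch with hypothetical field names; a real scheme needs far more than data design, including the culture he mentions.

```python
import uuid
from dataclasses import dataclass


@dataclass(frozen=True)
class NearMissReport:
    """A de-identified near-miss record. Field names are illustrative."""
    report_id: str   # random ID instead of name, role, or shift
    narrative: str   # what happened, in the reporter's own words
    conditions: str  # the context that set the person up, not who they were


def intake(narrative: str, conditions: str) -> NearMissReport:
    # Assign a random identifier so the report cannot easily be traced
    # back to the reporter.
    return NearMissReport(uuid.uuid4().hex, narrative, conditions)


report = intake(
    narrative="Nearly opened the wrong valve during changeover",
    conditions="Two identical unlabelled valves; end of a 12-hour shift",
)
print(report.report_id[:8], "-", report.narrative)
```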

Yeah, I think that’s important.

Organizations have had quite a few events. So, do they investigate them individually, or do they try to join the dots between different incidents? They might appear unrelated, but are they? Are you starting to accept conditions or behaviors that you wouldn’t have accepted a few years ago? People’s risk acceptance can change over time. Are you contracting more out? And do you really understand the technical work that those contractors are doing? Can you explain it? Can you challenge it if necessary? Are you having lots of budget cuts? If the conversation is always around targets, budget challenges, a focus on efficiencies, productivity initiatives, and so on, that’s a really good red flag. Are you starting to focus more on temporary fixes? Are you patching equipment? Are you stretching the life of equipment rather than investing in permanent solutions? Are you maybe reacting to things rather than predicting and planning ahead? Now, organizations do lots of safety-related activities, and previous podcasts have talked about safety work and the work of safety. But if organizations start to see the completion of safety activities as more important than whether they’re effective, that’s quite often a big warning sign as well.

Companies are doing risk assessments, investigations, audits, and writing a safety case, if that applies to your industry. And if completing those, getting them done, is more important than using them as a learning exercise and asking whether they’re effective, that’s also a trigger for the organization. So, there are things you can go looking for. One of the biggest things for me, because there are lots of questions we could ask, is that if you assume that your assessment of these major risks is incorrect and proactively seek information to continuously revise that assessment, you’re more likely to pick up these issues. Whereas if you assume that everything’s okay until it isn’t, it’s too late at that point. Organizations are getting more mature in their approach to investigations, but that maturity hasn’t carried over to proactively looking for issues. We’re getting better and better investigations, but we don’t want to have incidents to investigate. There are tools and techniques, ways you can proactively look in your organization to find these issues. The maturity of investigations just hasn’t translated over to proactively going and looking for things.

There are lots of reasons why that might be the case.
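Martin’s warning signs read naturally as a self-review checklist. The sketch below phrases them as yes/no questions in Python; the wording and the simple surface-whatever-is-flagged logic are editorial assumptions, not a validated assessment instrument.

```python
# Warning signs from the conversation, phrased as yes/no review questions.
RED_FLAGS = [
    "Significant outsourcing without intelligent-customer oversight",
    "Targets, budget pressures, and efficiency drives dominating the conversation",
    "Organizational changes made without assessing the impact on safety",
    "Assuming things are safe despite evidence to the contrary",
    "Bad news not reaching, or not moving, the right people",
    "Temporary fixes and life-extension instead of permanent repairs",
    "Completion of safety activities valued over their effectiveness",
]


def review(answers: dict[str, bool]) -> list[str]:
    """Return the flags answered 'yes' so each can be investigated further."""
    return [flag for flag, present in answers.items() if present]


# Hypothetical self-assessment.
answers = {flag: False for flag in RED_FLAGS}
answers["Temporary fixes and life-extension instead of permanent repairs"] = True
for flag in review(answers):
    print("Investigate:", flag)
```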

I think it’s an interesting point, because the other element that comes to mind is that if an incident has happened, it’s clear who owns the investigation. But who owns this proactive view? In some organizations it could be audit, but audit is not always equipped to do it. I know of one organization where audit did an audit of safety, and their focus in terms of driving safety improvement was to find ways to get employees back to the office faster, which has no impact on safety. But from a financial standpoint, if you don’t have expertise in what safety means, that might sound like a viable solution to reduce a rate, right? It could be your safety organization, but that safety organization needs to have the right visibility. It could be some form of red team that’s constantly looking for challenging pieces. What have you seen be most effective in terms of where this resides, and the practice of kicking the tires on what you’ve got?

I think part of the issue there, as I alluded to earlier on, Eric, is that this just isn’t a formal role within organizations. The departments that you mentioned quite often don’t have the expertise, the experience, or the time to go and look for these issues proactively. The audits and investigations are all quite constrained in their agenda, and so on. So I don’t know of a good example of a function in an organization that proactively goes looking at these areas. You do have risk committees and audit committees, whether you’re looking at the financial sector or at oil and gas. I think there are pieces of the puzzle held by different people within an organization that can contribute to the review we’re talking about. But I don’t think there’s really good practice out there of how that’s pulled together into a cohesive, proactive, challenging effort to go and look at whether we have any of these issues, particularly when you’re trying to learn from other industries. So if there’s been a big incident in one industry, and a big report has come out with lessons and recommendations in it, organizations in that industry might look at it and might go and challenge themselves.

But that’s relatively short-lived, I think. If you ask people in organizations, what were the main failures in Piper Alpha? What were the main failures of Barings Bank? What were the main failures in the shuttle incidents? A lot of people, including safety people, just can’t tell you what those organizational learnings would be. So not only are they not going looking for these things, but quite often that experience, that understanding, is just not available, Eric. I think it’s a big gap. I think there’s a role for human factors people and systems people to fulfill there. But it’s very difficult for an organization to fund a position whose role is to go looking for things that may or may not happen, or that might be very unlikely to happen. In these times, it’s quite challenging to resource that position in an organization.

A couple of things come to mind, because I’ve seen some organizations do quite well at learning through case studies of others. So, as a senior leadership team, looking at something like the 737 MAX and what transpired around it, looking at Challenger, looking at Texas City, or looking at Deepwater Horizon, and using these as case studies to ask, how could this happen here? And driving that reflection, because then you’re starting to force that learning out of the industry and push the idea that it could potentially happen here. The other piece I’ve seen, and you talked about the human factors piece, is some organizations that proactively, maybe every few years, run a safety culture assessment, as an example. Now, my challenge with a lot of safety culture assessments is that people will do a survey that gives you no insight into what you’re talking about. But in a robust one, you’re surveying and speaking to a lot of employees to look at what could go wrong. You also do a review of system factors. You look at a lot of the practices, the processes, the changes, the things that have occurred over the past few years.

So essentially, you’re kicking the tires of the organization on a regular basis. What I’m talking about is closer to really kicking the tires, looking at the system components as well and deepening the analysis, because the survey alone won’t be good enough.

I think you’re right. Organizations are doing surveys; they’re running focus groups. Some leaders will be doing walk-arounds, going to facilities and talking to their staff. If they’re prepared for that, if you prepare them in terms of what they should ask, that can work quite well. These are all activities and tools that we have available, but I don’t think they’re typically aimed at pulling out these deeper organizational issues, or maybe the different sources of information are not combined to give that overall view. Occasionally, organizations will bring in an independent organization to do that review for them, which can be quite interesting. But again, that takes you back to the issue of having to learn from those recommendations as well. And we have seen quite a few cases where independent contractors who’ve been asked to come in and review an organization temper their findings because they want continued employment from that company. We’ve seen that in some of the major financial events. Barings Bank is a good example, where the auditors did not see issues, or when they saw issues, did not communicate them to the board, because they didn’t want to alert the board to some of the issues that were there, which contributed to the demise of the bank.

So, there are lots of barriers and structural issues that might prevent some of the tools you suggested from working really effectively. But there are tools out there that can be used. We’re making general comments about what we’re seeing in industry; it’s not to say that there aren’t organizations doing this well. I think it would be really good to unpack those lessons and communicate them more widely, because there are pockets of good practice. I’m not saying no one’s doing anything at all here. There are pockets out there. We need to understand what they are and what is effective, and help share those more widely with other organizations that maybe are not doing this proactively.

That’s often the tricky part, because once something goes wrong, it makes front-page news. The 737 MAX makes front-page news, multiple investigations, lots of insights, lots of learnings. But Airbus, on the other hand, hasn’t had such a failure. Does that mean they’re doing all of this proactively? You don’t necessarily know, because they’re generally quieter about it. It could be pure luck, or it could be good practices. And that’s the tricky part.

It could, but it could also be… If you look at an organization that’s had a few incidents or a couple of disasters, people might think, oh, well, actually X, Y, and Z is a bad company; it’s because of them. That’s the fundamental attribution error. If someone is driving poorly, you think it’s because they’re a bad driver. Whereas if you cut someone up, you think, well, there are all these other reasons why I did that. So, we tend to attribute failures to people, as an issue with them, without thinking about all the contextual factors that influence behavior. Maybe that fundamental attribution error is something that’s important when we’re looking at these disasters, because it’s easy to say, well, they’re just a bad company, and that won’t happen to us. We’re different. We employ different people. We’ve got all these processes and systems, and it won’t happen to us. Risk blindness is an issue for us as well.

To touch briefly on Barings Bank, the same symptoms that appeared at Barings would probably have appeared in many other places, because it’s not that hard to have a rogue trader. The difference there was the size of that rogue trader’s losses, but they’re present everywhere. NAB in Australia had three rogue traders on the FX side at roughly the same time. And there are lots of other examples that don’t get reported, or get reported on page one hundred of the newspaper if you really go looking for them, because it’s never a success story. But they happen a lot more often than we think.

I think they do. I think you’re right that we pick these examples and talk about these big disasters partly because there’s so much information available on them. It does become a little unfair that we keep going back to the same disasters, but they’re the ones on which we have the most information. They’re the ones that have been investigated to the nth degree. But you’re right, there are lots of other failures going on, and not all of them become so high profile. We do know that lots of other organizations have had similar events, but, as you say, they don’t make the press for whatever reason, and they don’t become case studies on training courses for the next 30 years. You could pick Barings Bank, and there would have been several other banks with the same issues at the same time, because they had, or didn’t have, the same processes in place as Barings Bank; it just didn’t play out in the same way. Maybe they had a huge loss, but it wasn’t enough to destroy the bank, and therefore it’s less visible to everybody else.

But you’re right, we’re picking a few case studies here because these are the ones we have detail on. That’s not to say this isn’t occurring much more widely.

So, Martin, thank you very much for joining me. A really interesting series of topics, and the link that a lot of organizations fail for the same reasons. I think the really big takeaway is how we learn better from investigations, and then how we learn proactively before anything ever occurs. How do we keep that questioning attitude on an ongoing basis? Because it’s too easy to close your eyes to something and think, no, it’s okay, we’re okay. And really, how do you drive that questioning attitude within the business? So, Martin, these are really interesting topics. Obviously, your website, humanfactors101.com, is an excellent source for insights. Is that the best way for somebody to reach you to get more insights?

Yes, certainly. I write quite a lot on that website, so you can go there and have a look. There’s a lot more information on there, or you can follow me on LinkedIn. If you search for Human Factors 101, you’ll find me there on LinkedIn. Please get in touch.

Excellent.

Thank you for listening to the Safety Guru on C-suite Radio. Leave a legacy, distinguish yourself from the pack, grow your success, capture the hearts and minds of your teams, and elevate your safety. Like every successful athlete, top leaders continuously invest in their safety leadership with an expert coach to boost safety performance. Begin your journey at execsafetycoach.com. Come back in two weeks for the next episode with your host, Eric Michrowski. This podcast is powered by Propulo Consulting.

The Safety Guru with Eric Michrowski

More Episodes: https://thesafetyculture.guru/

C-Suite Radio: https://c-suitenetwork.com/radio/shows/the-safety-guru/

Powered By Propulo Consulting: https://propulo.com/

Eric Michrowski: https://ericmichrowski.com

ABOUT THE GUEST

Martin Anderson has 30 years of experience in addressing human performance issues in complex organizations. Before joining an oil and gas company in Australia as Manager of Human Factors, he played a key role in developing human factors within the UK Health & Safety Executive (HSE), leading interventions on over 150 of the UK’s most complex major hazard facilities, both onshore and offshore. He has particular interests in organizational failures, safety leadership, and investigations. Martin has contributed to the strategic direction of international associations and co-authored international guidance on a range of human factors topics.

For more information: www.humanfactors101.com.

STAY CONNECTED

RELATED EPISODE

EXECUTIVE SAFETY COACHING

Like every successful athlete, top leaders continuously invest in their Safety Leadership with an expert coach to boost safety performance.

Safety Leadership coaching has been limited, expensive, and exclusive for too long.

As part of Propulo Consulting’s subscription-based executive membership, our coaching partnership is tailored for top business executives that are motivated to improve safety leadership and commitment.
Unlock your full potential with the only Executive Safety Coaching for Ops & HSE leaders available on the market.
Explore your journey with Executive Safety Coaching at https://www.execsafetycoach.com.
Executive Safety Coaching_Propulo

Diving Deep: Navigating Organizational Learning through Storytelling with Gareth Lock

Deep Dive: Navigating Organizational Learning through Storytelling

LISTEN TO THE EPISODE: 

ABOUT THE EPISODE

Dive into another captivating conversation with us as Gareth Lock returns to The Safety Guru! Tune in as Gareth dives deep into navigating organizational learning through storytelling and discusses creating an environment of shared trust to encourage vulnerable and productive structured debriefs. Gareth’s profound insights and compelling examples will unveil the hidden layers of organizational growth. Ensure you don’t miss this insightful episode!

READ THIS EPISODE

Real leaders leave a legacy. They capture the hearts and minds of their teams. Their origin story puts the safety and well-being of their people first. Great companies ubiquitously have safe yet productive operations. For those companies, safety is an investment, not a cost, for the C-suite. It’s a real topic of daily focus. This is The Safety Guru with your host, Eric Michrowski, a globally recognized ops and safety guru, public speaker, and author. Are you ready to leave a safety legacy? Your legacy success story begins now.

Hi, and welcome to The Safety Guru. Today I’m very excited to have back on our show Gareth Lock from The Human Diver. He’s an author, and he brings a lot of experience from his 25 years in the Royal Air Force, to oil and gas, to many different industries, including diving. But my favorite is his branding around Counter-errorism. So, Gareth, welcome back to the show. Tell me a little bit first about Counter-errorism and your journey into safety and diving.

Eric, thanks very much for inviting me back. As we know from last time, there’s just so much to talk about with this stuff, and I’m really quite passionate about sharing my knowledge and that journey. So, the whole piece about Counter-errorism in diving is just recognizing that we’re all fallible. My first idea for the diving business was the fallible diver. And people were like, that’s really negative. We know that humans are fallible, so why not make it The Human Diver? It’s like, yeah, okay. And so it’s both sides of the bowtie that some of your listeners might know about: the prevention piece and then the mitigation afterward, recognizing that human error is normal, the first principle of human and organizational performance. I’ve got a really broad operational background in aviation, research and development, flight trials, and procurement systems engineering. I left the Air Force in February ’15, set up my own business, and have worked in oil and gas, health care, and with software teams. But my passion is really about trying to bring this stuff into, predominantly, the sports diving space, though now I’m starting to work with military, scientific, and commercial dive teams as well, because people are people.

We’re all wired the same way, and we all behave broadly the same way, so the knowledge is easily transportable. As long as you have an open mind and say, you know what, those are the context and the behaviors that lead to error outcomes, let’s see how we can bridge that into whatever space I’m working in.

Excellent. And then today, a topic we’re going to touch on is organizational learning, something that is a very, very powerful and important concept that is really at the crux of safety, but more specifically around the power of storytelling when it pertains to learning. So, tell me a little bit about some of the work you’ve done around learning and listening to stories.

Yeah. So, one of the challenges in any environment is getting lessons transferred from one person to another, and there’s a difference as well between lessons identified and lessons learned. People will experience something that’s gone wrong. They then need to take a little bit of time to reflect and unpack what’s just happened. And there’s almost an altruistic need to share that story beyond yourself. Some organizations or domains mandate or regulate reporting. In aviation, there is an obligation that says: you had an event, you are to report. Now, wouldn’t it be nice if we could get people to share those stories voluntarily, to get them out there? For that to happen, we’ve got to have a psychologically safe environment, so we know that we can make those mistakes, but we’ve also got to have a just culture that recognizes that we’re all fallible. And there is this gray line that sits between acceptable and unacceptable behavior. My real interest in human factors in diving came from 2005, when I had a near miss diving, a close call. I recovered from the situation. I got back to the UK, and I said, well, how do I report this?

Because that was my military aviation background: I had a near miss, so let’s share it. I found it really difficult to do that. So, between 2005 and now, it’s really been about trying to create an environment where people can share and tell stories. I’m doing a master’s degree at Lund University, and one of the things I’m looking at there is where people share stories. What are the barriers? What are the enablers? Who will they share with? Why won’t they share? As I’ve gone through the literature, there are a couple of reasons. Organizations would like incident stories to be shared because they believe that they, as an organization, can learn and improve. But for that to happen, the person who’s been involved in the story has to get some value from doing that. Now, that value could be internal: we unpack it, a cathartic approach where we sit there and go, wow, okay, that was close. What happened? What was the context? What led to that? Because actually, I don’t want that to happen again. But that’s potentially counter to what an organization wants, where they’re looking at much bigger things, or often they’re counting stories and not actually listening to or reading the narratives that are there.

And so, there’s a conflict between those two purposes of storytelling following incidents. And that work from Sanne in 2008 looked at frontline railway operators and trackside engineers: they tell stories to keep themselves and their buddies safe. My research in the diving space has shown that people share stories in a close, trusted group because they don’t want them to go further. Even though organizations talk about having psychological safety or a just culture in place, there’s often a fear that people will be ridiculed for being stupid. And if we can’t recognize and accept fallibility, then the stories that get shared are not complete. So, it’s a huge opportunity, but we’ve got to create almost a theater in which to tell those stories.

That is a very interesting point. And when you talk about stories, there was some research I was reading recently from Harvard around retention: we retain stories considerably better than statistics. The difference at the end of the day, in terms of what you remember, is to the tune of 33% versus 73%. Substantial differences. So how do you create that environment? How do you create this setting? Because what you describe in diving, to me, sounds like a group of buddies sharing together, maybe after work. So it’s more social learning, but it’s not necessarily embedded in the organization.

Absolutely. So how do you do it? You create an environment where people can share, where you have a structure for a debrief. Some of the original work from Gary Klein with firefighters looked at how they make decisions in uncertain environments, under time pressure, with incomplete information. What he noticed was that they would finish their shift, clean up their gear, and then go and grab a brew and talk about what they heard, what they smelled, what they felt, what was going through their minds. And that was as a team. So what was happening is they were sharing and creating shared mental models within their teams, and that then helped them make decisions in uncertainty. It also helps pass on tacit knowledge. So, the environment is critical. There has to be a level of trust, and you’ve got to have a norm of doing a debrief. That’s what I’ve been trying to bring into the diving space, having a structure for a debrief, because often people don’t know how to tell a story. And that’s, again, what’s come out of my research: novice divers especially are lacking in two things.

One is they don’t know how to tell a learning story to get a point across. The other is that they often don’t know what they don’t know. It’s that bit where they don’t know they’ve had a near miss, because they haven’t yet got a concept of what right, wrong, good, or bad looks like. As a consequence, they’re not even looking for where things are going wrong. When we get to, I’m going to say, the more mature area of the diving space, we talk about instructors. Now we’ve got credibility, we’ve got reputation, we’ve got litigation involved. And in that sense, instructors won’t tell their near-miss stories, because there’s this fear of: oh, look, there’s an important instructor; hang on a minute, I’m supposed to be doing some training with him, and he’s talking about mistakes that have happened. It’s like, yeah, they’re human, too. That’s no different from surgeons. Society holds surgeons on a pedestal of excellence. Or police officers operating in dynamic, uncertain environments: it’s really difficult to tell a multi-actor truthful story, because people will play the news clips back, or the body cam footage, and go, hey, look, you missed that, and you missed that, because they don’t understand human fallibility.

So, this bit about how you create an environment: it’s leaders, peers, role models, and you can start in small groups and build shared trust or psychological safety. But for a start, you’ve got to know when something has gone wrong. I recently wrote about near misses: were you lucky, or were you good? Often, near misses are treated as successes rather than failures, because we got a good outcome even though we were really close. And so we just move on, pat on the back, off you go. It takes a very different mindset to sit there and ask that question: were we lucky, or were we good? Oh, yeah, we were good. All right, what did we do that we can replicate the next time and the time after that? Or: actually, we were pretty lucky there. All right, so let’s look at what we missed, and build those stories, and then share them. And the problem with stories is that they get modified and changed because of the way our memory works. We embellish certain factors, and we hide others, because we don’t have that psychological safety, that security, to show our vulnerabilities.

Very interesting. When you talk about storytelling and debriefing, a scenario that comes to mind is the approach the US Army has used around after-action reviews, which were originally intended to be essentially storytelling from multiple perspectives: walking through what we went through, whether the outcome was good or bad, and really trying to look at what we planned and where it differed from what we expected. Is that similar to what you’re describing?

Yeah, totally. Teams need to get into the habit of running a debrief. So often, debriefs or after-action reviews are only run when something has gone wrong. Now, if you don't perceive that something has gone wrong, why are we running this debrief? It loses its value, and people get out of the habit of doing it. Whereas actually, we can frame the debrief differently. We can put a link in the show notes to a debriefing guide that I use, and it follows the word DEBRIEF. The I in there is the internal learning: what did I do well and why, and what do I need to improve on and how am I going to do it? The E is the external learning, the team: what did the team do well and why, and what does the team need to improve and how are they going to do it? The why and the how questions are the most important, because we can make an observation about something that went well or needs improving, but it takes a lot of thinking to say why it went well.

Or how are we going to make that improvement? And then the final part, the F of the debriefing framework, is about fix, file, or follow-up. So, you've done an activity: you planned it, you briefed it, you've done it, you've debriefed it. Now that you've identified some lessons, what are you going to do with them? And that's the difference between lessons learned and lessons identified. Many organizations have got loads of lessons identified, but far fewer lessons learned. A lesson learned is where you've looked at something, put something in place, and measured its improvement. Or you realized that the intervention didn't work, and so you've learned that it didn't work. So, the difference between lessons learned and lessons identified is: did a change happen afterward? And that's a huge piece.

It is, because a lot of times, like you said, organizations learn the same thing over and over because the change is never embedded. It's just something in a policy document that says thou shalt do it this way, which may or may not solve the problem, and may or may not be operationalized.

Absolutely. And that takes strong leadership. I was recently involved in a major review, and the accountable individual, the duty holder, wouldn't sign off the actions or recommendations as complete until they'd actually been completed and put in place. Because one of the things we picked up in the review was that recommendations made in previous reviews had never actually been fulfilled. They were not direct contributors to the event, but they did make you recognize: hang on a minute, we're not very good at learning here, because we capture this stuff and then don't fix the things that are faulty or have failed.

This episode of The Safety Guru podcast is brought to you by Propulo Consulting, the leading safety and safety culture advisory firm. Whether you are looking to assess your safety culture, develop strategies to level up your safety performance, introduce human performance capabilities, re-energize your BBS program, enhance supervisory safety capabilities, or introduce unique safety leadership training and talent solutions, Propulo has you covered. Visit us at propulo.com.

I love your storytelling approach to learning. How do you disseminate it across an organization? Divers who get together can do that casually, but how do you make sure the same insight gets cascaded to groups that can't be there physically?

As a direct example, I put together a documentary called If Only, which looks at a diving fatality through the lens of human factors and just culture. I'll send you the link for that: thehumandiver.com/ifonly. I was really fortunate to get involved with the widow of the diver and the three surviving members of the dive team. We flew out to Hawaii, we did face-to-camera interviews, we re-enacted the dive, and we shot about five and a half hours of video. That was cut down to about 25 minutes, and then I added some other material. The editor said, look, you're going to have to make it shorter than 20 minutes, and I'm like, what do you take out? I don't know. So, I created this 34-minute documentary, which has been downloaded thousands of times. And that then goes out. I know that people in the non-diving space have looked at it and recognized it, because the failures within the system are multiple. Often it's about psychological safety, decisions, the inability to speak up, drift, equipment not being set up correctly, and those carry across many other domains as well.

So, to me, it's the ability to share engaging, emotional, sometimes really quite powerful stories to get the point across. That's one way. Then there are the blogs that I write. I often start a blog with a story, because when people open it up, they go, what's going to happen next? You start off with, the diver was on such and such, and the reader goes, right, what's happening next? You've got to put a hook in there, and then you've got to stitch the theory into the story so that it becomes a learning lesson and they can relate to the individual. There is a really powerful bias of distancing through differencing, and it sits not just at an individual level but at an organizational level as well, where we will look at somebody or some organization and go, they're different to us; we wouldn't make that mistake. And you sit there and go, yes, you would. From the diving side, I put together Under Pressure, the book that I published, and there's another one called Close Calls, which is a similar idea. Mine has theory woven in and out; Close Calls is just stories from names across the industry.

And people like to read them. The hard part is, does it actually change people's behavior? Because ultimately, that's what we want to do: get people to think differently and understand the context they were in. Not to turn around and say, I wouldn't do that, based on the outcome, because by the outcome it's too late. What we're trying to do is spot the context developing and sit there and go, oh, I recognize this, and I can see where the trajectory is heading. But that's really hard to get across, even when you've got well-known stories. There's a paper I read recently from Dillon and Tinsley, or it might be just Dillon on their own, about using lessons from Challenger to get the ideas across. What they did was create a scenario of an aircraft that needed to fly some spares to a remote location, but the temperature was low, and the oil seals might leak on the engines. If the oil seals failed, they'd need to shut down the engine, they'd probably ditch, and the crew might not survive the ditching. What was really interesting was that even though the story was told as if it was Challenger, people didn't recognize it was Challenger.

And still, about 70-odd percent of people went, yeah, we'll launch. Off you go. So even when you're given a narrative, we often can't make the connection, because that's just the way our brains are wired, unfortunately. It has to be really visceral. It has to be, that's me, and I would do that.

Interesting. And I've seen this many times in organizations. When you talk about small groups sharing their mistakes, part of it is there's camaraderie; people know each other. Is there a way you've seen to extend this so that people don't say, that won't happen to me, I wouldn't make that silly mistake? To really overcome that element and recognize that, yes, as humans, we're all bound to make those mistakes.

I'd probably point to the US Forest Service with the Lessons Learned Center that they've got. And I think the important bit is to get away from the individual and their erroneous performance and look at the context and the error-producing conditions that are there. That's what I was referring to earlier: understanding what goes into a good learning story is understanding what set somebody up for failure in the scenario they were in. Because, by definition, if we'd known what the outcome of the event would be, we would have stopped it.

Sure.

So there's this bit about, right, think about all those bad things that are going to happen. Yeah, well, how am I going to spot them? I don't know the significance of those. What we have to do instead is ask, what can we tell about the developing situation that I will encounter? And then sit there and go, this is the system or the situation changing. Okay, that's a flag. Not, I won't make that mistake, but, I'm now in a situation where I'm more likely to make a mistake. Can I raise my game? Is this a flag that says, look out?

Interesting. So, move it away from the error itself to the context of the situation people are in, because then you're more likely to relate, saying, that set of circumstances could happen to me as well.

Yeah, totally. And so, aviation moved from cockpit resource management to crew resource management, and now to threat and error management. There's an expectation that the aircrew is competent to do what they need to do; we don't need to train them more and more on that. The threat and error management mindset is: I'm potentially going into a busy airfield. The wind is marginal. Do I set up the opposite runway's ILS or approach systems, and the other frequencies? The weather forecast has thunderstorms in the area, or whatever it is. It's a potentially confusing runway. Let's think about how we set ourselves up for success, not failure, because generally that's about sharing stories where, you know what, the situation got away from people. So, can we get ahead of things and provide that flag that says, whoa, that's enough? In the majority of high-risk industries, we have something called stop work authority. My simplistic view is that often that's just the organization saying, I'm going to give you a card; if you think it's unsafe, then hold this card up and stop the job. But most people don't know that it's all going horribly wrong until it's gone wrong.

And then the organization says, why didn't you stop the job? Because you could see it happening. And there are a whole bunch of socio-technical reasons why people find it hard to say stop, because of the goals that are in play. So, if we can start to say, let's look at the conditions around us, then it's actually easier to raise a flag.

Yeah. And it also helps people understand when they're entering dangerous territory. Take your example of a confusing runway. There have been some airports where more than one flight has almost landed not on the runway but on top of another airplane that was taxiing. But you know which airports those are. So, you could be on high alert if you know, okay, I'm approaching San Francisco, which I believe is one that has come up a few times, and say, okay, on this approach, here's what I need to pay extra attention to.

Yes. And we've got a limited capacity to pay attention. So, in that situation that says, actually, here's the high-threat element, I'm now going to, not quite ignore the other things, but point my attention there. One of the things I try to get across in my training is that we've got a limited capacity to pay attention. It's not that people weren't paying attention, because often the response is, pay more attention. We can't pay more attention. What we can do is focus it somewhere else. So, what we're trying to work out is, what's the threat that we're encountering? And that comes from understanding the near misses that are out there and the context in which they were encountered.

So, a rich topic. To me, organizational learning is probably one of the most challenging parts of safety that we keep talking about, and the hardest one to do. But I love your angle: sharing stories, trying to learn on a regular, continuous basis, just so that people reflect and think through the stories; and then disseminating those stories through scenarios focused on the context as opposed to the individual and the error that they made. I think these are very powerful concepts that hopefully help organizations move from learning the same thing over and over to learning and actually embedding that change.

Totally. And what I would say from my experience as well is that people are more likely to share a context-rich story than a closed, narrative story focused on the individual. So, get more context, more of the system, and more actors in there. There's a paper out there showing that when an incident report has multiple narratives, people are more likely to look at systemic causes than when there is a single narrative, which is a synthesis by the investigator, who will have their own perspective. And often that's about compliance and noncompliance, so people will look at it and produce recommendations focused on fixing the person. Whereas if you have multiple actors, you can hear the conflict and the different ideas. And when you've got six actors involved in an incident, expect six stories. It's not because they're lying; it's because they've got different perceptions of what happened. So, if you've got the opportunity to share a multi-actor story, that's the way to go.

So soon, we'll be writing Hollywood scripts from those stories.

Well, we often have multiple actors in a story in a film.

But there's some truth to the way you share stories, because even in Hollywood, they say there are seven story themes behind every movie that sells. Rags to riches is one example. It's a narrative that we tend to listen to; the personas and everything else get us to associate with it and then remember the story.

Totally. And there's a paper from Drew Rae that talks about the different types of safety stories and how you share them. Do you tell the outcome first and then build up the narrative? Do you tell one narrative where people jump to conclusions, and then tell the context-rich story, which brings the learning point out? So, it goes back to the purpose of the story, the audience you're trying to tell it to, and the learning point you're trying to get across.

Excellent. Well, Gareth, thank you very much for coming back to our show. Appreciate you sharing some of your thoughts about learning, organizational learning, and storytelling. I think it's very powerful. Lots of ideas to take forward. Thank you.

Brilliant. Thank you very much, Eric. I loved being on it again. Thank you.

Thank you for listening to the Safety Guru on C-suite Radio. Leave a legacy. Distinguish yourself from the pack and grow your success. Capture the hearts and minds of your teams. Elevate your safety. Like every successful athlete, top leaders continuously invest in their safety leadership with an expert coach to boost safety performance. Begin your journey at execsafetycoach.com. Come back in two weeks for the next episode with your host, Eric Michrowski. This podcast is powered by Propulo Consulting.

The Safety Guru with Eric Michrowski

More Episodes: https://thesafetyculture.guru/

C-Suite Radio: https://c-suitenetwork.com/radio/shows/the-safety-guru/

Powered By Propulo Consulting: https://propulo.com/

Eric Michrowski: https://ericmichrowski.com

ABOUT THE GUEST

Gareth Lock is the founder of The Human Diver, an organization set up to deliver education and research into the role and benefit of applying human factors, non-technical skills, psychological safety, and ‘just culture’ in sports, military, and scientific diving. He has published the book ‘Under Pressure’ and produced the documentary ‘If Only…,’ both focused on improving diving safety and performance by looking at incidents through the lens of human factors. While primarily focused on diving, he also works in other high-risk, high-uncertainty domains such as healthcare, oil & gas, maritime, and software. He is currently undertaking an MSc in HF and System Safety at Lund University, where he is looking at the power (and limitations) of storytelling to improve learning.

For more information: https://www.thehumandiver.com/

The Debrief Guide: www.thehumandiver.com/debrief

If Only: www.thehumandiver.com/ifonly

Sanne (Santa in the transcript) – Incident reporting or storytelling? Competing schemes in a safety-critical and hazardous work setting – http://dx.doi.org/10.1016/j.ssci.2007.06.024

Klein and firefighters – Naturalistic Decision Making http://journals.sagepub.com/doi/10.1518/001872008X288385
