
From the Mines to the Heart: Advocating for Safer Tomorrows with Helen Fitzroy




As the holiday season reminds us of what truly matters, we are honored to feature Helen Fitzroy on The Safety Guru as she shares a moving message that will carry us through the holidays and beyond. Her husband, Steve, was killed in an underground mining incident in 1991. Her story isn’t just one of personal tragedy but a call to action for all of us. Tune in as Helen advocates for a safer tomorrow with her unwavering commitment to safety, dedicated to ensuring that no other family has to endure what she went through.


Real leaders leave a legacy. They capture the hearts and minds of their teams. Their origin story puts the safety and well-being of their people first. Great companies ubiquitously have safe yet productive operations. For those companies, safety is an investment, not a cost, for the C-suite. It’s a real topic of daily focus. This is The Safety Guru with your host, Eric Michrowski, a globally recognized ops and safety guru, public speaker, and author. Are you ready to leave a safety legacy? Your legacy success story begins now.

Hi, and welcome to The Safety Guru. Today, I’m very excited to have with me Helen Fitzroy. She’s a safety advocate, an author, and a writer, as well as a miner’s widow. I’m really happy to have you join us, Helen. You’ve got an incredible story to share regarding the positive contribution you’ve made to safety. But I think maybe let’s start first with Steve’s story.

Thanks very much, Eric, and thanks for having me on the show. Thirty-two years ago, my husband, Steve, went to work underground. He didn’t come home. I was left with three little kids under the age of seven and basically stuck with a new title, Widow, which didn’t impress me very much. Then the whole journey began of how do I traverse this? A couple of years before Steve’s death, one of his really good mates, who was also a very experienced miner who worked at the same mine, refused to go to work on a particular shift because the supervisor had asked him to work under unsupported ground. They sent a young, inexperienced 21-year-old in there instead, and tragically, he was killed. Just five months before Steve’s death, another very good mate of his, who was also an experienced miner working at the same mine, fell down a ladderway underground and was seriously injured. He had compound fractures in both of his legs, along with some external injuries. He had to be flown out to Perth by the flying doctor, had two little kids under the age of three, and so he spent 12 months up in Perth having intensive rehab.

Wow. Leading up to all of that, there were some concerns, and Steve used to often come home and talk to me about them. I suggested, how about we take a couple of weeks off and go up to Perth to see how your mate’s going? We did. We shot off to Perth for a couple of weeks and caught up with his mate. Steve was back at work exactly a week when he was killed. That’s when it all started.

He was raising concerns with you. He saw those trends, and I think this is often the case in situations like that: there are warning signs. How was the organization receiving this feedback? Because they’d had a fatality and serious injuries in a very short period of time.

It’s probably worth also mentioning where we were living: a mining town, and not a very big one. In the goldfields, there are a whole lot of little mining towns that probably had a population of 3,000 to 4,000 people max. He was considered, in that particular mining town, probably the most experienced, the best, and the most safety-conscious miner. He would consistently come home and tell me, because he would go and voice his concerns to the management, and they would say to him, What’s the matter, Fitzy? Aren’t you earning enough? They basically just derided him. There was no… It was a joke. That was frustrating. My advocacy is really based on this: if there’d been somebody out there who was going to stand up, assert themselves, and tell people a story about what can happen, perhaps he might have had second thoughts about going there anymore and gone somewhere else. The thing is, in terms of when you asked me about management and what their views were, I was talking to a mine inspector a few years later. He had come out from South Africa, and he’d worked in the adjoining town to where we were, about 40 minutes down the road, another little mining town about the same size.

He said that when he arrived there (and he had extensive experience in South Africa, even though he was an English guy), the company, which was the same company managing the mine that Steve worked in, would budget for seven fatalities a year. Seven? Just there, seven. My goodness. He said they generally achieved their target. That’s horrible. I know back then, fatalities were just a normal part of doing business. It’s cheaper, really, to kill somebody at work than it is to permanently disable them, because you know what you’re dealing with. It’s cut and dried. Whereas with a permanent disability, it could be, well, how long is this going to go on for? There’s that uncertainty about what the cost may end up being. Yeah, that was the culture then.

One of the things you advocate about safety is to remember the people we come home for. Tell me a little bit more about some of the messages you share.

Well, since Steve’s death, there have been 506 more fatalities in mining in Australia. That’s 471 kids who’ve lost their dad, 100 widows. Now, that doesn’t take into account the parents, the siblings, the mates. It also doesn’t take into account those who’ve lost their lives through a work-related illness or disease. When I’ve looked at the stats for Canada, you’re not far behind. I tallied, and it may not be totally correct because I don’t have all the stats, but I think it was about 478 in the same time frame in Canada. That’s disgusting. After Steve’s death, it was probably around about ten years later when I first started traveling out to sites, talking to people. I was inundated with phone calls and messages. On average, I was getting about six a week, people asking questions: How long will that report take? How do I find this out? From families and workers. I started to make contact with agencies to say, well, what do you offer? How can you help? I can see the data and the pamphlets you’ve written, but they’re all doubling up, or the information is wrong.

I met with a lot of the regulatory bodies and agencies to try and encourage them to establish a support network for families following a situation like this. They didn’t think it was necessary, so I did it myself in the end, and I launched a not-for-profit with the backing of a fairly big mining company here, BHP. But my conditions were that it had to be totally independent of any particular company, political party, or union. It couldn’t have any vested interests. That was established in 2010, and it’s still going strong. I’m not as involved as I once was; they’re doing fine without me. It’s good to know that there’s now somewhere people can go to seek assistance. It might be financial, it might be emotional, it might be a whole range of practical assistance to help them through that process, because there was nothing when Steve was killed.

Absolutely nothing. The company didn’t step up either on that.

No, they didn’t. That wasn’t unusual back then. My husband was a member of the local union, and they were ill-informed as well. Everybody’s performance was inadequate. I think things have come a long way since then, though, and I think they’re a lot more tuned in now, and people expect more. Yeah, we had to muddle our way through. I had to find my way through by myself, really.

In an environment where they were budgeting for seven fatalities, it was…

A process. It was something that was accepted. That’s horrible.

Then to put up with the five-year legal battle, where there was just, and I’m not just blaming the company, I’m talking about the insurers and the lawyers, constant delaying and ridiculous ploys that they would use to try and deter me: go away, just go away, will you? I was determined not to do that. I was determined to stick with it. I felt I owed Steve that, to get to the bottom of it, and eventually, I did. But it was a long battle, and that still happens today. I’m still in touch with many families who are still going through that process. It’s a struggle.

You share the message with the people that you speak to, but you also have a message for leaders.

Yeah, I do.

Tell me a little bit about your message for leaders in this case.

Well, I understand and appreciate, as a leader, that there’s a lot of significant data that crosses your desk on a daily basis, whether it’s budget issues, production targets, deadlines, or staffing. I accept the significance and importance of all that information; for a viable business, that has to happen. But the point that I’d like to make is that behind every single decision they’re making, whatever it may be, there’s generally a human being attached who may or may not be impacted in a negative way. I would implore them all to consider carefully every decision that they make to ensure that there aren’t going to be any unforeseen consequences. It won’t be them, but somebody else might be impacted negatively by the decision that they make.

What does that translate to? Ultimately, I agree, it’s understanding that there’s a person behind the paper, behind the decision. The further away you are from the decision-making, from the site, from the work, the easier it is to separate yourself from your actions. When an event happens, it becomes very easy to disassociate yourself because you don’t want to have to carry the responsibility. You push that burden to somebody else.

Absolutely. You’ve nailed it because that’s exactly what happens. If you’re sitting in an ivory tower in the middle of the CBD somewhere, making decisions and looking at that promotion that may come next month if you produce the goods, of course the pressure is going to be on you to perform and to do things that might be impressive at a board level. But at the front line, at the coal face, there could be somebody who’s going to be impacted by that decision that you haven’t considered. I suppose it’s just about being a little bit more aware of how a decision you make while sitting in the comfort of your cushy office might impact somebody down the track. It may not always be that easy to determine, particularly when you’re looking at production targets and things like that, where workers are often rewarded with bonuses if they reach particular targets. What happens? If you’re going to encourage a bonus mentality, you’re going to encourage people to take risks, to do things that they otherwise wouldn’t. Those sorts of cultural norms, I think, can create issues as well.

Absolutely. When you mentioned this, it reminded me of a guest I had on the podcast a few months back. He talked about one of the complexities in safety: when you save a penny on every dollar, it will have a financial consequence but probably won’t have a safety impact.

But that second penny? Probably not either. Then there’s the temptation of, well, what about the third, the fourth, the fifth penny? But at some point, something breaks, and you never really know which penny it was. It’s really understanding the chain of causality. The other element he brought up was that the closer you are, the more proximity you have to the site and to the people doing the work, the better the decisions you make. The more disconnected you are, sitting in an ivory tower, no pictures of the team members doing the work, never having been there, the more it becomes a transactional balance-sheet decision.

Yeah. I think with that also comes an added pressure. It can be quite problematic for contractors. You have the client engaging contractors to come in and do a lot of the work for them, and most of the time it’s the coal-face, front-line, hard stuff that they’re doing. They have to ensure that they meet their budget constraints, and they also want the next tender. The pressure is always on them, probably more so than on the client’s employees, to perform and produce the goods, because otherwise there’ll be no tender.

This episode of The Safety Guru Podcast is brought to you by Propulo Consulting, the leading safety and safety culture advisory firm. Whether you are looking to assess your safety culture, develop strategies to level up your safety performance, introduce human performance capabilities, re-energize your BBS program, enhance supervisory safety capabilities, or introduce unique safety leadership training and talent solutions, Propulo has you covered. Visit us at

We talked about the person behind the paper, and that every decision matters. You touched on before that items were raised; there were signs. Organizations need to be looking for those signs and symptoms, not dismissing them as a squeaky wheel, but trying to understand. Sometimes you may have somebody who is a squeaky wheel, who complains about everything, and it’s, oh yeah, they’re at it again, but how do I really see through those? There are a lot of others who are not complaining and who surface an issue. And even the person who complains will sometimes raise some real, legitimate pieces. How can people triage through all of this to take action? Because this was clearly a case where there were enough signs and symptoms to say action was needed.

Well, I think it comes back to good communication and leadership. Good communication and leadership mean trust and respect. In the hundreds of sites I’ve been to over the years, I could virtually walk into a muster room where they’re doing their training, watch the crews walk in, and predict who the good leaders are just by the body language of the crew as they walk in. That says a lot to me. It’s played out numerous times where you can just tell by the way the guys are communicating with one another, the way they’re walking, the way they’re… You can read the play. I think if we have good, supportive, respectful leaders who can communicate with every crew member, no matter what their little idiosyncrasies are, then morale is going to be good. If morale is good, you’re going to be productive and safe. To me, it all comes down to carefully selecting the leaders that you choose. Look, leadership starts at the top. They say the fish stinks from the head down. If you haven’t got a supportive leadership team at the top, you’re never going to get it at the…

Ground level. But even if you have a supportive leadership team at the top, it doesn’t always translate to the ground level, because it has to be embedded in the selection process. It has to be that if I find that you’re not showing up this way, I do something about it, and I act on it fast, because we have that dialog on a regular basis in terms of who is a good safety leader.

Yeah, and you’re dead right. I’ve been to numerous sites run by the same company, and the culture is different at every one. It’s not just the top team. It comes down to who’s running the show here: what attitude do they have towards safety, towards the workers, and towards the morale of the team? What do they rate as significant for the guys on site? It was really mind-boggling to me that I could go to five different sites, all run by the same company, yet the safety culture was different at every one of them.

I think it’s an important point you bring up because I often advocate that, yes, you may have one culture, but there can be a lot of subcultures that exist. Not wrapping your head around these subcultures can be a real blind spot, because you may be 90% good, but you may have a bad site. I remember, a couple of years back, I had somebody on this podcast who worked for an organization that he acknowledged had a very positive safety culture. But he raised an issue. He was in a very small, remote rural location; it was a utility. When he raised a concern, one that later proved out in a serious injury, he was told both by the union leader and the local management, are you a man or a mouse? In other words, go do the thing, don’t complain. And literally, shortly thereafter, he got seriously injured. The organization as a whole was good, but obviously, there were pockets of leadership in the union and management that shouldn’t have been there. I think what you’re bringing up is really this element of: you’ve got to know, and you’ve got to act on those differences.

That’s hard. It’s probably inevitable that you’re going to get those pockets everywhere. There are some who just slip through the net, and they’re out there, the macho men… I’ve seen them. I know they’re out there. You’d just like to think there’s someone a little bit higher than them who’s going to pull them into line every now and again. But it’s a sad reality.

I know when we first connected, you touched on a theme that is very near and dear to me, which is the difference between safety as a core value versus safety as a priority. There is a clear difference. Some speak of it as a priority. Some talk about it as a value. Tell me a little bit about what that means and the importance of that.

Well, it started to evolve way back when I first started traveling out to sites. In the first couple of years, I went to every jurisdiction in Australia, and it didn’t seem to matter where I went: somebody, usually within the management team or a supervisor, would come up to me in conversation and say something along the lines of, we make safety our number one priority here. Now, with all due respect, and this is just my personal opinion, that’s just bullshit. Priorities always get shifted. If you make something a priority, you’ve given it a shelf life in my eyes. It can only be a priority until something more important comes along. That’s the nature of the world we now live in. That’s why it has to be a value. It has to be embedded, endemic, and intrinsic to every single thing that you do. You can’t just switch it off and on when you’ve got time, or when someone’s watching, or when you’ve got the resources. You take it home with you, like all the things in your life that you value. I think we need to encourage that from the top down, because we want to ensure that we have a genuine, consistent commitment from every single leader in the organization to ensure every single person on that site goes home safely.

Actions speak louder than words.

But I think it links back to what you shared before: if people are raising concerns, raising issues, and safety is a value that’s really understood that way, then people wouldn’t close their eyes to it or neglect it; understanding it would be really core.

That’s right. That’s right. The quote that I came up with after numerous encounters like that was: if safety were a core value in my workplace, there’d be no need to prioritize it. You hear people say it over and over and over again. I still hear it when I go out to sites. Look, safety is our number one priority here. Well, look, I know you probably mean well, but just rethink that, will you? Because you have to be realistic, and you’ve got to do it a different way. It can’t work that way; priorities inevitably get shifted, and so I’d prefer that they rephrase it.

But I think the consequences go well beyond rephrasing. It’s also how people show up. Because I’ve seen it in organizations where safety is the number one priority, and then they lay out the strategic imperatives for the next five years, and safety is not on the chart, and somebody raises their hand and says, shouldn’t safety be there? They’re like, oh, right. Because it’s not a dialog at the C-suite, it’s not a value. It’s not something that people are evaluated on. It’s not reinforced day in and day out, and so it gets forgotten.

You’re right. One of the really interesting things that I’ve noticed over the years in the media online is that when there’s a fatality, the company might come out and report that there’s been an incident, and tragically, somebody’s life has been taken, and we’re supporting the family, we’re supporting our colleagues, and so on. Then the last sentence will usually be the daily share price. Now, I have real issues with that being in the same article. Whether that’s the fault of the journalist who’s throwing it together or whatever, it seems to be a consistent pattern that I find quite offensive: you’re talking about the welfare of somebody who’s gone through a tragic experience, or the loss of a life, and then at the bottom, you’ve got the share price. The two don’t go together, in my view, and never will. Right.

The last topic I’d like to touch on is boom versus bust. Mining is probably more extreme than a lot of other industries. What’s the impact of boom versus bust in mining and safety?

Well, I guess back in the mid-2000s here in Australia, and I don’t know whether this was a global thing, but definitely in Australia, there was a boom. Every company was scrambling for more employees. They wanted to get that stuff out of the ground as quickly as possible. It got to the stage where they were employing just about anybody. One supervisor that I spoke to out on a site in the goldfields said to me, basically, all you need to get a job in the mines now is to be standing vertically and breathing. That was how it was. He said he had a busload of young guys that he’d picked up from the airport, and he asked one of them, What’s your job? What are you coming out here to do? He said, oh, I’m going to drive a truck. This is an underground mine. Have you ever driven a truck? Have you ever been underground? He said, how the hell do I manage and supervise these young guys? That was the circumstance in the boom, and I saw it firsthand. Then, around 2015, there was a downturn. Actually, throughout that mid-2000s boom period, in five years, we had 101 fatalities in the industry.

That indicates something to me: if you look at a graph, you can see the spike. Then, moving on a decade to 2015, there was a downturn, and people were getting laid off. The remaining employees were expected to wear two hats and do the same job. The pressure was on: we still need to get this stuff out of the ground, but we’re going to have to do it more economically, without as many people. Then you start getting people taking shortcuts, and people’s morale was low. The same old pattern comes back again: increasing incidents and increasing fatalities as well. It’d be just really nice if they could find an even keel instead… But I don’t think that’s how the industry works.

It’s hard because there are definitely peaks and valleys, and mining is probably one of the top peaks-and-valleys industries. Definitely, yeah. The element, though, that I have definitely seen in mining is that in the valleys, when the economy is not strong, sites get shut down and locations get shut down. I’ve seen it where the narrative started changing: safety is not physical safety; it’s putting food on my family’s table. They associate the mines that were shut down, the ones that weren’t as successful, as maybe the safer mines but the less productive mines. Then they start rewiring that safety actually gets in the way of my personal safety, which is putting food on my family’s table. That becomes very dangerous. But I’ve also seen other organizations that were… Kola was an example where there was an end date, either a mine site or a generation site. Things continued very well because the leader was really focused on: until the last day, we will be safe. Part of it is also the choice of knowing that even if we won’t be here forever, how do I lead in that context?

Yeah, that’s right. That comes back to leadership and the culture that they establish and set, where everybody feels comfortable and has buy-in. Because if you don’t get buy-in from the employees who are there on-site, you can spout what you like; if they don’t feel that they can trust or believe what you’re saying, that’s where actions speak louder than words. If you’re demonstrating that that’s your commitment, then you will get buy-in. Too often, the guys on site roll their eyes: here we go again. That tells you a lot about the culture that’s established there. I think what you were demonstrating with your example is what every company should aspire to.

There are ways to hire maybe in advance of a boom—you can’t perfectly time it—but you’re not desperate at the last minute to take anybody. There are ways to recruit higher-quality talent. There are ways to invest in better training if you know there are going to be gaps because of who you’re able to get. There are mitigations to a lot of these elements, but it’s just being aware of it and recognizing it because, in both cases, it can have very negative effects.

Yeah, for sure. The other issue, too, and I refer back to the boom here in the mid-2000s, where you could walk off the street and get a job: a lot of these were young kids, really, late teens, early 20s, who say, yeah, I want to get in there, I want to earn some good, serious money. But if something happens to them, there’s no recourse for the families, Mom and Dad at home. They can’t sue the company. You can have a common law claim, but there’s no payment made to families unless you’re a dependent. For young, single guys who don’t have any dependents, which most of them don’t, there’s no comeback. It’s advantageous to employ young, single guys or girls because there’s no litigation forthcoming other than from the regulator, who might decide that your practices weren’t any good. But as far as the loved ones go, nothing. There have been numerous examples among families that I’ve spoken to. One instance was a family from Broken Hill; the dad worked 30 years in the mines underground, and his son was killed in WA.

The company was fined $50,000. Now, this is a big Australian company that everybody globally has heard of. The family were disgusted and totally offended that their son’s life was worth $50,000. They didn’t even get that money; it just went into the coffers of the state regulator. It’s an insult to think that, with all of these issues found to be so inadequate where he was working, the fine was $50,000. There are numerous similar stories. Every life is valuable.

Absolutely. Helen, thank you so much for joining me on the show today. Thank you for your advocacy for safety, but also for the families of those who lose a loved one. If somebody wants to get in touch with you, what’s the best way to do that?

Well, they can email me. I have a website too, so either email me or go to my website and send me a message. That would be great. Excellent. Thank you so much, Helen.

Thank you. Cheers. Take care. Bye.

Thank you. Bye.

Thank you for listening to The Safety Guru on C-Suite Radio. Leave a legacy. Distinguish yourself from the pack. Grow your success. Capture the hearts and minds of your teams. Elevate your safety. Like every successful athlete, top leaders continuously invest in their safety leadership with an expert coach to boost safety performance. Begin your journey at Come back in two weeks for the next episode with your host, Eric Michrowski. This podcast is powered by Propulo Consulting.

The Safety Guru with Eric Michrowski



Helen Fitzroy’s passion for workplace safety commenced following the death of her husband, Steve, in an underground mining accident in Norseman, WA, in 1991. The accident left Helen a widow in her early thirties with three young children to raise. At the time of Steve’s death, mining fatalities were largely ‘normalized’ by companies and government regulators. The deaths were considered an inherent risk of the industry, with virtually no support offered to families to enable them to move forward with their lives.

One of Helen’s coping strategies was writing. She wrote to her husband, Steve, but also to herself and her children, leading to the publishing of her first book some years later, “Just a Number.”

“Just a Number” outlines her family’s journey in the five years following Steve’s death, as they traversed the quagmire of emotional, legal, and bureaucratic processes that constitute life for a bereaved family following a workplace death.

Since writing “Just a Number,” Helen has been traveling extensively across Australia as well as overseas campaigning for improved safety and better support for bereaved families. She also delivers safety-focused presentations to companies across all sectors, highlighting the importance of both parties’ commitment to safety at work.

Helen’s commitment and passion culminated in the establishment of Miners’ Promise in 2010. Miners’ Promise is a not-for-profit organization established to provide emotional and practical support to members and their families following a crisis event such as a death, illness, or serious accident.

Helen served as a Director on the Miners’ Promise Board for several years, including a number of years as Chairperson. A qualified grief counselor, Helen continues a close association with the organization, providing family support advisory services to members.

Helen is a recipient of a WA Local Hero of the Year Award, a category of the Australian of the Year awards. She continues to speak prolifically to corporations across all industry sectors and provides ongoing grief counseling to families coping with the loss of a loved one.

For more information:




Like every successful athlete, top leaders continuously invest in their Safety Leadership with an expert coach to boost safety performance.

Safety Leadership coaching has been limited, expensive, and exclusive for too long.

As part of Propulo Consulting’s subscription-based executive membership, our coaching partnership is tailored for top business executives who are motivated to improve safety leadership and commitment.
Unlock your full potential with the only Executive Safety Coaching for Ops & HSE leaders available on the market.
Explore your journey with Executive Safety Coaching at

The History of Safety: Where Yesterday Meets Tomorrow with Carsten Busch




Embark on an exciting journey with the renowned Safety Mythologist and Historian Carsten Busch, also known as the “Indiana Jones of Safety.” Join us as Carsten shares captivating stories with lessons from the history of safety that shape tomorrow’s strategies. Tune in now and be part of the adventure!


Real leaders leave a legacy. They capture the hearts and minds of their teams. Their origin story puts the safety and well-being of their people first. Great companies ubiquitously have safe yet productive operations. For those companies, safety is an investment, not a cost for the C-suite. It’s a real topic of daily focus. This is The Safety Guru with your host, Eric Michrowski, a globally recognized ops and safety guru, public speaker, and author. Are you ready to leave a safety legacy? Your legacy success story begins now.

Hi, and welcome to The Safety Guru. Today, I’m very excited to have with me Carsten Busch. He’s a safety professional with a deep understanding of the history of safety, which is the big topic we’re going to talk about today. He’s incredibly interested in the development of knowledge when it comes to safety, and he’s coming to us from Norway. He’s also been nicknamed the Indiana Jones of Safety. I have to ask you before we get started: how did you get that nickname?

Thanks, Eric, for inviting me. Well, fun question. It originated with a Dutch safety professional who sadly died a couple of years ago, Roland Bakker. He used to work for Shell. While I was studying at Lund University and diving really deep into the early safety writings, from the early 20th century and so on, I discovered many very interesting quotes. Either they were funny from today’s point of view, or they were quite amazing, ones where you would say, wow, this could have been written in 2020; they were quite ahead of their time already then. I shared those quotes, and at some point, Roland said, well, you’re truly doing archeology here. You are the Indiana Jones of Safety. Well, I love most of the movies, so I thought, that’s a cool tagline to have on LinkedIn. I kept it.

It definitely is. Tell me a little bit about the early history of safety, because now we speak about safety on a regular basis, but there was a point in time when people didn’t talk about safety. So, tell me a little bit about the origins of how safety came to be part of running an effective workplace.

Yeah. It’s quite fascinating to dive into the subject. Safety has always been something people cared about, of course. Even in the Old Testament, there are some safety rules about building a railing on your roof so people wouldn’t fall down and you wouldn’t bring guilt on your house, and so on. And the Babylonians had some safety rules. But for safety as a profession, we go back to the late 19th century. The Industrial Revolution was really going on, with a lot of changes in society, new risks, and so on. And people started getting interested in what we now do, because they saw bad working conditions. They saw people getting hurt, and people died. What I would like to focus on a bit, because this is a very broad subject, of course, is the understanding of how, especially in America, insurance played a big role. From today’s point of view, a lot of people say, oh, all the early safety work was done by people from insurance, and they did it with a monetary goal, to make money for their companies. Which, of course, is true, because insurance companies exist to make money, and they were there to make money for, well, shareholders.

But the early motives of safety, I would say, started first with a social outcry. People were really shocked to see how the situation had changed for workers: the circumstances, and high fatality numbers running into the thousands for railways, for example, which is totally unbelievable from today’s point of view, and mining, which was, and still is, a very dangerous occupation, with mine explosions and hundreds of fatalities. People reacted to that. So, you had this social part, but the social part was not enough. Then came some regulation, but very slowly, especially in America; Europe was a bit more regulation-driven. And then you had, of course, the humanitarian aspect, that some employers realized they had to do something because they had a duty of care for their people. But the real game-changer was financial. When workers’ compensation laws entered the game, they changed the whole scenery, because before workers’ compensation, the cost of accidents fell on the employee. If you had an accident, you couldn’t work, you were home, and, well, you didn’t eat, basically, and your employer would get someone else, and maybe you could come back when you were well again.

You could sue your employer, but the chances that you won were very low, especially if there was the slightest hint of some responsibility on your part, like you had done something you probably shouldn’t have done, which there almost always was; then you had no chance at all to win. And even if you won, probably all the money would go to your lawyer anyway. So the cost of accidents fell basically on the employee, and investing in safety, in those early days, didn’t pay. Then came workers’ compensation, and that changed the whole game, because all of a sudden, the cost of accidents fell on the employer. The employer had to pay you when you got injured; even if you couldn’t work, you still were paid, medical costs were covered, and so on. Employers started to buy insurance against this. That’s where the important role of insurance in early safety comes in, because it was especially insurance firms that had big safety staffs, which they would lend out to employers and company owners to do inspections and make recommendations. And then, if you followed the recommendations, you might get a lower premium. So, safety suddenly paid off.

And that, I think, is what really set early safety in motion. It’s interesting then to see, and I study a lot of the work of Heinrich, Herbert William Heinrich, one of the safety pioneers, that his breakthrough theme was financial too. What made his name, and we’ve probably all heard it, wasn’t his discovery at all. What he did was study it in a more systematic way and add some numbers to it, and then it got a lot of credibility, and people said, wow! For every accident, there is a direct cost, like the medical care, and you have to pay the injured employee, and you have to get a replacement, and so on. But there are also a lot of hidden costs that you don’t see, like production stops, and there’s disturbance, and people talk about it, and you have to investigate, and so on. Heinrich found, in his data set at the time, approximately a 1:4 ratio of direct to hidden costs. People started thinking, well, then safety pays even more if you invest, and that was a big driver for early safety. And not just that; I think there was also lucky timing. There had been World War I, when efficiency was at a premium because you wanted to produce enough to win that war.

And there was, of course, the Great Depression, which also made it very lucrative to be safe, because that helped you to be competitive. I read a quite interesting article arguing that the Great Depression was actually very beneficial for safety, because many workplaces that had bad facilities and badly maintained machinery and so on went bankrupt. They just went out of service and were never used again, and people started over with better equipment after the Great Depression. It was backed up with numbers, so I thought that was quite unexpected, actually, because from our experience, we often see that when things go bad, we have to do things cheaply. Where can we save costs? Do a bit less maintenance, do a bit less training, and that’s not good for safety.

No, it’s not. From there, one of the things that really struck me when we spoke earlier is the evolution of systems thinking and where it came about, because it’s often associated with a modern view of safety. But what you were describing is that, in many cases, there were early elements of systems thinking very early on.

Yeah, as I said, I’ve shared a lot of those early insights, and I wouldn’t say that it was really systems thinking yet, but you see early seeds of things that people probably weren’t quite ready for, so to speak. You see insights that you could just copy to the 2000s and say, this is what we are dealing with now, or that’s what people with a newer view like Hollnagel or Dekker and so on are saying, but you find similar stuff already in the 1920s.

So, the new view of safety is not new.

The new view of safety is a different new, I think. I would say Heinrich and his contemporaries were a new view at their time that revolutionized things quite a bit. You had the first wave of safety pioneers, and I won’t bother you with the names because nobody knows them anyway, but they were very focused on machinery and guarding and that stuff. Very basic safety work; there was a lot of low-hanging fruit at the time. And you see that the first safety books, from around 1900 till, say, the First World War, were very much about how to create safer workplaces through illumination, guarding, ventilation, and a bit of organization too, but very little. And then, in the mid-’20s, there were some safety thinkers like Louis DeBlois, the first Vice President of Safety at DuPont, the big chemical firm, who wrote a ground-breaking book, and Heinrich drew a lot on that work. Heinrich specifically produced a more management-oriented framework: not only how to guard machinery, but also how to build an organization, how to learn from accidents, how to better investigate accidents, how to approach safety in a more scientific, fact-based way, looking at what is actually happening and where you should focus, and not just have a blanket approach.

That was really a new view at the time. Now, looking back 80 years later, we say, well, that’s just traditional safety. We’ve been doing that for decades. Now we need something new, and we shouldn’t just focus on what is going wrong, but also on what is going right, especially in normal work. And then you look back in the old books, and you already find nudges of that, which is quite fascinating. I have a very cool quote here.

I would like to read it to you. It’s by a guy with an interesting name, Albert Whitney, and he was quite a hotshot in insurance, as most safety pioneers of the time were. I think this is from something he wrote in 1921, which he called his philosophy of safety. And I quote: Now, life is intrinsically dangerous. Life is partly routine, to be sure, but more fundamentally, it’s an experience of the unknown and, hence, based on adventure. It’s quite fascinating. He stresses the unknown, uncertainty, which is only in recent years getting a role in risk through ISO 31000, but he stresses the uncertainty already here. And then I find it fascinating that he says, well, life is based on adventure, and that’s cool, that’s risk-seeking. He goes on to say the prime quality in safety, therefore, is not the removal of danger but the improvement of the quality of the adventure. Wow, this was 1921. And this guy is saying safety isn’t about prevention; safety is about having better adventures, which I think is absolutely in sync with resilience engineering thinking, which says, well, we need to prevent, of course, and reduce risk and control hazards and all that, but especially, we need to be better at handling variability, which resonates very well in my head with having better adventures, with succeeding better.

I think that’s such a lovely quote. This guy was very much into getting safety into education. If I understand well, he was very important in getting safety a role in American schools and, afterward, in the creation of safety courses at universities too.

But that systems view, while it might have been discussed, in terms of starting to be operationalized is definitely a newer development.

Yeah, it has taken many decades, I think, for that to mature to where it is broadly recognized, because you find those nuggets. Safety culture is the same. DeBlois, the guy I discussed earlier, describes in his book what we would now call safety culture, but he uses a different term, safety atmosphere, and he has a definition that I think is very usable, because he speaks of some invisible force that affects even people who are entirely new in the company, which is what culture does, or is. We won’t get into that discussion now, but it would take until the late ’70s for culture as such to enter the safety discourse.

This episode of The Safety Guru Podcast is brought to you by Propulo Consulting, the leading safety and safety culture advisory firm. Whether you are looking to assess your safety culture, develop strategies to level up your safety performance, introduce human performance capabilities, re-energize your BBS program, enhance supervisory safety capabilities, or introduce unique safety leadership training and talent solutions, Propulo has you covered. Visit us at

You touched on Heinrich earlier. Let’s revisit him, because he’s often quoted with the pyramid, which is now disputed. But the pyramid, as you described it to me, is not quite what it was intended to be. Tell me a little bit more about Heinrich and the pyramid and some of those elements.

Firstly, I never speak of Heinrich and the pyramid, because Heinrich never drew a 3D shape, which is what a pyramid is. The pyramid is three-dimensional. Heinrich made a triangle, and it was only Bird who made the fancier picture. But there are a lot of misunderstandings. It’s at least one of Heinrich’s most famous concepts, but it has started a couple of lives of its own. People treat it, for example, as a law. Some people still believe, even today, that there is some natural law dictating that for every accident with major consequences, there should be 29 accidents with minor consequences and 300 accidents with no consequences at all. I’ve actually been in a meeting, many years back now, where we were discussing reporting and underreporting. Someone stood up and said, well, I’m sure we are having a problem with reporting in our company because we don’t live up to this ratio. I thought, oh, dear. Oh, dear. Oh, dear. Because if you study Heinrich’s work, you will see, and he’s quite clear about it, that he says this is an average. And if you look at how he got to this average, you will see that he studied different kinds of accidents, and for each accident type he found and estimated a ratio.

So, in his book, he has, for example, a case where somebody is cutting wood on a circular saw, and he pushes the wood through the saw, and at some point he does it without a push thingy, and he cuts his fingers, and they find a ratio of one to something; I don’t know it by heart. And then he has another example where he describes someone crossing rail tracks on a daily basis because it’s a shorter way to work, and there he finds a ratio of one severe accident to several thousand incidents without any injury. He has a couple of other scenarios he describes, and you can see that each scenario has a different ratio. Then he averages them up, and he finds, or constructs, a very neat ratio, 1:29:300, which is a ratio that you won’t forget. That’s a stroke of genius. He repeats this number, and he anchors in the message that there are a few serious accidents, more not-quite-as-serious accidents, and a lot of near misses, in modern language. He doesn’t use the words “near miss” yet.

He speaks of no-injury accidents. Then, after having anchored that message, he says: here lies great opportunity. If you recognize that you can prevent worse things from happening, then you can act proactively. That’s his great gift to safety, I think: the realization that we don’t have to operate reactively. We don’t have to wait for someone to be injured. Of course, if somebody gets injured, there is a greater sense of urgency to do something to prevent it from happening again and to make improvements and so on. But his message was: you don’t have to wait until somebody gets hurt. You can actually be observant, see that, oops, this could have been much worse, and react to it. His message was one of opportunity, not one of counting, which is what a lot of people make of it. A lot of people say it’s a metric. No, it’s not. It’s a metaphor for opportunity and proactivity. And that’s one of the things that a lot of people get wrong. They start counting, and then they mash all kinds of accidents together. A lot of the literature on the triangle, or the pyramid, is based on data from within a sector or within a country.

Or, if you are lucky, from within some process, but they don’t stick to one specific scenario. Because if you want to play with the numbers, you have to stay within one scenario, which is what Heinrich himself also said: this only works within what he calls a unit group of similar accidents.

If you start blending things together, and this is something I think Todd Conklin says a lot, ankle sprains don’t say anything about well blowouts. Of course not, because they’re two very different types of accidents. One is not predictive of the other. But slippage may very well be predictive of broken legs and ankle sprains, and bad maintenance or mechanical failures in your blowout preventer are likely predictive of a well blowout. But don’t mix them together. There are a lot of misunderstandings where people focus too much on the numbers and the correctness or reliability of the ratios, which are totally irrelevant because they’re just an illustration. People think it’s predictive: if you have had 299 near misses, then the next one is probably the big one, which is quite foolish, actually. And Heinrich himself said, well, it may also be the first one where you get hit. Sure. There is a factor of randomness there. One of the main mistakes is that people don’t stick to the scenario. The predictive element, if there is one, is only within one scenario.

I think the other piece I’ve seen is that a lot of organizations start relying on that pyramid, or the triangle, as you mentioned, and start thinking that if they focus on very small injuries, they’re going to reduce serious injuries and fatalities. They’ll focus the same amount of attention on a first aid case, a bee sting, or a slip, trip, or fall, whereas the elements that drive a serious injury or fatality are probably quite different. That’s where I think there’s been, more recently, a shift in thinking and a realization that it’s a subset of those events that can lead to serious injuries.

Definitely. There isn’t one pyramid, unless you want to calculate the average, which is fun to do, maybe, for safety nerds, but of no practical value. You have to see the pyramid, or your pyramid, as a huge stack of different pyramids. You have a slip, trip, and fall pyramid. You have a well blowout pyramid. You have a paper-cut-in-the-copy-room pyramid, which probably has a crazy ratio of one to a trillion or something. And then you have pyramids which aren’t pyramids at all. I’ve worked for 20 years in railways, and earlier I used this example that Heinrich mentions of somebody getting hit by a train. There probably isn’t a pyramid shape there. It’s probably some hourglass shape, where you have fatalities at the top and almost no minor injuries, because if you get hit by a train, it’s typically either a near miss, of which there are a lot, or you are probably quite damaged. There’s not a lot of first aid in those cases. It’s big at the top, big at the bottom, and nothing in the middle. That’s the hourglass.

Sure. I’d love to pivot to James Reason and the Swiss cheese model and love to hear some of your perspective from a historical standpoint.

James Reason is one of the other safety greats, with a really brilliant metaphor that anchors: you see it, and I think most people can intuitively connect to it and make sense of it, and then give their own interpretation of it. And that’s what we have seen, for example, in, I don’t know if you’ve seen them, Eric, but I got quite fed up in COVID times with all the pictures shared of these COVID protection Swiss cheeses, where you had 17 layers stacked up and people telling some story around it. It’s quite interesting to reflect a bit on how this happened, because these COVID-19 Swiss cheeses, I think, stray quite far from Reason’s idea. I think there are three main categories of why people get models wrong. The first is that they just don’t know better, for whatever reason, and we could talk an hour about this, I guess, but I think hearsay is one big factor. You’ve had a course, and somebody told you their interpretation of the Swiss cheese, and you picked up some parts. And then we basically get to the second reason.

You start making your own interpretations. That’s a quite powerful one, and it’s for better and for worse. Let’s be clear about that, because if you see this picture, a couple of barriers with holes in them, and if something goes through the holes, then things have gone very wrong and you have an accident, that makes immediate sense, I think, to a lot of people. And especially having a lot of barriers makes a lot of sense to many people. And then they start ignoring the rest. Probably, they don’t even know that James Reason’s message was much more complex than just the picture, because the picture comes with a lot of text and a lot of explanation: pathogens and complex systems and organizational factors and human factors and you name it. People just see the picture. They think, well, how can I use this in my situation? And then they just take the picture, give it their own twist, and that’s all there is to it for them. And then there is a third group, the people who have motives of their own in interpreting a model their own way. The Swiss cheese model has gotten a bad rap in recent years, especially from, let’s call them, New View safety thinkers, who call it a linear model, which I would say is not correct at all.

The picture looks linear because you have all these slices stacked one after another. But the picture isn’t all there is. If you read the text, going back as far as the first presentation in 1990 in Human Error, the book by James Reason, he says quite interesting things, like: the holes in the barriers are not static. They’re moving around. They open up and they close, they change shape, and so forth. The model is not linear at all. The model is quite dynamic, even though the picture looks very linear. Right. I think some people also give a twist to models to make their own message look better. You come up with a different model, and I won’t name any, which perhaps takes better care of the systemic factors, and there are models that do that quite well. And if you contrast it with Heinrich’s dominoes or the Swiss cheese, which looks very linear, then your model probably looks better. And there may even be, let me say, a pedagogical angle to it, where you stress the linear aspects of the Swiss cheese to communicate better about the systemic approaches.

Sure. Yeah. Those are just some quick reflections on the Swiss cheese and how people give their own twist to it. One thing I would like to stress, and something we perhaps need to work on: I started by saying a lot of people don’t know better. They’ve been on a course, or they’ve seen a presentation where somebody had 15 seconds to say something about the picture on the slide behind them. You get this quick explanation of the Swiss cheese, and you think, well, this is a quite knowledgeable guy standing there. He explained it, and this is all there is. We’re not typically trained to ask critical questions and then go back and read the literature and study it. And a lot of us probably don’t have time either. But you may end up with a model that doesn’t quite do what you actually need it to do.

So, thank you, Carsten. You’ve shared a lot of history, from the early days of safety, to some early thinking around the broader systems view of safety, to revisiting Heinrich and the Swiss cheese model. I think there’s an important lesson in what you’re saying about models. At the end of the day, a model is there to share an idea and a concept. That’s the important element of the model, as opposed to thinking it’s pure and true and depicts everything. Is that a fair assessment?

Yeah, that’s a good summary. Models are always a simplification of something, of course, and we have to understand the limits of that simplification and the limits of the model. We need to ask a bit more often, firstly: is this what the model was designed for? And then we can, of course, use it for other things, which may actually be beneficial, because innovation builds on that. But it’s wise to check: is this actually what the model was meant to say? Because the Swiss cheese is not about 17 layers. Seventeen layers may actually be less safe than ten layers, because new layers introduce new complexity and side effects, and so on. Go back to the source and ask a critical question at least once in a while. I think that’s important.

Thank you very much, Carsten, for joining us. If somebody wants to read more, hear more, how can they get in touch with you?

I have a website, www., and there’s probably a contact form somewhere there, and you can find me on LinkedIn. I’m relatively active there, so just reach out and connect.

Excellent. As the Indiana Jones of Safety. Thank you.

Thank you for listening to The Safety Guru on C-Suite Radio. Leave a legacy. Distinguish yourself from the past. Grow your success. Capture the hearts and minds of your teams. Elevate your safety. Like every successful athlete, top leaders continuously invest in their safety leadership with an expert coach to boost safety performance. Begin your journey at Come back in two weeks for the next episode with your host, Eric Michrowski. This podcast is powered by Propulo Consulting.

The Safety Guru with Eric Michrowski

More Episodes:

C-Suite Radio:

Powered By Propulo Consulting:

Eric Michrowski:


Carsten Busch has studied Mechanical Engineering, Safety, and Human Factors. He has over 30 years of experience in Safety and Quality Management at various levels in organizations ranging from railway to oil & gas to police in The Netherlands, the United Kingdom, and Norway. He is professionally active on various forums, a regular speaker at conferences, owner of, a tutor at the Lund University Human Factors and System Safety program, and author of several professional books: Safety Myth 101, Veiligheidsfabels 1–2–3, If You Can’t Measure It… Maybe You Shouldn’t, Preventing Industrial Accidents, The First Rule of Safety Culture, Risicoflectie, and recently an annotated republishing of safety pioneer Heinrich’s papers from 1923-1945. His main research interests include the history of knowledge development and discourse in safety, which has led to Ph.D. work through Open Universiteit. He is an active member of the Dutch Society of Safety Science (NVVK) and a member of the editorial board of the society’s quarterly magazine NVVK Info. He is a reviewer for Safety Science and the Journal of Contingencies and Crisis Management.

For more information:





Soaring with The Blue Angels: Building a Robust Safety Culture with Scott “Intake” Kartvedt

Soaring with The Blue Angels: Building a Robust Safety Culture



Get ready for takeoff on The Safety Guru podcast! In this episode, we’re soaring to new heights alongside an experienced professional pilot, the stunt pilot from Top Gun: Maverick, Scott “Intake” Kartvedt. He shares the foundations of a robust safety culture, highlighting key strategies incorporated by the Blue Angels and the Navy. Gear up to elevate your organization to a top-tier safety culture. Don’t miss this flight!


Real leaders leave a legacy. They capture the hearts and minds of their teams. Their origin story puts the safety and well-being of their people first. Great companies ubiquitously have safe yet productive operations. For those companies, safety is an investment, not a cost for the C-suite. It’s a real topic of daily focus. This is The Safety Guru with your host, Eric Michrowski, a globally recognized ops and safety guru, public speaker, and author. Are you ready to leave a safety legacy? Your legacy success story begins now.

Hi, and welcome to The Safety Guru. Today I’m very excited to have with me Captain Scott Kartvedt, or, I should say, Scott “Intake” Kartvedt. He is a former fighter pilot; he was with the Blue Angels, served as a commanding officer, and was also a stunt pilot in Top Gun: Maverick. Scott, thank you so much for joining me. Quite an impressive background. Welcome to the show.

Eric, thanks for having me. It’s a pleasure to be on, and I look forward to talking about safety and some of the challenges that we face as human beings in all workplaces. It’s a topic where you just can’t beat the drum enough to keep our peers, our fellow workers, and our fellow human beings safe.

Excellent. Well, let’s start with a little bit about your background, because it’s quite an impressive resume; I think it’s every boy’s dream growing up. So, tell me a little bit about your background, all the way into the Blue Angels.

Yeah, absolutely. Like so many people my age, I saw the original Top Gun when it came out in May of 1986, and my best friend and I told all of our friends that we were going to be fighter pilots. We both subsequently went to college. I worked as an accountant for about a year after school. And then my best friend, Bob, called. He had gone through ROTC with the Air Force. And he said, hey, I got my pilot slot. We said we were going to do it, and I’m doing it. I said, okay, I’ll do it too. I picked up the yellow pages and called the Navy recruiter, joined the Navy as a pilot, went through flight training, was successful, selected jets, and was able to select F-18s. That was the start of becoming a fighter pilot. From that point, I was just cutting my teeth as a fighter pilot, using the weapon system, which was the F-18. I was forward deployed in Japan. During Taiwan contingency operations, we were the China watchdog, the North Korea watchdog. I came back from Japan and was an F-18 flight instructor and also a landing signal officer.

For your listeners, the landing signal officer is the pilot who sits at the end of the aircraft carrier and acts as an aid, or a safety spotter, if you will, to ensure that the planes are coming in on glide path, on line-up, and at the proper angle of attack, or attitude of the aircraft, when they land on the ship. That was foundationally where I recognized the need for, and the importance of, safety. I went with the Marines to teach them how to land on aircraft carriers at Marine Corps Air Station El Toro. We subsequently moved to Miramar with the Marines. Then I was selected to become a member of the Navy’s flight demonstration team.

Tell me a little bit about the Navy and the discipline that comes with it, including on the aircraft carriers. What you describe, just landing a plane on an aircraft carrier, is very difficult, and you’ve got 18-year-olds starting out. How do you create that discipline so that no matter where in the world you are deployed, you’ve got a consistent, safe operation?

Yeah, that’s a fascinating safety ownership culture that the Navy is exceptional at, because we take 4,000 sailors and we put them on the most lethal platform in the inventory, an aircraft carrier. And there are, let’s say, 50 airplanes, in addition to helicopters, E-2s, which are propeller-driven airplanes, and Ospreys. So, it is a very high-risk environment. And you have young men and women who may have only had the good fortune of receiving a GED, a General Education Degree; maybe academics wasn’t their thing, or they may have been high school, or I’ll just say school, dropouts. How do you create a culture where you instill ownership of each other, the ship, and the airplanes in someone who is 18 years old and on their first deployment on an aircraft carrier? You have to give them ownership. You have to give them the authority and the responsibility to stop flight operations. The example to give you, Eric, and this is true on all aircraft carriers: an 18-year-old who might be a plane captain is on the deck of the aircraft carrier, looks in their toolkit, and sees that they might be missing a tool.

Sure. We have to have a safety culture where they don’t immediately think, oh, I’m going to get in trouble, I need to hide the fact that I lost this tool, and I hope I find it later. We need them to raise their hand immediately. We actually have them put their arms up in an X in front of them, like a giant X, to nonverbally communicate to everybody else to stop. Because it’s loud on an aircraft carrier, it’s got to be nonverbal. As soon as you see somebody making that X, everybody else does it. Then all operations stop, and we find out who stopped them. We run over to, in this scenario, the 18-year-old, and we say, what happened? They say, oh, I lost a tool or a wrench, and I think it might be in that F-18 because that’s the airplane I was working on. You have to have a culture where you say thank you. Thank you for the courage and the integrity to stop operations. Because as soon as you punish that individual for the mistake, people start hiding those small safety lapses that over time can build up into a catastrophic failure, a loss of aircraft, or a fatality.

We are very good in the Navy at providing authority, responsibility, and ownership at all levels, from the captain of the aircraft carrier to the 18-year-old, and instilling in them the ownership of the airplanes, the ship, and the people.

How do you drive that? Because it’s easy to say. A lot of organizations talk about it. I lived in the aviation space, where that’s expected as well. But in a lot of other industries, there’s always a questioning element. If I stop work and say I’m not prepared, or if I made a mistake and there’s a repercussion, which in some cases in business can be hundreds of thousands of dollars, it’s very tempting to go, let’s hide it. Nobody will figure it out.

Right. One, you not only have to say it, you actually have to believe it. It might take a period of time. When you take command of a fighter squadron or a ship in the military, it’s only for a short period. You’re not a CEO for 5, 10, 15 years. So, you have a short window to establish your culture and instill your values and your belief structure, the integrity, the principles, and the character that you want to set for the organization. And it has to happen pretty rapidly. So, you not only have to say it, you actually have to live it. An example I’ll give really quickly, and this was the year that we won the Safety S in the F-18 squadron that I had command of: we went 486 days over the course of two deployments with no alcohol-related incidents. The same safety ownership culture that we had on the ship and in the air wing, I wanted to instill off duty, so that when we were in port, we were still taking care of each other in a safe environment. I said, look, if you’re going to go out and have a hootenanny and do a little bit of drinking, one, we have to watch out for each other.

Two, don’t drink and drive. Don’t drink and drive. Don’t drink and drive. Don’t drink and drive. And if you get a cab, I will pay for it personally out of my pocket. Non-government money, my money. You bring the receipt in, and I will immediately cut you a check, or Venmo you in this case. And sure enough, one of our sailors came in on a Monday, handed me his receipt for 50 bucks, and I Venmoed him the money. I immediately stopped operations, called everybody together, and honored him for doing the right thing, which was taking the cab. But I also had to back it up with my actions and do what I said I was going to do, to prove to them that it wasn’t me really checking whether they were drinking. I didn’t care about that. I wanted them to live their lives, but I had to back it up with action. And I’m not patting myself on the back, because it took 250 of us to earn that Safety S, but it was that culture of living and doing what we said to take care of each other. Once you have that culture, then somebody new shows up, an 18-year-old who just checks into the unit, and that’s the culture that they inherit.

And then it can live on until another leader comes in and either makes it even better or for some reason erodes that culture.

I agree. And how does training come into the equation? How do standards and expectations complement this? Because there’s more to it than just saying, these are the values, and I need to stop work.

Yeah. So, let’s pivot to the Blue Angels a little bit, because they have the highest standards of any organization I have ever been a part of. And they hold each other accountable to those standards. It’s as simple as this: the autograph pen or the paperwork pen that we keep in our blue suits has to be in a very specific pocket. When we talk to people on the crowd line, we can’t wear our sunglasses. We have to make eye contact. There are little things like that that they don’t necessarily tell you right up front, but it costs you $5 if you fail to meet the standard. When you first join the team, it costs you $50, $60, $70 a day. But then you learn exactly what the standards are. It takes a very short period of time to realize that what they’re teaching is discipline and attention to the minutest detail. Not only for the pilots, because we need that attention to detail when we’re flying, but our mechanics need attention to detail when they’re working on planes. The supply corps needs attention to detail when they’re ordering the right parts.

Our administrative department needs attention to detail when they’re submitting the paperwork so our sailors get paid. Everybody has to have that attention to detail and that service, the customer service to each other, and hold each other to that standard. With that comes the debrief, Eric. You have to be able to debrief somebody when they don’t meet the standard. One, you have to have the standard set: this is what we expect. Then, if somebody doesn’t achieve it and there’s a gap between the expectation and the performance, you have to be able to debrief that. With most human beings, and I talk a lot about this when I consult with companies, there’s an ego problem, where people perceive the debrief as some form of punishment, and they get defensive about having failed to meet expectations. On the Blue Angels, I realized that somebody wasn’t punishing me when they debriefed me, or telling me that I was incapable. In fact, it’s the exact opposite. When someone takes the time to debrief you to the standard, they’re actually telling you that they believe you have the capability to achieve or exceed that standard.

And once you realize that when you’re being debriefed, it’s because somebody believes in you and they know that you can perform at a higher level, then you can’t get debriefed often enough. You crave that feedback to improve and accelerate your performance.

Very similar to the concept of radical candor: if I care about you and I believe in your potential, I deliver feedback differently. But you’re absolutely correct. Many times I’ve seen conversations, even between senior executives, where one is giving feedback on how to improve, and the other is trying to justify themselves, as opposed to just hearing, you’re not losing your job, it’s not impacting your performance bonus, these are just tips and ideas on how you can get better. What you describe is really key. It’s really how you have the conversation so you get to the optimal version of yourself.

Yeah, absolutely. And when somebody is debriefing you, the only really appropriate response is, thank you. It’s our human behavior to want to defend. Eric, if you were debriefing me on something, I would want to hear you, and then I would want to defend why I made the decision that I made, or explain to you what happened. That just takes time and gets into what we would call a circular conversation, because now I’m defending myself. Or I could just say thank you, take your input, and make myself better. And if the feedback didn’t fit the scenario, then I know that, but I don’t necessarily need to explain it. I just need to take your input, recognize that you believe in me, let go of my ego, and then choose whether to incorporate it if I believe it will help me improve. I hate to make it that simple, but the ego piece is significant for sure. Once you let go of the ego, then you can really accelerate your performance.

You said something a few minutes ago that really caught my attention. When you talked about setting high standards on the Blue Angels, I expected that if you didn’t do something, there would be some form of punishment. Instead, it’s the $5, which is often a ha-ha joke but still sends the message. Tell me a little bit about how it’s done, because I’ve seen this where, as an example, every time you were late for a meeting, it was a buck a minute for your delay, and it went to charity. So, it wasn’t profit for somebody, but it sent a message very quickly, as opposed to chastising somebody for being five minutes late and embarrassing them. It was just a donation jar, but it drove the message very quickly.

Yeah, I think it does drive it quickly. And so the $5, when you are hemorrhaging money to learn this, is a behavioral tool, right? The carrot and the stick. It’s a little bit more of a stick model. Our money went to squadron social functions. The debriefs are never a personal attack; it’s just professional development. But it’s interesting, that dollar for being late to a meeting, because this was another great thing that I learned on the Blue Angels. The briefs always started on time. The debriefs started on time. When the clock hit the time, that’s when the meeting started. The idea was to respect each other. There were 16 officers on the team who were at every brief and debrief. If you waited one minute for one person to show up, you really just wasted 15 minutes, because 15 people showed up on time. I took that philosophy into F-18 command, and I would set up operational meetings that we had consistently every week: safety meetings, operational meetings, and maintenance meetings. We would always start them on time. I drove the department heads who worked for me crazy initially, because they said, well, not everybody can be there.

I said, well, if we wait until everybody can be there, it’s going to be a month from now. They can send a representative, which builds depth of leadership and provides training. I said, just because they can’t be there, that’s okay. But they need to at least send a representative. All you have to do is start on time once or twice, and then the person walking in late will realize that when you say you’re starting on time, you actually mean it. But as soon as you say, hey, let’s wait for everybody, you’re wasting the time of the people who were on time at the expense of the person who was tardy. So, whether it’s a dollar or just, hey, hack, the time is 09:30 and we are starting, that gets the point across, and the whole command or organization will pick up on it.

This episode of The Safety Guru podcast is brought to you by Propulo Consulting, the leading safety and safety culture advisory firm. Whether you are looking to assess your safety culture, develop strategies to level up your safety performance, introduce human performance capabilities, reenergize your BBS program, enhance supervisory safety capabilities, or introduce unique safety leadership training and talent solutions, Propulo has you covered. Visit us at

In the Navy, as well as the Blue Angels, training is a huge component of onboarding. What’s the rule of thumb around training? When is it enough? Is there such a thing as too much, or not enough? Because a lot of organizations struggle with training as a cost, right? And they’re trying to minimize the cost of that investment. Not the case in the Navy, not the case in the aviation space. So, tell me a little bit more about that.

Yeah, it’s interesting. And not even necessarily high-risk organizations, but let’s talk about companies that do high-voltage electrical work on power lines, railroads, the airlines: large organizations that have to train to maintain a level of safety. I’m sure you’ve talked a lot about the normalization of deviation on your podcast, right? So there’s a balance between training toward perfection in safety and the expense. At what point does management, or the people who are responsible to the shareholder, say, well, nothing has happened, so we are training well enough? Now you are prioritizing shareholder return on investment over safety. And that’s when you really need to start listening to the people who are actually doing the work and ask them what they need, to find out where the safety gaps are, because they will tell you for sure. So, it’s a really fine leadership balance. Can you really be overtrained? Probably not. The Navy SEALs would say, absolutely not, you can’t be trained enough. But at some point, you actually have to stop training and operate. And even in operations, there is an opportunity to learn and to wrap what you’re learning from the operation back into the training, so that you can minimize the risk while also improving the performance of the organization.

When I think of training, one of the things that is apparent to me, particularly in aviation compared to what I see in a lot of businesses, is that people often see training as a one-time thing. So, it’s onboarding: I give you initial training, essentially. What I see in aviation, and I’m assuming the Navy is exactly the same, if not even more so, is continuous training. So even if there was a near miss, as an example, if it reaches a certain threshold, you’re going to run through simulations that recreate what happened to somebody else at some point. So, tell me a little bit about that, because that refresher piece, to me, is really key to focus and learning, but also to not getting complacent.

Sure. What’s interesting about that is that commercial airlines have to train their pilots. They come through a training center for simulator training every nine months. I think most non-aviation people would be blown away to know that the pilots flying them go through two days of training every nine months to work through what we would call non-normal, non-routine scenarios, and even extreme scenarios. So that in the event one ever actually happened, they would have some muscle memory, some procedural recall, to overcome the amygdala hijack and the startle effect, because you have to override fight, flight, or freeze. Aviation is great at that. Continuous training is really important. It drives the point home. There’s always something that you can learn from it. I think that aviation philosophy is spreading. I know that the health industry and surgical units are taking on board the idea of aviation-style briefs, debriefs, and checklists to ensure things are done correctly. I know that there are a lot of industries that do that. But think about straight-up corporate America. I’ll just take some finance organization as a hypothetical, right? Maybe not high risk, but they do continual training on diversity, equity, and inclusion, sexual harassment, and those things that have to be continually brought back to the forefront of the mind, so that people are trained to a level of awareness, which is important.

I think the concept of continual training, as long as it’s refreshed, is an important piece, because you can’t just play the same video from a year ago; nobody will pay attention. It actually has a negative effect. But using real-world examples in your scenarios, which breeds transparency and lends credibility, where everybody can learn from something that happened in your organization, that’s the best way to continue the training for any organization.

Phenomenal topics. The element of training you describe, that recurrent refresh, is something I think is important, at least for high-risk roles. Then the other element is the scenarios: working through scenarios where something went wrong, as opposed to just getting an email saying, hey, so-and-so had this issue, and this is how they dealt with it. Making it recurrent training and walking through different scenarios, I think, is key.

Eric, I have found that facilitated experiential training, even if it’s scenario-based, trumps computer-based training, and certainly emails, all day, every day. People will generally respond to that in-person, facilitated, roundtable experiential training, because now they’re learning from each other and sharing their stories. Once you get people sharing their stories, well, we’re good at that in aviation, right? There I was… But there’s a tremendous amount of learning that takes place in those “there I was” types of scenarios.

You touched on something briefly a few minutes ago around safe today, not tomorrow. Tell me a little bit more, because I think that is something many organizations struggle with. In safety, there’s often an absence of a leading indicator that tells you when deviation is starting to be normalized in the process. Tell me briefly what you mean by safe today, not tomorrow.

Yeah. So, you could have a level of training that is degraded due to budgetary constraints. And at the end of a quarter, you could say, well, our safety record is still 100%, and we reduced our budget; therefore, we’re training to the proper level. And so maybe we could cut a little bit more and save some more money on training to help our bottom line. I have worked with companies where I have seen that happen. And you can hear the rumblings among the workers who are actually performing the high-risk jobs. As soon as that happens, you know that you have a gap, and you need to listen to them to find out what they need. Otherwise, the answer becomes, this is good enough, or it hasn’t happened; therefore, we justify the budget cut to the training department or to learning and development. That’s a normalization of deviation, just like the space shuttle: the rocket booster had had an O-ring leak 14 times, but it had never exploded; therefore, the risk of explosion was minimized, when in fact that was not the case. Just because you flip a coin 10 times and it lands on heads doesn’t mean it’s going to land on heads the 11th time.

The risk is the same on the 15th launch. And that’s when the O-ring failed, even though there were people screaming about that problem. And I’m sure you’ve analyzed that a lot. With that normalization of deviation, you have to step back and make sure that you’re not falling into the cognitive bias traps: plan continuation bias, overconfidence bias, the expectancy bias where it’s worked before, therefore it will continue working. I think as leaders, we have to step back and ask, okay, where is our risk? Have we cut back too far? What’s the risk to the operation? And if you want to know where the risk to the operation is, go talk to the operators. They’ll tell you exactly where it is.

I think it’s a really important point, because it’s not that you can’t save money, but you’ve got to save money in the right places. It’s not that you have to be the highest-cost operator, but the flip side is that the lowest-cost operator isn’t necessarily the answer either. I’ve heard somebody say, well, in this particular industry, the lowest-cost operator is the safest. And I’m like, but that’s a correlation; it doesn’t mean causality. It may just mean they’ve got very good operational discipline. They may be lower cost because of that operational discipline, and they’re tighter on safety. But you can also arrive at the lowest cost through cost-cutting, and we know what goes horribly wrong with that.

Yeah, absolutely. With causality, people try to tie together two things that are really only correlated. And on that piece, I would tell the leaders who are listening to the podcast to go to those same operators and say, where can we cut costs? What do you recommend? Where’s the excess? They’ll tell you. They’ll tell you what they need, and they’ll tell you what they don’t need, if the leader is willing to listen anyway.

So, tell me about your book, Full Throttle: From the Blue Angels to Hollywood Stunt Pilot. Tell me a little bit about why somebody should pick it up.

That book. Yeah. Well, I appreciate the book plug. I have had a very fortunate career, as we have talked about here. When I got asked to fly as a stunt pilot in Maverick, the most common question was, how did you get to do that? And over the course of my career: how did you get to fly F-18s? How did you get to fly for the Blue Angels? How did you get to go on five combat tours? How did you get to stand up the first stealth fighter squadron in the Navy? How did you get to fly for Maverick? I got asked that enough that I had to boil it down to really three things. I say yes to opportunity, because saying yes opens doors; I am not afraid to learn from my errors; and I ask for help. I talk about embracing failure. It’s really about embracing mistakes and failures, letting go of your ego, and being willing to learn. On that same NASA subject, because we were talking about normalization of deviation with the Challenger: I actually applied to NASA once, and all my friends said, Intake, you’re never going to be an astronaut.

You’re not a test pilot. You don’t have an engineering degree. You’re an accountant. It’s never going to happen. I said, well, let me put it this way. NASA is never going to call me out of the blue and offer me a position as an astronaut. So, I have nothing to lose. All they can do is bring good news by saying, you’ve been selected to be an astronaut. Because if they say, no, you’re not an astronaut, well, I’m already not an astronaut.

Right. So that’s been my philosophy. And then my dad was really the inspiration. He was a submariner in the Navy. And he always said, Scott, your stories about naval aviation are just outrageous. You should write a book. My dad is still with us. He turns 86 this year, but he’s still got the rage. So, I thought, you know what? I am going to put pen to paper, and I’m going to tell my journey, from having watched Top Gun as an 18-year-old in 1986 to flying as a stunt pilot in the sequel 33 years later, and share that journey. And hopefully, people of all ages will find it inspirational, but also maybe take a tool and a life lesson from the book as well.

Excellent. Well, Scott, thank you so much for coming on the show and sharing your experience from the Navy, from aircraft carriers and the Blue Angels to now being a commercial pilot, and your recent book. I really appreciate the time you took with us. These are really great insights into building good discipline from a very early stage. Thank you.

Appreciate it, Eric. Thanks for having me on.

Thank you.

Thank you for listening to The Safety Guru on C-Suite Radio. Leave a legacy. Distinguish yourself from the pack. Grow your success. Capture the hearts and minds of your teams. Elevate your safety. Like every successful athlete, top leaders continuously invest in their safety leadership with an expert coach to boost safety performance. Begin your journey at execsafety. Come back in two weeks for the next episode with your host, Eric Michrowski. This podcast is powered by Propulo Consulting.

The Safety Guru with Eric Michrowski

More Episodes:

C-Suite Radio:

Powered By Propulo Consulting:

Eric Michrowski:


Scott Kartvedt was the Navy’s first Commanding Officer of the only F-35C stealth strike fighter squadron in the US inventory, Strike Fighter Squadron ONE ZERO ONE, based at Eglin AFB, Florida. He also commanded an F/A-18 Hornet squadron during two combat deployments to Afghanistan in support of Operation ENDURING FREEDOM. While he led the 250 Sailors of VFA-83, the unit was awarded the 2009 Commander, Naval Air Forces Aviation Battle Efficiency Award, the CAPT Michael J. Estocin Award as the Navy’s Strike Fighter Squadron of the Year, and the 2010 CNO Safety Award.

Scott is currently a professional pilot and on the Board of Directors for the Blue Angel Foundation. He is an instructor and evaluator for United Airlines in Denver, Colorado, the number 5 pilot for the Patriot Jet Team, the only civilian jet demonstration team in North America, and was a stunt pilot in Top Gun: Maverick. He is also the Founding Partner of High-Performance Climb, a privately held consulting company. Scott shares his executive leadership, risk management, and safety mitigation experience, gained during extensive combat operations, through inspirational keynotes and workshops with clients worldwide.

For more information:





Like every successful athlete, top leaders continuously invest in their Safety Leadership with an expert coach to boost safety performance.

Safety Leadership coaching has been limited, expensive, and exclusive for too long.

As part of Propulo Consulting’s subscription-based executive membership, our coaching partnership is tailored for top business executives that are motivated to improve safety leadership and commitment.
Unlock your full potential with the only Executive Safety Coaching for Ops & HSE leaders available on the market.
Explore your journey with Executive Safety Coaching at

Cracking the Code: The Human Factors Behind Organizational Failures with Martin Anderson




You don’t want to miss our latest episode of ‘Cracking the Code: The Human Factors Behind Organizational Failures’ on The Safety Guru. Join us as Martin Anderson, a renowned expert on human factors and performance, shares his valuable insights and examples about the human factors behind organizational failures. Learn how to effectively and constructively embed lessons learned in your organization.


Real leaders leave a legacy. They capture the hearts and minds of their teams. Their origin story puts the safety and well-being of their people first. Great companies ubiquitously have safe yet productive operations. For those companies, safety is an investment, not a cost, for the C-suite. It’s a real topic of daily focus. This is The Safety Guru with your host, Eric Michrowski, a globally recognized ops and safety guru, public speaker, and author. Are you ready to leave a safety legacy? Your legacy success story begins now.

Hi, and welcome to The Safety Guru. Today I’m very excited to have with me Martin Anderson, who’s a human factors expert. We’re going to have a really interesting series of topics of conversation today. He’s got a deep background in human factors across oil and gas regulatory environments. His passion is really to understand how people perform in complex systems and also, ultimately, why organizations fail. So, Martin, welcome to the show. Really excited to have you with me. Let’s get started with a bit of an introduction.

Yeah, thank you very much, Eric, and certainly, thank you for having me on the show. It’s a real privilege to be invited here. So, in terms of my background, I started off with a psychology degree, and then I did a master’s in human factors. After a few years of work experience, I followed that up with a master’s in process safety and loss prevention. I’ve been a human factors specialist for over 30 years now. I’ve worked for a couple of boutique consultancies. I’ve been a regulator, working as a specialist inspector in human factors for the UK Health and Safety Executive. I spent a few years as a human factors manager in an oil and gas company. I spent a lot of time assessing existing installations, but also had input into the design of new facilities, working on 40- to 50-billion-dollar megaprojects. Over that time, I visited over 150 different oil, gas, and chemical facilities, both onshore and offshore, which gave me quite an insight into how some of these major organizations operate. One of the reasons I created the website was to share some of those insights. The other thing I’d like to talk about goes back 30 years, right to the start of my career.

I read a document called Organizing for Safety, published by the UK Health and Safety Executive in 1993. There’s a quote from that document I would like to read out, because it had a huge impact on me at that point. It goes like this: different organizations doing similar work are known to have different safety records, and certain specific factors in the organization are related to safety. If we unpack that quote, it really contains two statements. First, different companies doing the same things have different safety records. And secondly, perhaps more importantly, there are specific factors that could explain this difference in safety performance. And I thought this was amazing. I thought, if these factors could be identified and managed, then safety could be massively improved. And over the next 30 years or so, one disaster at a time, these organizational factors have revealed themselves in major incidents, which I guess we’ll come to in a moment.

I think that’s a great topic to get into. So why do organizations fail? Because I think when we had the original conversations, I was fascinated by some of your connections between multiple different industries and common themes that were across all of them.

Yeah, sure. What might be helpful, first of all, because you introduced me as a human factors specialist, is to briefly define what we mean by human factors, and then we’ll go into looking at some of the organizational incidents, if that’s okay. Sure. For me, human factors is composed of three main things. We’re looking at, first of all, what people are being asked to do: that’s the work they’re doing. Secondly, who is doing it: that’s all about the people. And thirdly, where are they actually working: that’s the organization. Ideally, all three of these aspects need to be considered: the work, the people, and the organization. But my experience is that companies tend to focus on just one or two of these, usually the people. Within the UK HSE, our team defined human factors as a set of 10 topics, which has become widely known as the top 10, used by industry, consultants, and regulators worldwide. Because prior to that, we would turn up to do an inspection and say, we’re here to inspect your human factors. And they were like, we don’t know what you mean. How do we prepare for that?

Whom do you want to speak to? What do you want to go and look at? So, after creating that top 10, we were able to say, the agenda for the inspection is that we want to come and look at how you manage fatigue. We want to come and look at your supervision arrangements or your competency assurance system. So, this helped to operationalize human factors. The other route into human factors, really, is that a lot of people come to it through human error. They hear about human error. But if we identify human error, we need to understand how and why it occurred, and not simply blame people. Are we setting people up to succeed? Are we setting them up to fail? Are we providing systems, equipment, and an environment that support people to do the work that we’re asking them to do? And as we move towards talking about organizational failures, I’d like to read a quote from Professor James Reason, a psychologist at the University of Manchester. This quote is about 25 years old, but it’s still one of my favorites. Reason said that rather than being the main instigators of an accident, operators tend to be the inheritors of system defects created by poor design, incorrect installation, faulty maintenance, and bad management decisions.

Their part is usually that of adding the final garnish to a lethal brew whose ingredients have already been long in the cooking. And I think that’s a really good introduction to our discussion on organizational failures.

So, let’s go there because we had a really interesting conversation on organizational failures and some of the common themes. So, what are some of the common themes, and why do organizations fail?

Exactly. When you say, why do organizations fail? Let's just think about a few of those from different industries, because these organizational disasters have occurred to the NASA space shuttles, the Herald of Free Enterprise ferry disaster, the King's Cross fire, Piper Alpha, Texas City, Buncefield, Deepwater Horizon, Concorde, lots of different rail incidents around the world, and several so-called friendly fire events. And there have also been organizational disasters in sectors such as healthcare and finance. In the UK, these include inadequate care during children's heart surgery at the Bristol Royal Infirmary over a 10-year period. And, of course, most listeners will be familiar with the so-called rogue trader that caused the collapse of Barings Bank. So, there have been so many disasters in so many different industries. And I know when we had a conversation earlier, what we were considering was that, okay, they're all in different industries, but there are lots of common themes that we could pull out of those, from the space shuttles to Barings Bank, for instance.

So, what are some of the themes? Because I think the part that really caught my attention is an activity you've done where you took the facts from a different event, masked them, and people thought it was something different. Tell me a little bit about that story.

Yeah. So, the example there was that… I don't know if listeners are familiar with the Nimrod disaster. This goes back to 2006. Nimrod was a reconnaissance aircraft, and it was on a routine mission over Afghanistan. Shortly after air-to-air refueling, there was a fire which led to the loss of the aircraft and, sadly, the 14 service personnel on board. And I was asked to get involved and advise that investigation. As I started to read some of the initial information from that investigation, I started to think, this sounds just like another incident I'm really familiar with, which was one of the shuttle incidents, the Columbia incident. So I put a presentation together, and on one side of the slide, I put the information from the Nimrod incident, and on the right-hand side of the slide, I put information from the Columbia incident. I went through several of the issues that were involved, and I produced this PowerPoint presentation, and I mixed up the left and right sides, and I didn't say which was which. And when we showed it to the investigation team, they couldn't determine which information came from the incident they were investigating, the Nimrod incident, and which information came from the shuttle Columbia incident many years previously.

It just showed that with two very different incidents, in different industries, with different locations and different people, the organizational issues were almost identical. That was quite powerful, the fact that people couldn't tell the difference between the facts from one and the facts from the other, because the causes just overlap so much. When you look at the very detailed technical level, there are differences between these events. But when you really start looking at the deeper, or broader, organizational issues, then there are so many similarities.

What are some of the themes in general that you've looked at? You mentioned Barings Bank, which sounds very different from Piper Alpha. What are some of the common themes?

It does. You think, what has the failure of a 100-year-old bank got to do with the failure of an oil refinery or an offshore oil platform or any of the other incidents that we've spoken about? But people and organizations fail in very similar ways. The findings from these disasters are getting quite repetitive, just because you're seeing the same things over and over. When you look at all of these incidents and pull out some of the main themes, what are the things that we're seeing? Because the important thing is that we can go and look for these in an existing organization. You see things like a lot of outsourcing to contractors without proper oversight. In the nuclear industry, we call that not having an intelligent customer capability, because they don't know what the contractors are doing; they can't explain what the contractors are doing. Then you've got inappropriate targets or priorities or pressures, because in almost all of these cases, there were significant production pressures, whatever production means for your organization. Another key issue that you see almost every time is a failure to manage organizational change. And by that, I mean a failure to consider the impact of that organizational change on safety.

So, a lot of organizations are going through almost a tsunami of changes and not really considering how that impacts how they manage safety, or not considering that each of those separate changes has a cumulative effect which is more powerful than the individual changes. You also see a lot of assumptions that things are safe. So even when there is evidence to the contrary, organizations assume that everything is safe rather than going and looking for information, challenging, or having a questioning attitude. Organizations are pretty bad at looking for bad news and at responding to it; they don't want to hear bad news. So in almost all of the incidents that we've spoken about, it wasn't a complete surprise to everybody in the organization. There were people in the organization who knew things were going wrong, that they were getting close to the boundaries of safety, but they either couldn't get that information heard by the right people, or people didn't react or respond to it. So it's really interesting when you look, and you read the detailed investigation reports, there are always people who knew that things were going wrong. So that information is available in the organization.

And I think that’s a good thing because that means that, hey, this is good. We can proactively do something about this. We can go and look for some of these things. So the things that I mentioned there, and there are a lot more, Eric, that we could talk about. There are lots of organizational issues we could proactively go and look for because these incidents are devastating for the people involved, for the organizations involved, but they’re a free lesson for everybody else. Sure.

If you choose to learn from them and if you choose to see the analogy between a space shuttle, Nimrod, and Barings Bank, and whatever industry you’re in.

Yeah, exactly. Because you have to go looking for those issues, those factors, in your organization. So, there are two things, or maybe three things, you mentioned there. You need to go looking at other incidents. You need to take the lessons from those. You need to go and look for them in your organization, and you need to act on that. So, this failure to learn from other industries, for me, is perhaps the greatest organizational failure of all. Organizations think, well, it doesn't apply to me, because that was in a children's hospital, or that was a bank, or that was an offshore platform. What's that got to do with me in my industry? Failure to learn those lessons is the biggest failure, because you can get away from the technical specifics of the incident and just try and look at the deeper organizational issues. But who in organizations is doing this, Eric? Which person, which role, which part of the organization goes looking for these events and draws the lessons and then goes and challenges their own organization? It's actually quite difficult to do that. It's like the problem with safety, isn't it? You can go into a boardroom and pitch a new product to a new market, and people will give you money and listen to you.

But if you go in and pitch that you want to spend money to protect and safeguard the installation against things that may or may not happen in the future, that's a much harder sell. It's a problem for safety more generally.

One of the things I know we talked about was what you call an organizational learning disability: people are good at investigating, but not at true learning, not at embedding the change. I've seen this many times, where people learn the same lesson over and over.

And that’s it. When we have these large investigations into these disasters, there’s always this proclamation that this must never happen again, and we need to learn the lessons. And then something else happens a year or two later in a different industry, but the same issues. So, you talked about a learning disability. Why do organizations fail to learn? Given that, there’s this wealth of information out there available as to why organizations fail. For me, I think there are two issues. I think there’s this failure to learn from other industries. All industries think they’re unique. They don’t think that they can learn because it’s a totally different industry. It’s nothing to do with them. But they all employ the same kinds of people. There aren’t different people working in different industries. They all employ the same people. They organize themselves in very similar ways, and they have the same targets and priorities and so on. So, first of all, that assumption doesn’t apply to me. It’s a different sector. So, failure to learn from other industries, we’ve spoken about, but failure to learn from your own investigations. And we see this in major incidents like NASA failing to learn from the previous incidents it had.

So, you have the Mars orbiter and a failure to learn from that. You have Challenger, then Columbia, and so on. What we find is that there's a lot of sharing but not enough learning. After an incident, a safety bulletin is put together, it goes on the intranet, there might be a bit of a roll-out, and so on. But if you're not changing something, you're not learning. Something in the organization has to change for a lesson to be embedded. And you need to go back and confirm that you've changed the right thing; you can't just change something and assume everything will be okay. So if you're not changing anything structurally in the organization, or in one of the systems or one of the processes, then you're not embedding the learning. So that's the first thing: this failure to embed the lessons that you come up with. I think the other problem is that investigations are not always of great quality. They're not identifying the right issues. They may not be getting to the root causes. They might focus on human error. They might focus on blame. Investigations that are done by external bodies generally are starting to look at these organizational issues.

But investigations that are done internally by the organizations themselves, into their own events, rarely confront organizational failures. It's very challenging for the investigation team to raise issues that suggest there are failures at the leadership level. It's challenging for the investigation team, and it's challenging for the leadership to receive that information. So quite often, the recommendations and the actions are all aimed at employees. A bit like a lot of safety initiatives, behavioral safety, safety culture, and so on, they're quite often aimed at the front-line workforce rather than the whole organization. We often see that in investigations as well: they're not challenging these organizational issues, whether that's because of a lack of understanding or because it's not accepted by senior leadership. And the people doing these investigations aren't always competent, and I mean that in the nicest possible way. They don't have the right experience, or they're not given enough time, or it's seen as a development opportunity. So, investigations need to have the right people doing them, asking the right questions, in order to get the right recommendations out of them. Because if the process isn't right, you're not going to get the right recommendations coming out of it.

So, what are you going to learn, because you haven't got to the real issues? So yeah, I think there are two issues there: failure to learn from other industries, but also failure to learn from your own investigations. And we can talk about some tips that might help organizations get to some of those organizational issues when they're doing investigations. Absolutely. And it would also be useful to talk about how you can go and look for some of these organizational issues before you actually have an incident, which is what we want to get to. We want to learn, but we don't want to have incidents in order to be able to learn. So why can't we learn proactively, without having an incident in the first place?

This episode of The Safety Guru podcast is brought to you by Propulo Consulting, the leading safety and safety culture advisory firm. Whether you are looking to assess your safety culture, develop strategies to level up your safety performance, introduce human performance capabilities, reenergize your BBS program, enhance supervisory safety capabilities, or introduce unique safety leadership training and talent solutions, Propulo has you covered. Visit us at

Let’s start first in terms of how you can identify some of these organizational factors through the investigation process.

Through that investigation process, what you're really trying to do to get to the organizational issues is zoom out from the detail, taking a helicopter view. You're zooming out and looking down, trying to see the bigger picture. So, for example, most people who've done an investigation will have put together a timeline: a list of what happened, to whom or to what equipment, and when, and then drawn a timeline and started to map what happened. But the problem is that a lot of those timelines start on the day of the event. And what I'd propose is that your timeline goes back weeks, months, or even years before the event occurred. You're trying to identify what might have changed in the organization in that period in terms of equipment, processes, people, priorities, the direction the company was going, and so on. So, your timeline needs to go way back, because of the organizational issues that we see in all of these events. These events didn't just occur overnight. As Reason said in that quote, there was trouble brewing for weeks, months, and years beforehand. So, there are indications in the organization, and your timeline needs to go back and look for those issues.

That automatically forces you to think not just about the actual incident but more widely about your organization. The other thing you can do is review previous incidents that have occurred, or other sources of data, maybe looking at audits, regulatory inspections, or staff surveys. You're trying to identify common threads and trends, and you're trying to identify how long these conditions have existed and how extensive they are across the company. Why did this event surprise us? Because, as I say, the information is normally available in the organization. So why did this come as a surprise? You're looking not just at individuals; you should be looking at systems and processes, and your mindset as an investigator should be on the organizational conditions, the context in the organization, that set people up to fail. So going back way before the incident is quite a helpful change of mindset for people, rather than just asking, okay, what happened on this day, and thinking about how you responded to the incident. It's quite a useful tool to help you think more about organizational issues.

And how broad do you go? Because when you start zooming out to years before, decisions, changes in leadership, changes in investment, you can open up a very big can of worms. And I can see with Deepwater Horizon or Piper Alpha that there's a need to go deeper. But how deep and how wide do you cast the net? Because I think it's incredibly important, like you said. Otherwise, you just limit it to the person who made a mistake, as opposed to starting to understand what's changed in the environment, the context. Sure.

It’s a lot easier in those big disasters to do that because they’ll have a huge team of people in these investigations. Some of them have taken five, six, eight years. They have the time and the resource. In an organization, you generally don’t have that much time to do an investigation. Quite often, the people doing it have other jobs, so they want to get back to the day job. So, it’s one of the reasons why the investigations are quite compressed in terms of time because most people are not full-time investigators. So, I think what you can do is it depends on the incident that you’ve had as to how far you want to go back. But I think looking at whether or not those conditions exist in other facilities or workplaces is a useful step that can really help you identify whether this is unique to this scenario or is this a systemic issue that we have in our organization. Organization. I think going back and looking at what might be key issues, so if you’ve had a merger or an acquisition or a major change in your direction or a new product or you’ve opened a new facility, those major organizational changes, if you had a downsizing exercise two years ago and since then there’s obviously been issues in terms of staffing and resources, then those are the key things you need to need to be mapping out.

As you say, you can’t map everything, but you’re looking for key significant changes or events or shifts in priorities or policies that might have occurred in the previous years. And I guess the time and effort that you spend in that partly depends on the consequences or the potential consequences of the event that you’re looking at.

But there’s still an element of you can focus the conversations like you just said in terms of what are the major shifts that happen as opposed to unearthing every piece. You’re still rewinding the movie further back. The other part I think is interesting to explore is what you talked about in terms of how we know and explore some of these organizational factors before something happens. And you mentioned that in all the incidents, you talked about somebody who knew something was up before. So how do we identify these themes before a major event?

Yeah, you’re right there, Eric. I think there’s always information available, and it’s just maybe not getting to the right people, or people aren’t taking action on it. So, these warning signs, red flags, whatever you want to call them, they’re unnoticed, they’re ignored or not getting to the right person because, as we’ve said, these incidents incubate over a long period of time. Those warnings accumulate. And that’s a great thing because that means that we have an opportunity to go and look for them and to find them. So, if you start looking, first of all, you should have a means for people to be able to raise those concerns in an independent, confidential way, some reporting system so that those concerns are coming to you. So that’s like one mechanism is some industries are much better than others at having confidential reporting systems where people can safely report a near miss or an error or challenge or frustration that they’re having. And that gives the organization an opportunity to do something about it. You’ve got to have the right culture for that, of course, because if your previous investigations blame individuals, then people are not going to come forward because they’ve seen what’s happened to other people.

So, they’re going to keep quiet, and these things get brushed under the carpet. So, it does depend on the culture that you’ve got. But having an independent, confidential way for people to raise those issues can be quite useful. So that allows issues to come to you. But you also need to go looking for these issues as well.

Yeah, I think that's important.

That’s important. Organizations have had quite a few events. So, do they investigate them individually, or do they try and join the dots between different incidents? They might appear unrelated, but are they? Are you starting to accept things, either conditions or behaviors, that you wouldn’t have accepted a few years ago? People’s risk acceptance might change over time. Are you contracting more out? And do you really understand the technical work that those contractors are doing? Can you explain it? Can you challenge it if necessary? Are you having lots of budget cuts? The conversation is always around targets, budget challenges, focus on efficiencies, put productivity initiatives, and so on is a really good red flag. Are you starting to focus more on temporary fixes? Are you patching equipment? Are you stretching the life of equipment rather than investing or permanent solutions? Are you may be reacting to things rather than predicting and planning ahead? Now, organizations do lots of safety-related activities, and previous podcasts have talked about safety work and the work of safety. But if organizations start to see the completion of safety activities as being more important as to whether they’re effective, that’s quite often a big warning sign as well.

Companies are doing risk assessments, investigations, audits, and writing a safety case, if that applies to your industry. And if getting that done is more important than using it as a learning exercise, and than whether it's effective, that's also a bit of a trigger for the organization. So, there are things you can go looking for. One of the biggest things for me, because there are lots of questions we could ask, is that if you assume that your assessment of these major risks is incorrect and proactively go seeking information to continuously revise your assessment, you're more likely to pick up these issues. Whereas if you assume that everything's okay until it isn't, it's too late at that point. Organizations are getting more mature in their approach to investigations, but that maturity hasn't carried over to being proactive in looking for issues. We're getting better and better investigations, but we don't want to have incidents to investigate. There are tools and techniques, ways you can proactively look in your organization to find these issues. The maturity of investigations just hasn't translated over to proactively going and looking for things.

There are lots of reasons why that might be the case.

I think it’s an interesting point because I think if you’ve got… The other element that comes to mind is if you’ve got an incident that happened, it’s clear who owns the investigation. But who owns this proactive view? Because in some organizations, it could be an audit, but an audit is not always necessarily equipped to do it. I know that in one organization, an audit made an audit in safety, and their focus in terms of driving safety improvement was to find ways to get employees back to the office faster, which has no impact on safety. But from a financial standpoint, if you don’t have expertise in what safety means, that might sound like a viable solution to reduce a rate, right? It could be your safety organization, but that safety organization needs to have the right visibility. It could be some form of a red team that’s constantly looking for challenging pieces. What have you seen be most effective in terms of where this resides and the practice around kicking the tire? Is that what you’ve got?

I think part of the issue there, as I alluded to earlier on, Eric, is that this just isn't a formal role within organizations. The departments that you mentioned quite often don't have the expertise, experience, or time to be able to go and look for these issues proactively. The audits, investigations, they're all quite constrained in their agenda, and so on. So, I don't know of a good example of a function in an organization that is proactively going and looking at these areas. You do have risk committees and audit committees, whether you're looking at the financial sector or at oil and gas. I think there are pieces of the puzzle held by different people within an organization that can contribute to this review that we're talking about. But I don't think there's really good practice out there of how that's been pulled together into a cohesive, proactive, challenging go-look-to-see whether you have any of these issues, particularly when you're trying to learn from other industries. So if there's been a big incident in one industry and a big report has come out, with lessons and recommendations in it, organizations in that industry might look at that and might go and challenge themselves.

But that’s relatively short-lived, I think. If you ask people in organizations, what are the main failures in Piper Alpha? What were the main failures of Bearings Bank? What are the main failures in the shuttle incidents? A lot of people, including safety people, just can’t tell you what those organizational learnings would be. So not only are they not going looking for these things, but quite often, that experience, that understanding is just not available, Eric. But I think it’s a big gap. I think there’s a role for human factors, people, and systems people to be able to fulfill that role. But it’s very difficult for an organization to fund a position whose role it is, is to go looking for things that may or may not happen or that might be very unlikely to happen. In these times, it’s quite challenging to resource that position in an organization.

A couple of things come to mind, because I've seen some organizations do quite well at learning through case studies of others. So as a senior leadership team, looking at something like the 737 MAX and what transpired around it, looking at Challenger, looking at Texas City, or looking at Deepwater Horizon, and using these as case studies to ask, how could this happen here? And driving that reflection, because then you're starting to force this learning from outside the industry and push that it could potentially happen here. The other piece I've seen, and you talked about the human factors piece, is some organizations that proactively, or maybe every few years, run a safety culture assessment, as an example. Now, my challenge with a lot of safety culture assessments is that people will do a survey, which will give you no insights into what you're talking about. But when I'm thinking about a robust one, you're surveying and speaking to a lot of employees to look at what could go wrong, and you also do a review of system factors. You look at a lot of the practices, the processes, the changes, the things that have occurred over the past few years.

So essentially, you’re kicking the tires on a regular basis at the organization. But what I’m talking about is it’s closer to really kicking the tires, but looking at the system components as well, even though the analysis, because the survey won’t be good enough.

I think you’re right. Organizations are doing surveys; they’re running focus groups. Some leaders will be doing walk-arounds. They’re going to facilities and talk to their staff. If prepared for that, that can be really, really helpful. They’re if you prepare them in terms of what they should ask, that can work quite well. I think these are all activities, and these are all tools that we have available, but I don’t think typically they are aimed at trying to pull out these deeper organizational issues, or maybe they’re not. The different sources of information maybe are not combined to give that overall view. Occasionally, organizations will get an independent organization in to do that review for them, which can be quite interesting. But again, that takes you back to the issue of you having to learn from those recommendations as well. And we have seen quite a few cases where independent contractors who’ve been asked to come in and review an organization quite often temper their findings because they want to get continual employment from that company. And we’ve seen that in some of the major financial events. But Bearings Bank is a good example where the auditors did not see issues, or when they saw issues, were not communicating them to the board because they didn’t want to alert the board to some of the issues that were there, which contributed to the demise of the bank.

So, there are lots of barriers and structural issues that might prevent some of the tools you suggested from working really effectively. But there are tools out there that can be used. We're making general comments about what we're seeing in industry; it's not to say that there aren't some organizations doing this well. I think it would be really good to unpack those lessons and learning and communicate them more widely, because there are pockets of good practice. I'm not saying no one's doing anything at all here. There are pockets out there. We need to understand what they are and what is effective, and help to share those more widely with other organizations that maybe are not doing this proactively.

That’s often the tricky part because once something goes wrong, it makes front page news. The 37 MAX makes front page news, multiple investigations, lots of insights, lots of learnings. But does that mean that Airbus, on the other hand, that hasn’t had such a failure, is doing all of this proactively, you don’t necessarily know because they’re generally quieter about it. So, it could actually just be pure luck or actually good practices. And that’s the tricky part.

It could, but it could also be… If you look at an organization that's had a few incidents or a couple of disasters, people might think, oh, well, actually X, Y, and Z is a bad company; it's because of them. It's the fundamental attribution error. If someone is driving poorly, you think it's because they're a bad driver. Whereas if you do something yourself, if you cut someone up and so on, then you think, well, there are all these other reasons why I did that. So, we tend to attribute failures to people, because it's seen as an issue with them, without thinking about all the contextual factors that influence behavior. So maybe that fundamental attribution error is something that's important when we're looking at these disasters, because it's easy to say, well, they're just a bad company, and that won't happen to us. We're different. We employ different people. We've got all these processes and systems, and it won't happen to us. Risk blindness is an issue for us as well.

To touch briefly on Barings Bank, the same symptoms that occurred at Barings would probably have been present in many other institutions, because it's not that hard to have a rogue trader. The difference there was the size of that rogue trader's losses, but they're present everywhere. NAB in Australia had rogue traders on the FX side at roughly the same time. And there are lots of other examples that don't get reported, or get reported on page one hundred of the newspaper if you really go looking for them, because it's never a success story. But they happen a lot more often than we think.

I think they do. I think you’re right that we pick these examples, and we talk about these big disasters, partly because there’s so much information available on them. And it does become a little bit unfair that we keep going back to the same disasters, but they’re the ones on which we have the most information. They’re the ones that have been investigated to the nth degree. But you’re right, there are lots of other failures going on. Not all of them become so high profile. We do know that lots of other organizations have similar events, but, like you say, they don’t make the press for whatever reason, and they don’t become case studies on training courses for the next 30 years. You can pick Barings Bank, and there would have been several other banks with the same issues at the same time, because they had the same processes, or lacked the same processes, as Barings Bank; it just didn’t play out in the same way. Maybe they had a huge loss, but it wasn’t enough to destroy the bank, and therefore it’s less visible to everybody else.

But you’re right, we’re picking a few case studies here because these are the ones we have detail on. But that’s not to say this isn’t occurring much more widely.

So, Martin, thank you very much for joining me. A really interesting series of topics, including the fact that a lot of organizations fail for the same reasons. I think the really big takeaway is how we learn better from investigations, and then how we learn proactively before anything ever occurs. How do we keep that questioning attitude on an ongoing basis? Because it’s too easy to close your eyes to something and think, no, it’s okay, we’re okay. And really, how do you drive that questioning attitude within the business? So, Martin, these are really interesting topics. Obviously, your website, Human Factors 101, is an excellent source of insights. Is that the best way to reach you if somebody wants more insights?

Yes, certainly. I write quite a lot on that website, so you can go there and have a look. There’s a lot more information on there, or you can follow me on LinkedIn. If you search for Human Factors 101, you’ll find me there on LinkedIn. Please get in touch.


Thank you for listening to the Safety Guru on C-suite Radio. Leave a legacy, distinguish yourself from the pack, grow your success, capture the hearts and minds of your teams, and elevate your safety. Like every successful athlete, top leaders continuously invest in their safety leadership with an expert coach to boost safety performance. Begin your journey at Come back in two weeks for the next episode with your host, Eric Michrowski. This podcast is powered by Propulo Consulting.

The Safety Guru with Eric Michrowski



Martin Anderson has 30 years of experience in addressing human performance issues in complex organizations. Before joining an oil and gas company in Australia as Manager of Human Factors, he played a key role in developing human factors within the UK Health & Safety Executive (HSE), leading interventions on over 150 of the UK’s most complex major hazard facilities, both onshore and offshore. He has particular interests in organizational failures, safety leadership, and investigations. Martin has contributed to the strategic direction of international associations and co-authored international guidance on a range of human factors topics.





Like every successful athlete, top leaders continuously invest in their Safety Leadership with an expert coach to boost safety performance.

Safety Leadership coaching has been limited, expensive, and exclusive for too long.

As part of Propulo Consulting’s subscription-based executive membership, our coaching partnership is tailored for top business executives who are motivated to improve safety leadership and commitment.
Unlock your full potential with the only Executive Safety Coaching for Ops & HSE leaders available on the market.
Explore your journey with Executive Safety Coaching at
Executive Safety Coaching_Propulo

The Importance of Connecting Your Safety Management System (SMS) with Your Safety Culture with Jim Francis




“When you center the Safety Management System on the worker and the worker’s perspective, it allows them to have more of a say in the objectives, the goals, the initiatives, and the things that you’re going to go do. It also really started equipping them and engaging them in the solutions.” We’re excited to have Jim Francis, Vice President of SMS Consulting at ENTRUST Solutions Group, join the podcast this week to share his expertise about implementing Safety Management Systems that lead to noticeable and positive change. Tune in as Jim uncovers how to connect your Safety Management System with your safety culture in a way that is relevant to the way your organization functions to reduce risk and produce the most meaningful and beneficial outcomes.



Hi, and welcome to The Safety Guru. Today I’m very excited to have Jim Francis with me. He’s the VP of SMS Consulting at Entrust Solutions Group. We’ve known each other for a little while now. Jim, why don’t you share a little bit about your background and how you got passionate about safety?

Yeah, sure. Good to see you. It’s funny. I have a long history working for a utility, and I come from an engineering and operations background, and most of my career was spent on the compliance side of things. But when you work in a safety-forward industry in an organization like a utility, you naturally get into the safety aspect of things. My journey really began on the pipeline safety side, with a lot of compliance-related programs and things that we would do to try to improve the performance of our pipelines and reduce risk. Naturally, that connects you to the workforce and the folks who are actually working out there all the time. As my career matured, I picked up more opportunities to work in safety and safety management systems and all sorts of things related to risk and risk mitigation. It was a really good journey, and a lot of things built upon themselves. It took me to where I’m at today at ENTRUST Solutions Group, where I’m consulting with utilities and others all over the country on safety management systems.

Sounds great. Let’s go there. Let’s talk a little bit about what is a safety management system and what the main value is.

Yeah. The safety management system is, I’ll say, a structured approach to reducing risk. So, you put very formalized processes and procedures in place to identify and manage risk, really from the worker’s perspective. There are a lot of standards out there by which safety management systems are built and constructed, and they really just start to define the key elements and the things that you really ought to have in place. You need committed leaders, you need to find ways to engage with your stakeholders, you need to find ways to identify and mitigate risk, to validate the improvements that you’re making, to communicate effectively with people, and to have a process to know whether or not your results and the outcomes of your goals and objectives are being achieved. And really, the safety management system puts all of that into a well-defined, constructed approach where those processes all work interdependently, to make sure the system is functioning in the right way and achieving what you wanted out of it: reducing risk, ultimately.

When would you consider starting to look at a journey around a safety management system? Is it something you do early on? At what stage? And again, it may depend on the organization that you’re in.

There are tools that, frankly, if you’re starting with something brand new, you could use in risk management to try to understand, hey, what am I trying to accomplish here? But generally speaking, there’s no real well-defined starting point. It’s more a question of how your organization is performing. So, let’s look at the results and the things you’re trying to achieve. Are you having more safety incidents than you really ought to? Are you concerned about the way you’re operating? Do you have inefficiencies in the way you operate? Is your cost structure off? There are a lot of ties to the business functions that might be a trigger for wanting to implement a safety management system. But ultimately, what you’re trying to do is reduce risk and improve safety performance. So, let’s start with the safety numbers. Let’s start with your charts, your injuries, your incidents, any fatalities, the serious things that might happen to you. Those are really good indicators of, hey, maybe we ought to look at how we’re functioning as an organization or as a company to see whether or not we need to be building a safety management system to help us improve.

And so, you touched a little bit on the different models that exist, ISO and Z10 as examples. Is it about the certification, or could you build one in the absence of a desire to certify? And what would be the considerations to say, I want a certification, and which one should I take?

Yeah, you know what? I’m of the opinion that you don’t need the certification, and you really ought not to start with that intent in mind, because I think when you start with the focus on “I need a certification,” the drivers are likely coming from external pressure. There’s a regulatory issue, there’s a legal issue, there’s some legislative thing that is driving you to that. Not that there’s no value in those. I think the value of a certification is having a third party validate whether or not the processes in your safety management system are functioning well. But really, the motivation ought to be about internal improvement in the way you’re functioning as an organization and whether or not you’re driving the safety outcomes that you really want. What’s interesting, too, and this is a question that I get a lot: if I’m a small company or a large company, am I able to do it? Am I able to apply a framework like that? I think the beauty of the safety management system is that you don’t necessarily have to do it all. You build it for your organization and what fits your operations. I’ve seen it where literally somebody can put every single employee they’ve got in a room together, and they can talk every single week.

And there are great advantages to that. And I’ve seen companies so large that that communication piece becomes challenging. And yet, the system can function for both of them very effectively.

That’s interesting. So, we’ve talked before about how a safety management system can be an accelerator for culture. Can you give me some examples of where you’ve seen it become an accelerator, something that helps business performance on the cultural side?

I think back to my own journey in this, and I’d say it really began in the mid-2010s. We were struggling, frankly, from a cultural perspective. We had to have somebody come in and evaluate where we were, the relationships between us and our unions, and some of those sorts of things. We had some bad policies, we had some bad processes, some things we had to get out of the way. That led into us building our safety management system. Once we did, one of the beauties of the system and the approach we took was that we were now collecting risks and things that were relevant to the worker. And when you center the safety management system on the worker and the worker’s perspective, it allows them to have more of a say in the objectives, the goals, the initiatives, and the things that you’re going to go do. It also really started to equip them and engage them in the solutions, which far too often, I think, management misses: they sit back and start to create all the solutions without contemplating the worker.

Too often.

Because they don’t want to pull those guys from their day-to-day jobs and the things that they’re doing. Then what do you see? You get the workers complaining about the new processes and the things that are in place. What I saw, what we experienced, was a group of people who were suddenly like, oh, my gosh, they’re listening to me. They’re actually taking my advice. They’re prioritizing the things that are relevant to me, and they’re asking me to help with the improvements. They’re asking me to work on the solutions. I literally saw guys chasing people from our quality assurance team and our SMS team down on the docks of the buildings, trying to make sure, hey, I’ve got something I want to talk about. That was one of those intangible moments where you go, holy cow, this thing is really functioning. It’s really working. It was largely based on that.

One of the criticisms I sometimes hear about safety management systems is that it’s too much of a paper exercise. It becomes lots of documentation, lots of paperwork, but it doesn’t necessarily change the experience the employee feels. Tell me a little bit about how you can overcome that challenge so that it doesn’t become a purely paper-based exercise.

I think part of it is making sure that you’re right-sizing the system to fit your organization. As I mentioned before, it’s got to be something relevant to you and the way your organization functions. Even simple things like: how are you going to engage your workforce in the conversation around identifying risk? The mechanism to do so may not be some big fancy IT system where you’re trying to get somebody to plug something in on their laptop. It may be: let’s just sit in the conference room and have a conversation. I think the important piece of it is defining processes in a way that your organization has resources dedicated to the exercise. The point of a safety management system is to reduce risk. Take risk management as an example: most of the workforce doesn’t understand risk management. They don’t really care about what a risk register is. They don’t really care about all the processes and the risk matrix and those sorts of things, but you’ve got to have that structure. So build that structure so it’s relevant to you and your organization, and allow a group of people to facilitate it. And then you engage your workforce in the right way so that it’s meaningful to them. Unfortunately, and I think this is true with any standard, there’s a compliance aspect to it. You have no choice but to have some of the paper pushing and the documentation and the record-keeping aspects of it, because at the end of the day, you’ve got to prove to somebody that you’re actually reducing risk and focusing on the right things. But I would say you build the processes that are relevant and meaningful to your organization, and then figure out where some of the other ones fit, how they’re related, and whether you need something really structured around them or whether you can leverage things that you’re already doing as an organization.

That makes sense. In terms of what you talked about on the risk register, there are lots of different components to a typical management system. Where do people typically find the biggest value, or something that they’re not currently doing that really drives critical thinking? You also brought up employee involvement in solutions. What are some of the areas where you’ve seen the biggest improvements?

I think there are probably three or four key areas. One, risk management is the engine that drives the whole thing. But the moment you go into that, you’ve got to start engaging your stakeholders. The stakeholders are not just your workers; they’re also your leaders. One thing the system starts to do is connect those two groups of people into a common conversation. That doesn’t mean they’re always sitting in the room together, but they’re having a common conversation about the things that are most important to them, so that, as an organization, they can collectively put their resources toward those things. I think that’s where you see a lot of the value: the organization becomes a little more efficient in the way it operates. So, management gets excited about that. They start to see injuries and incidents and other things start to decline, and so there’s a cost benefit to it. And then the workers see the value in terms of the way they start to function. Their processes are more efficient. They’re not spending time collecting data or filling out a form or whatever the simple things are, because that becomes a meaningless exercise.

They really start to focus and narrow in on the controls and the things that are ultimately going to make their job a lot safer. Those are the values you start to see. I think those are some of the key processes around it. There are a gazillion processes that function within the system, but there are just a few of them that play together, and you just need to make sure you’ve got those well-defined and you understand how to create those relationships and the right conversations.

I think the risk register is one piece that I see is often missing in many organizations. They could have good back-end elements in terms of involvement of the workforce, but then not necessarily focus on reducing the biggest risks. Can you tell me a little bit more about how an organization can improve on the risk register side? What are some of the key elements, so that you get to which risks you should be investing in, and what functions do you want to see there?

This episode of The Safety Guru podcast is brought to you by Propulo Consulting, the leading safety and safety culture advisory firm. Whether you are looking to assess your safety culture, develop strategies to level up your safety performance, introduce human performance capabilities, reenergize your BBS program, enhance supervisory safety capabilities, or introduce unique safety leadership training and talent solutions, Propulo has you covered. Visit us at

The register itself can be a simple tool. Most of the time when we work with clients to develop it, and when I did it back in the day, it was just a simple Excel spreadsheet, but it contains the key aspects and elements. Obviously, it starts with the definition of the risk. We always say to define it in terms of the worker. Let them talk. Let them talk about the things that concern them, and ultimately, you’ll figure out how to define that risk. And then, of course, there’s a mathematical component to the risk element. There are typical standard risk matrices for how you start to measure the consequence and the likelihood of those things occurring. But what is important is to make sure that you’re tying actual metrics to that. So, if I said my biggest risk is related to excavation damage on a pipeline, there’s data that tells me, or supports, whether I’m improving or regressing in my performance around that. And you should be able to leverage that data to validate the risk. And ultimately, you have to have some scoring mechanism to calculate your level of risk, so you know: hey, I’ve got to draw a line in the sand, and I can only work on so much.

It’s a prioritization effort, really, is what it is. That’s what the risk register starts to do. And ultimately, you start to connect the risk register and the items in there to the further evaluations that you might do through a bow tie analysis, or to the risk mitigations and the projects you’re going to do to improve. It just starts to tell a story for you, and then it creates the math for you to actually prove to your board or your other external stakeholders that, hey, we’re actually making progress here.
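The scoring and prioritization Jim describes can be sketched in a few lines. Below is a minimal, hypothetical risk-register example in Python; the risk names, the 1–5 likelihood and consequence scales, and the action threshold are illustrative assumptions, not values prescribed by any particular SMS standard.

```python
from dataclasses import dataclass

@dataclass
class Risk:
    name: str            # defined in the worker's own terms
    likelihood: int      # assumed 1 (rare) .. 5 (almost certain) scale
    consequence: int     # assumed 1 (minor) .. 5 (catastrophic) scale

    @property
    def score(self) -> int:
        # Simple likelihood-times-consequence scoring, as in a standard risk matrix
        return self.likelihood * self.consequence

# Hypothetical register entries for illustration only
register = [
    Risk("Excavation damage on pipeline", likelihood=3, consequence=5),
    Risk("Slips and trips in the yard", likelihood=4, consequence=2),
    Risk("Vehicle incident during travel", likelihood=2, consequence=4),
]

# Prioritize: highest score first, then "draw a line in the sand"
ACTION_THRESHOLD = 10  # assumed cutoff for where resources go first
for risk in sorted(register, key=lambda r: r.score, reverse=True):
    flag = "ACT" if risk.score >= ACTION_THRESHOLD else "monitor"
    print(f"{risk.score:>2}  {flag:8} {risk.name}")
```

In practice the same columns live in the spreadsheet Jim mentions; the point of the sketch is only that the register reduces to a defined risk, a consistent score, and a sort order that forces the prioritization conversation.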

And how do you handle something that’s an incredibly low likelihood but significant consequence? I started out in aviation: a crash is an incredibly low probability, but the severity is incredibly high, and you don’t necessarily have a ton of leading indicators. Well, I shouldn’t say that. You have leading indicators on the drivers, but you can’t necessarily see if you’re improving.

It’s funny. Far too often, we spend all our time on the lagging side. You wait for the incident to occur before you act, and you can’t afford to do that on a high-consequence risk. You can’t afford to wait for an airline crash or something like that.

Or for a gas pipeline to burst and explode and take down a neighborhood.

Those are all the things we’re trying to avoid through the context of this. And that’s why the model doesn’t just pick one attribute. It’s not just about whether somebody gets injured or not. There are other aspects to evaluating a certain amount of risk. It could be an environmental factor. It could be related just to the asset: if I had an asset failure, what would it cost me? It could be a reputational issue. There’s a whole variety of attributes that could be contemplated in your risk register, and you need to figure out the definitions around those. There are standard books and other things to give you a starting point for those definitions, but you make them relevant to your organization and the things that you do. And ultimately, there’s a governance model and an approach to making decisions around that. So, you present it with the data. Now, I will say, one beauty of a safety management system when you start digging in deeply, and I mentioned the bow tie analysis: the bow tie starts to look at the preventive controls that keep that catastrophic event from happening. And ultimately, you start to do your measurement on the leading side, within those preventive controls: what are the processes, what are those detection points, what are the things that you’re going to start to identify that might be triggers for that lagging incident occurring, which is what you’re trying to avoid.

So, if you can catch it on the front end, on the leading side, within the process, you can go fix it. Spending time within that, and trying to understand the connection between the risks that your workers have and the controls at those process points and those measurement points, gives them great power: hey, now we’ve got an issue, let’s go solve it, once again, before that lagging issue happens.
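The leading-side measurement on preventive controls that Jim describes can be sketched the same way. This is an illustrative assumption of how one might flag degraded bow-tie controls before the top event occurs; the control names, metrics, and the 90% health target are hypothetical.

```python
# Hypothetical left side of a bow tie: preventive controls for one top
# event (e.g., pipeline excavation damage), each with a leading metric
# such as the fraction of checks passed this reporting period.
preventive_controls = {
    "Locate-before-dig process followed": 0.96,
    "Third-party activity patrols completed": 0.71,
    "One-call tickets closed on time": 0.88,
}

HEALTHY = 0.90  # assumed target; real targets come from your own data

# The leading-side check: find controls degrading below target,
# so the issue gets fixed before the lagging incident ever occurs.
degraded = {name: rate for name, rate in preventive_controls.items()
            if rate < HEALTHY}

for name, rate in degraded.items():
    print(f"Control below target ({rate:.0%}): {name} -> investigate and fix")
```

The design point is that each preventive control carries its own measurable detection point, so "are we improving?" becomes answerable on the leading side even when the top event itself almost never happens.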

Okay. So, we talked about culture and where you start in the culture maturity journey. How do you implement a safety management system while making sure at the same time that you’re also improving culture? Because the two should be connected, but they’re not necessarily connected. You could implement a system that doesn’t improve anything culturally, or it could have some blind spots as well. So how do you connect the two, and what have you seen work?

Yeah. So, there are requirements within a typical safety management system standard to evaluate the effectiveness of it. And probably one of the more impactful ways to do that is through feedback. In many cases, the standard might not prescribe a very specific feedback mechanism or approach; you’ve got to find a way to engage. And to me, this is where you start to tie in things like your auditing processes or an effectiveness assessment that you might do. But I think one of the most important pieces of feedback is a safety culture assessment. Because, once again, we talked earlier about: okay, management puts a process in place, and how do the workers feel about it, if you never ask and never have the conversation? To me, the safety culture assessment is one way to really get at: are we making headway? Are we making inroads into what we’re trying to accomplish? And it creates an avenue to get that feedback. So, whether you’re doing just a straight assessment or not, I think, frankly, it’s the post-assessment conversations that probably get you the most value, whether those are small group discussions or individual conversations.

I think when you have opportunities to engage your workforce in those meaningful things, you should hope to see the results. I saw that at the company I used to work for: when we implemented this, we saw improvements not only in our safety culture results but also in our employee engagement results. The two very much go hand in hand with the culture of the company. And from those survey results and the follow-up conversations, you get a lot of valuable insight into the way you’re functioning, how people are engaged, and all of the other things that you’re trying to push as part of your system.

I think from the cultural side, one of the pieces I’d say is: a survey is important, but what I see really mattering is making sure you’re looking at multiple different elements. You’re checking, you’re watching how the work is performed. You’re running focus groups to understand what’s behind the themes, because surveys can hide a lot of issues. They can give you a very binary view. I’ll give you an example where people said, yes, I dislike the processes and systems, but it’s not necessarily that. It could be, like you said before, you’re not engaging me in developing the processes and systems.

Great point. I agree. When you ask somebody in a survey about the safety culture, are they going to tell you, “I don’t believe I work safely”? They’re almost always going to say they work safely. But I completely agree with you. It’s the conversations on the back side of it. You get different levels of feedback and different opinions there that really give you a better insight into the culture of your company.

I think the other element that’s very connected, and I don’t see a lot of organizations do this yet, is getting to a very local level to start seeing, from a safety commitment standpoint, how the leader is perceived, how they show up, and the differences from site to site. Then working with focus groups on how to take the right actions to address things locally, because you can have a common culture and a common system, but leaders have different personalities that show up differently and are perceived differently around commitment, and they’re not always aware of it.

That is so important. It’s funny you mention that. I’ve used the story where, when my company implemented our safety management system and people started to get it, you intuitively knew the good leaders out there, but it was just a notion. Then what we saw was that the good leaders were the ones at that very local level who were like, I understand the system. I understand how it can benefit me, and I’m going to actually start to execute it. They didn’t wait around for my team or others to push the agenda on them. They just took it upon themselves to go exercise it. Then they engaged their workforce in a way that, once again, when you look at the safety culture results and the feedback, their results were better than their peers’. It was that engagement with the right leaders and people understanding that the system was just something to help them, to give them structure to push the agenda along and drive change for them. That cultural piece and the way those leaders act really go hand in hand. So really important.

And sometimes, people have blind spots. One of the things I’ve seen often is people saying, yes, I prioritize safety. And in their mind, they’re saying that because they start the day talking about safety; they’ll have a safety moment. But then they’ll give an attaboy to the person who got the job done, irrespective of maybe cutting corners. Not consciously, not intentionally, but they give recognition to the wrong behavior. Or the worst I saw was somebody saying, “Now, let’s talk about the real stuff” as they transitioned from the safety moment to the other pieces. And those are pieces that workers then interpret as: well, you tell me safety is important, but it really isn’t.

Yeah, that’s so true. I had somebody who worked for me, and she did an unbelievably great job of recognizing people in the right way. So, we would have workers engaged in our system, and ultimately, they were the ones who drove out the risk. We saw the discretionary effort around it. And so, when those things occurred, we recognized them for the actions that they were taking for the right things. We were not privy to the production pressures and some of the other things. It was more about whether they were reducing risk, whether their actions were aligned to the kinds of things we were trying to work on and improve upon. And that recognition went a long way for those folks to start putting pressure, maybe, on their peers and demonstrating it. It was pretty powerful in some of those places. Even the frontline employees now were the perceived leaders around that within their organization. It was a great, very positive way to drive the cultural aspect at that local level.

And so, really, taking away these complementary elements between the safety management system and culture: things you want to drive and evolve in parallel. There may be some cases where you really only need a safety management system. And I think we talked about this before: if you’ve got 80% or 90% turnover, including in your leadership ranks, in all likelihood culture becomes a very hard piece to actually maintain. And you need the structure more than ever, because you’re just accepting that you have a revolving door, which introduces risk. But in other settings where you’ve got more stability, you probably want to do a little bit of both, at least if you have stability at the leadership level.

Yeah, absolutely. One can certainly support the other. And the turnover is a great example, because that should show up as a risk. That’s a huge risk. And that may be the one thing that you have to work on almost entirely, making sure, once again, you’ve got the right structure and you’re onboarding people in the right way. Otherwise, you’re introducing way more risk from a safety perspective than your organization can really handle.

One could argue that if you have 80% turnover or you have a culture issue, you need to fix it first, or you’re going to see that nobody wants to play in it.

Yeah, there is a bit of a chicken and egg between the culture and the systems. Frankly, I think you need to just understand your organization and where you need to start. One may certainly support the other in that relationship.

Excellent. Jim, if somebody wants to get in touch with you, obviously the work that you do is predominantly around implementing and assessing safety management systems. How can they get in touch with you?

Yeah, probably the easiest way is my email at [email protected]. Or check out our website at And there are connections there you can find me. You can find me on LinkedIn as well. Jim Francis, just look me up, and happy to connect and talk to anybody more about this.

Excellent. Well, thank you so much, Jim, for coming and sharing some of your background, your experience around safety management systems, and the value and really to get a better sense as to why and how you should implement one.

Yeah, thanks, Eric. Appreciate the time. Great talking to you.




Jim Francis is the Vice President of SMS Consulting at ENTRUST Solutions Group. In his role, Jim supports ENTRUST’s clients and the implementation of their safety management systems and other pipeline safety programs. Prior to joining ENTRUST, Jim spent 30 years serving utility customers in various engineering and operations roles at Vectren and CenterPoint Energy.



