LISTEN TO THE EPISODE:
ABOUT THE EPISODE
Embark on an exciting journey with the renowned Safety Mythologist and Historian Carsten Busch, also known as the “Indiana Jones of Safety.” Join us as Carsten shares captivating stories with lessons from the history of safety that shape tomorrow’s strategies. Tune in now and be part of the adventure!
READ THIS EPISODE
Real leaders leave a legacy. They capture the hearts and minds of their teams. Their origin story puts the safety and well-being of their people first. Great companies ubiquitously have safe yet productive operations. For those companies, safety is an investment, not a cost for the C-suite. It’s a real topic of daily focus. This is The Safety Guru with your host, Eric Michrowski, a globally recognized ops and safety guru, public speaker, and author. Are you ready to leave a safety legacy? Your legacy success story begins now.
Hi, and welcome to The Safety Guru. Today, I’m very excited to have with me Carsten Busch. He’s a safety professional with a deep understanding of the history of safety, which is a big topic we’re going to talk about today. He’s incredibly interested in the development of knowledge when it comes to safety, and he’s coming to us from Norway. He’s also been nicknamed the Indiana Jones of Safety. I have to ask you before we get started: how did you get that nickname?
Thanks, Eric, for inviting me. Well, fun question. It originated from a Dutch safety professional who sadly died a couple of years ago, Roland Bakker. He used to work for Shell. While I was studying at Lund University and diving really deep into the early safety writings from the early 20th century, I discovered many very interesting quotes. Either they were funny from today’s point of view, or they were quite amazing, such that you would say, wow, this could have been written in 2020; they were quite ahead of their time already back then. I shared those clips, and at some point, Roland said, well, you’re truly doing archeology here. You are the Indiana Jones of Safety. Well, I love most of the movies, so I thought, that’s a cool tagline to have on LinkedIn. I kept it.
Definitely it is. Tell me a little bit about the early history of safety because now we speak about safety on a regular basis, but there was a point in time when people didn’t talk about safety. So, tell me a little bit about some of the origins of how safety came to be part of running an effective workplace.
Yeah. It’s quite fascinating to dive into the subject. Safety has always been something people cared about, of course. Even in the Old Testament, there are some safety rules, about building a railing on your roof so people wouldn’t fall down, or you would get blood guilt on your head, and so on. And the Babylonians had some safety rules. But for safety as a profession, we go back to the late 19th century. The Industrial Revolution was really going on, with a lot of changes in society, new risks, and so on. And people started getting interested in what we now do because they saw bad working conditions. They saw people getting hurt, and people died. What I would like to focus on a bit, because this is a very broad subject, of course, is how, especially in America, insurance played a big role. From today’s point of view, a lot of people say, oh, all the early safety work was done by people from insurance, and they did it with a monetary goal, to make money for their companies. That is, of course, true, because insurance companies exist to make money, and they were there to make money for, well, shareholders.
But the early motives of safety, I would say, started first with a social outcry. People were really shocked to see how the situation had changed for workers, the circumstances and the high fatality numbers, running in the thousands for railways, for example, which is totally unbelievable from today’s point of view. Or mining, which was, and still is, a very dangerous occupation, with mine explosions and hundreds of fatalities. People reacted to that. So, you had this social part, but the social part was not enough. Then came some regulation, but very slowly, especially in America; Europe was a bit more regulation-driven. And then you had, of course, the humanitarian aspect: some employers realized they had to do something because they had a duty of care for their people. But the real game-changer was financial. When workers’ compensation laws entered the game, they changed the whole scenery, because before workers’ compensation, the cost of accidents fell on the employee. If you had an accident, you couldn’t work, you were home and, well, you didn’t eat, basically, and your employer would get someone else, and maybe you could come back when you were well again.
You could sue your employer, but the chances that you won were very low. And especially if there was the slightest hint of some responsibility on your part, like you had done something you probably shouldn’t have done, which there almost always was, and then you had no chance at all to win. And even if you won, probably all the money would go to your lawyer anyway. The cost for accidents was basically for the employee and investing in safety in the early days of safety, I didn’t pay. Then came workers’ compensation, and that changed the whole game because, all of a sudden, the cost of accidents was for the employer. Because the employer had to pay you when you got injured, and you couldn’t work, you still were paid, medical costs were paid, and so on. Employers started to buy insurance against this stuff. That comes the important role of insurance in early safety because it was especially insurance firms with a big safety staff, which they would lend out to employers company owners to do inspections and make recommendations. And then, if you followed the recommendations, you might get a lower premium. So, safety suddenly paid back.
And that was what I think really set early safety in motion. It’s interesting then to see, and I study a lot of the work of Heinrich, Herbert William Heinrich, one of the safety pioneers, that his breakthrough theme was financial too. What made his name was the idea of hidden accident costs, which we’ve probably all heard of, and it wasn’t his discovery at all. What he did was study it in a more systematic way and add some numbers to it, and then it got a lot of credibility, and people said, wow! For every accident, there is a direct cost, like the medical care, and you have to pay the injured employee, and you have to get a replacement, and so on. But there are also a lot of hidden costs that you don’t see, like production stops, disturbance, people talking about it, having to investigate, and all that. Heinrich found, for the data set he had at the time, approximately a 1:4 ratio of direct to hidden costs. People said, well, then safety pays even more if you invest. That was a big driver for early safety. And not just that; I think there was also lucky timing. There had been World War I, when efficiency was highly valued because you wanted to produce to win that war.
And there was, of course, the Great Depression, which also made it very lucrative to be safe because that helped you to be competitive. I read a quite interesting article that argued the Great Depression was actually very beneficial for safety, because many workplaces that had bad facilities and badly maintained machinery and so on went bankrupt. They just went out of service and were never used again, and people started with, well, better stuff after the Great Depression. The article backed it up with numbers, so I thought that was quite unexpected, actually, because from our experience, we often see that when things go bad, we have to do things cheaply. Where can we save cost? Do a bit less maintenance, do a bit less training, and that’s not good for safety.
No, it’s not. From there, one of the things that really struck me when we spoke earlier is the evolution of systems thinking and where it came about, because it’s often associated with a modern view of safety. But what you were describing is that, in many cases, there were early elements of systems thinking very early on.
Yeah. As I said, I shared a lot of those early insights, and I wouldn’t say that it was really systems thinking yet, but you see early seeds of stuff that people probably weren’t quite ready for, so to speak. You see insights which you could just copy over to the 2000s and say, this is what we are dealing with now, or that’s what people with a newer view, like Hollnagel or Dekker, are saying, and yet you find similar stuff already in the 1920s.
So, the new view of safety is not new.
The new view of safety is a different new view, I think. I would say Heinrich and his contemporaries were a new view at the time, one that revolutionized quite a bit. You had the first wave of safety pioneers, and I won’t bother you with the names because nobody knows them anyway, but they were very focused on machinery and guarding and that stuff. Very basic safety work. There were a lot of low-hanging fruits at the time. And you see that the first safety books, from around 1900 until, say, the First World War, were very much about how to create safer workplaces through illumination, guarding, ventilation, and a bit of organization too, but very little of that. Then, in the mid-20s, there were some safety thinkers like Louis DeBlois, the first Vice President of Safety at DuPont, the big chemical firm, who wrote a ground-breaking book, and Heinrich drew a lot on that work. Heinrich specifically produced a more management-oriented framework: not only how to guard machinery, but also how to build an organization, how to learn from accidents, how to better investigate accidents, how to approach safety in a more scientific, fact-based way, looking at what is actually happening and where we should focus, rather than just taking a blanket approach.
That was really a new view at the time. Now, looking back 80 years later, we say, well, that’s just traditional safety. We’ve been doing that for decades. Now we need something new, and we shouldn’t just focus on what is going wrong, but also on what is going right, especially in normal work. And then you look back in the old books, and you already find nuggets of that, which is quite fascinating. I have a very cool quote here.
I would like to read it to you. It’s by a guy with an interesting name, Albert W. Whitney, and he was quite a hotshot in insurance, as most safety pioneers of the time were. I think this is from something he wrote in 1921, which he called his philosophy of safety. And I quote: Now, life is intrinsically dangerous. Life is partly routine, to be sure, but more fundamentally, it’s an experience of the unknown and, hence, based on adventure. It’s quite fascinating. He stresses the unknown, the uncertainty, which is only in recent years getting a role in risk through ISO 31000, but he stresses the uncertainty already here. And then I find it fascinating that he says, well, life is based on adventure, and that’s cool, that’s risk-seeking. He goes on to say that the prime quality in safety, therefore, is not the removal of danger but the improvement of the quality of the adventure. Wow, this was 1921. This guy is saying safety isn’t about prevention; safety is about having better adventures. I think that is absolutely in sync with resilience engineering thinking, which says, well, we need to prevent, of course, and reduce risk and control hazards and all that, but especially, we need to be better at handling variability, which very well resonates in my head with having better adventures, to succeed better.
I think that’s such a lovely quote. This guy was very much into getting safety into education. If I understand well, he was very important in getting safety a role in American schools and, afterward, in the creation of safety courses at universities too. But that systems view, it might have been discussed, but in terms of it starting to be operationalized, that is definitely a newer development.
Yeah, it has taken many decades, I think, to mature to where this is broadly recognized, because you do find those nuggets early on. Safety culture is the same. DeBlois, the guy I discussed earlier, describes in his book what we would now call safety culture, but he uses a different term, safety atmosphere, and he has a definition that is, I think, very usable, because he speaks of some invisible force that affects even people who are entirely new in the company, which is what culture does, or is. We won’t get into that discussion now, but it would take until the late ’70s before culture as such entered the safety discourse.
This episode of The Safety Guru Podcast is brought to you by Propulo Consulting, the leading safety and safety culture advisory firm. Whether you are looking to assess your safety culture, develop strategies to level up your safety performance, introduce human performance capabilities, re-energize your BBS program, enhance supervisory safety capabilities, or introduce unique safety leadership training and talent solutions, Propulo has you covered. Visit us at propulo.com.
You touched on Heinrich earlier. Let’s revisit him, because he’s often quoted with the pyramid, which is now disputed. But the pyramid, as you described it to me, is not quite what it was intended to be. Tell me a little bit more about Heinrich and the pyramid and some of those elements.
Firstly, I never speak of Heinrich and the pyramid, because Heinrich never drew a 3D shape, which is what a pyramid is. The pyramid is three-dimensional. Heinrich made a triangle, and it was only Bird who later made the fancier picture. But there are a lot of misunderstandings. It’s at least one of Heinrich’s most famous concepts, but it has started a couple of lives of its own. People treat it, for example, as a law. Some people still believe, even today, that there is some natural law dictating that for every accident with major consequences, there should be 29 accidents with minor consequences and 300 accidents with no consequences at all. I’ve actually been in a meeting, many years back now, where someone and I were discussing reporting and underreporting. Someone stood up and said, well, I’m sure we are having a problem with reporting in our company because we don’t live up to this ratio. I thought, oh, dear. Oh, dear. Oh, dear. Because if you study Heinrich’s work, you will see, and he’s quite clear about it, that he says this is an average. And if you look at how he got to this average, you will see that he studied different kinds of accidents, and for each accident type, he found and estimated a ratio.
So, in his book, he has, for example, a case where somebody is cutting wood on a circular saw, and he pushes the wood through the saw, and at some point, he does it without a push stick, and he cuts his fingers, and then they find this ratio, one to something; I don’t know it by heart. Then he has another example where he describes someone crossing rail tracks on a daily basis because it’s a shorter way to work, and there he finds a ratio of one severe accident to several thousand incidents without any injury. He has a couple of other scenarios that he describes, and you can see that for each scenario, there’s a different ratio. Then he averages them up, and he finds, or constructs, a very neat number, 1:29:300, which is a ratio that you won’t forget. That’s a stroke of genius. He repeats this number, and he anchors in the message that there are a few serious accidents, there are more not-quite-as-serious accidents, and there are a lot of near misses, in modern language. He doesn’t use the word “near miss” yet.
He speaks of no-injury accidents. Then, after having anchored in this message, he says: here lies great opportunity. If you recognize that you can prevent worse stuff from happening, then you can act proactively. That’s his great gift to safety, I think: the realization that we don’t have to operate reactively. We don’t have to wait for someone to be injured. Of course, if somebody gets injured, there is a greater sense of urgency to do something to prevent it from happening again and make improvements and so on. But his message was that you don’t have to wait until somebody gets hurt. You can actually be observant, see, oops, this could have been much worse, and react to it. His message was actually one of opportunity, not one of counting, which is what a lot of people make of it. A lot of people say it’s a metric. No, it’s not. It’s a metaphor for opportunity and proactivity. And that’s one of the things that a lot of people commonly get wrong. They start counting, and then they mash all kinds of accidents together. A lot of the literature on the triangle, or the pyramid, is based on data from within a sector or within a country.
Or, if you are lucky, from within some process, but they don’t stick to one specific scenario. Because if you want to play with the numbers, you have to stay within one scenario that is very similar, which Heinrich also said: this only works within what he calls a unit group of similar accidents.
If you start blending things together, and I think Todd Conklin says this a lot, ankle sprains don’t say anything about well blowouts. Of course not, because they’re two very different types of accidents. One is not predictive of the other. But slippage may very well be predictive of broken legs and ankle sprains, and bad maintenance or mechanical failures in your blowout preventer are likely predictive of a well blowout. But don’t mix them together. There are a lot of misunderstandings where people focus too much on the numbers and the correctness or reliability of the ratios, which are totally irrelevant because they’re just an illustration. People think it’s predictive: if you have had 299 near misses, then probably the big one is next, which is quite foolish, actually. And Heinrich himself said, well, it may also be the first one where you get hit. Sure. There is a factor of randomness there. One of the main mistakes is that people don’t stick to the scenario. The predictive element, if there is one, exists only within one scenario.
I think the other piece I’ve seen is that a lot of organizations start relying on that pyramid, or the triangle, as you mentioned, and start thinking that if they focus on very small injuries, they’re going to reduce serious injuries and fatalities. They’ll focus the same amount of attention on a first aid case, a bee sting, or a slip, trip, or fall, whereas the elements that drive a serious injury or fatality are probably quite different. That’s where, more recently, there’s been a shift in thinking and a realization that it’s a subset of those events that can lead to serious injuries.
Definitely. There isn’t one pyramid, unless you want to calculate the average, which is fun to do, maybe, for safety nerds, but it’s of no practical value. You have to see the pyramid, or your pyramid, as a huge stack of different pyramids. You have a slip, trip, and fall pyramid. You have a well blowout pyramid. You have a paper-cut-in-the-copy-room pyramid, which probably has a crazy ratio of one to a trillion or something. And then you have pyramids which aren’t pyramids at all. I’ve worked for 20 years in railways, and I used earlier this example that Heinrich mentions of somebody getting hit by a train. There probably isn’t a pyramid shape there. It’s probably some hourglass shape, where you have fatalities at the top and then almost no minor injuries, because if you get hit by a train, typically either it’s a near miss, of which there are a lot, or you are probably quite damaged. There’s not a lot of first aid in those cases. It’s big at the top, big at the bottom, and nothing in the middle. That’s the hourglass.
Sure. I’d love to pivot to James Reason and the Swiss cheese model and love to hear some of your perspective from a historical standpoint.
James Reason is another of the safety greats who has a really brilliant metaphor that anchors in: you see it, and I think most people can intuitively connect to it and make sense of it, and then give it their own interpretation. And that’s what we have seen, for example, in, I don’t know if you’ve seen them, Eric, but I got quite fed up during COVID times with all the pictures shared of these COVID-protection Swiss cheeses, where you had 17 layers stacked up and people telling some story around them. It’s quite interesting to reflect a bit on how this happened, because these COVID-19 Swiss cheeses, I think, stray quite far from Reason’s idea. I think there are three main categories of why people get models wrong. The first is that they just don’t know better. And for whatever reason, and we could talk an hour about this, I guess, hearsay is one big factor. You’ve been on a course, and somebody told you their interpretation of the Swiss cheese, and you pick up some parts. And then we basically get to the second reason.
You start making your own interpretations. That’s quite a powerful one, and it’s for better and for worse; let’s just be clear about that. You see this picture: a couple of barriers with holes in them, and if something goes through the holes, then things have gone very wrong, and you have an accident. That makes immediate sense, I think, to a lot of people, and especially the many layers of barriers make sense to many people. And then they start ignoring the rest. Probably they don’t even know that James Reason’s message was much more complex than just the picture, because the picture comes with a lot of text and explanation, about pathogens and complex systems and organizational factors and human factors and you name it. People just see the picture. They think, well, how can I use this in my situation? And then they just take the picture, give it their own twist, and that’s all there is for them. And then there is a third group, which is the people who have motives of their own for interpreting a model their own way. The Swiss cheese model has gotten a lot of bad rep in recent years, especially from, let’s call them, New View safety thinkers, who call it a linear model, which I would say is not correct at all.
The picture looks linear because you have all these slices stacked one after another. But the picture isn’t all there is. If you read the text and go back as far as the first presentation in 1990 in Human Error, the book by James Reason, and you read the text that comes with it, he says quite interesting stuff, like: the holes in the barriers are not static. They’re moving around. They open up and they close, they change shape, and so forth. The model is not linear at all. The model is quite dynamic, even though the picture looks very linear. Right. I think some people also give a twist to models to make their own message look better. You come up with a different model, and I won’t name any, which perhaps takes better care of the systemic factors, and there are models that do that quite well. And if you contrast it with Heinrich’s dominoes or the Swiss cheese, which looks very linear, then your model probably looks better. And there may even be, let me say, a pedagogical angle to it, where you stress the linear aspects of the Swiss cheese to communicate better about the systemic approaches.
Sure. Yeah. Those are just some quick reflections on the Swiss cheese and how people give their own twist to it. One thing I would like to stress, and something we perhaps need to work on: I started by saying a lot of people don’t know better. They’ve been on a course, or they’ve seen a presentation where somebody had 15 seconds to say something about the picture on the slide behind them. You get this quick explanation of the Swiss cheese, and you think, well, this is a quite knowledgeable guy standing there, he explained this, so this is all there is. And we’re not typically trained to ask critical questions and then go back and read the literature and study it. And a lot of us probably don’t have the time either. But you may end up with a model that doesn’t quite do what you actually need it to do.
So, thank you, Carsten. You’ve shared a lot of background history, from the early days of safety, to some early thinking around the broader systems view of safety, to revisiting Heinrich and the Swiss cheese model. I think there’s an important lesson in what you’re saying about models. At the end of the day, a model is there to share an idea and a concept. I think that’s the important element of a model, as opposed to thinking it’s pure and true and depicts everything. Is that a fair assessment?
Yeah, that’s a good summary. Models are always a simplification of something, of course, and we have to understand the limits of that simplification and the limits of the model. We need to ask a bit more often, firstly: is this what the model was designed for? Then we can, of course, use it for other things, which may actually be beneficial, because innovation builds on that. But it’s wise to check: is this actually what the model was meant to say? Because the Swiss cheese is not about 17 layers. Seventeen layers may actually be less safe than ten layers, because new layers introduce new complexity and side effects, and so on. Go back to the source and ask a critical question at least once in a while. I think that’s important.
Thank you very much, Carsten, for joining us. If somebody wants to read more, hear more, how can they get in touch with you?
I have a website, www.mindtherisk.com, and there’s probably a contact form somewhere there, and you can find me on LinkedIn. I’m relatively active there, so just reach out and connect.
Excellent. As the Indiana Jones of Safety. Thank you.
Thank you for listening to The Safety Guru on C-Suite Radio. Leave a legacy. Distinguish yourself from the past. Grow your success. Capture the hearts and minds of your teams. Elevate your safety. Like every successful athlete, top leaders continuously invest in their safety leadership with an expert coach to boost safety performance. Begin your journey at execsafetycoach.com. Come back in two weeks for the next episode with your host, Eric Michrowski. This podcast is powered by Propulo Consulting.
The Safety Guru with Eric Michrowski
More Episodes: https://thesafetyculture.guru/
C-Suite Radio: https://c-suitenetwork.com/radio/shows/the-safety-guru/
Powered By Propulo Consulting: https://propulo.com/
Eric Michrowski: https://ericmichrowski.com
ABOUT THE GUEST
Carsten Busch has studied Mechanical Engineering, Safety, and Human Factors. He has over 30 years of experience in Safety and Quality Management at various levels in organizations ranging from railway to oil & gas to police in The Netherlands, the United Kingdom, and Norway. He is professionally active on various forums, a regular speaker at conferences, owner of mindtherisk.com, a tutor at the Lund University Human Factors and System Safety program, and author of several professional books: Safety Myth 101, Veiligheidsfabels 1–2–3, If You Can’t Measure It… Maybe You Shouldn’t, Preventing Industrial Accidents, The First Rule of Safety Culture, Risicoflectie, and recently an annotated republishing of safety pioneer Heinrich’s papers from 1923-1945. His main research interests include the history of knowledge development and discourse in safety, which has led to Ph.D. work through Open Universiteit. He is an active member of the Dutch Society of Safety Science (NVVK) and a member of the editorial board of the society’s quarterly magazine NVVK Info. He is a reviewer for Safety Science and the Journal of Contingencies and Crisis Management.
For more information: https://www.mindtherisk.com/
EXECUTIVE SAFETY COACHING
Like every successful athlete, top leaders continuously invest in their Safety Leadership with an expert coach to boost safety performance.
Safety Leadership coaching has been limited, expensive, and exclusive for too long.