The Impact of Leaders and Their Decisions in Improving Safety Culture with Dr. David Hofmann
LISTEN TO THE EPISODE:
ABOUT THE EPISODE
“Culture comes from the top and is enacted from the bottom.” Dr. David Hofmann has been researching safety climate and leadership for over 20 years and joins the podcast this week to discuss the multi-level aspects of improving safety culture and the daily micro-decisions leaders make that in turn affect safety performance. Tune in to learn strategies for leaders to personalize safety in a tangible way, foster trust, and reduce psychological distance from the frontline.
READ THIS EPISODE
Real leaders leave a legacy. They capture the hearts and minds of their teams. Their origin story puts the safety and well-being of their people first. Great companies ubiquitously have safe yet productive operations. For those companies, safety is an investment, not a cost, for the C-suite. It’s a real topic of daily focus. This is The Safety Guru with your host, Eric Michrowski, a globally recognized ops and safety guru, public speaker, and author. Are you ready to leave a safety legacy? Your legacy’s success story begins now.
Hi, and welcome to The Safety Guru. Today I’m very excited to have with me Dr. David Hofmann, who’s a professor of organizational behavior at the University of North Carolina at Chapel Hill. He’s a researcher who’s applied extensive research to safety and safety culture. Dave, really excited to have you with me on the podcast today.
Thanks, Eric. Glad to be here.
Let’s first start. We’ve touched on the topic of safety culture in the past on the podcast. Would love to hear some of your perspective around what you call multi-level aspects of culture.
Let me just give your listeners a little bit of background on me. I am an organizational behavior faculty member with a PhD in organizational psychology, and I’ve been studying safety climate and leadership for about 20 years now, 20 years plus. Where we think about this multi-level aspect of safety culture is that culture comes from the top and is enacted from the bottom. The way I think about culture is you have the espoused culture of the core values and the key assumptions, and then the org structure and the artifacts, as well as the metrics and all those things that are coming from the top. And then at the bottom and the middle of the organization, this culture gets enacted day in and day out. I’ve written a little bit with a friend and colleague by the name of Dov Zohar about these micro decisions that frontline and middle managers face every day. And often those micro decisions involve competing priorities, and those managers have some degree of discretion in terms of how they prioritize one of those priorities over the other. And over time, as an employee, I watch these micro decisions getting made every day.
And if cost is always just a nudge higher than safety, or schedule is always a nudge higher than safety, so it always wins out in the end, then what happens as I watch these decisions is that I get a gestalt impression about what’s really valued, expected, rewarded, and supported in the organization. And we call that the enacted culture. And so then you can start thinking about the enacted culture coming from below, and then it intersects with the espoused culture coming from the top. And that’s where, in the middle, you see the gaps between the espoused and the enacted culture. I think this is something you see very regularly, and sometimes I think it almost feels like it’s happening unintentionally. I was talking to a group not long ago, and they kept citing recognition examples: they had recognized people that worked the weekend, people that worked extra hours, which again reinforces productivity. And when it came to recognition around safety, it was, thank you for doing that job safely. But really, are you recognizing safety, or are you just saying you came back and you weren’t injured, but you have no idea what happened and how the work occurred?
Yeah. It’s the absence of an outcome that gets recognized, as opposed to the presence of the proactive behavior that really drove that outcome to occur in a safe manner. I see that quite a bit: there’s this notion that the absence of something means we must have done something well. And it’s like, well, maybe. Or maybe the absence of something means you just got lucky. Correct. I don’t think people make that distinction very often. But in this instance, you’re hearing constantly this message around getting the job done, working harder, productivity. It’s not somebody saying, get this job done ahead of safety, but it still sends that message, if I’m hearing you correctly.
Yeah. Well, at the end of the day, if you want a perfectly safe organization, you should do absolutely nothing. There is going to be some risk in many of the industries that you’ve worked in and the industries that I am familiar with and where I do my research, risk that you really must manage. But this notion of thinking about safety as a bit of a dynamic non-event is something that I’ve spent some time thinking about and talking about as well. Probably the most recent example: I was asked to do a presentation to the California Public Utilities Commission at a public hearing, and they called me and asked me to just kick off the day with a talk on safety culture. One of the things in this model that I’ve been working on and doing some research on with some of my colleagues is that if you think about safety, cybersecurity, and several other types of risk domains, they’re what we would term a dynamic non-event, which is you work hard, so there’s a lot of dynamic behavior going on.
But at the end of the day, if you’re successful, then nothing happens. If cybersecurity is successful, then you did not have a breach. If safety is successful, then you didn’t have an injury. And I know my safety professionals listening would say, well, that’s not right, there are a lot of things happening. And I hear you; I can hear the listeners saying that, and I agree. But if you think from a non-safety-professional perspective, they think about these as dynamic non-events. And so one of the things I highlighted in this presentation to the California Public Utilities Commission is that it’s the middle managers that really have to prioritize budgets and funding and all of that. And this was the example I used: if I put a dollar over here in this investment, then I know I’m going to get, depending on what my internal rate of return is, a dollar ten back. And if I spend a dollar on safety or cybersecurity, or in this case, tree trimming or repairing lines, then nothing happens. Well, I’m left, as that manager, with the idea of, well, what if I had spent 95 cents on safety?
Would nothing have happened? Then I could put a dollar five over here and make a little bit more of my return. And that’s where my metrics are. If there’s a bonus structure, that’s where the bonus structure often is. And those performance metrics are measured every single month, week, quarter, while the safety metrics are a little bit longer term. So it’s really easy for me to just turn this little dial and say, well, let me invest 95 cents over here in safety, put a dollar five over there. Nothing happens. It’s like, well, maybe I can do 92 cents this year. Nothing will happen. And what happens then is those managers think that they’re actually learning, because they’re updating their model. They’re like, oh, what I learned is that you only have to spend 92 cents on safety or tree trimming or cybersecurity and nothing will happen. And I think that’s really a false notion of learning.
Is there something there as well in terms of… You mentioned, when we talked before, the psychological distance between the decision and the outcome. Can you expand a little bit on that front?
Yeah. So, this is some research that came from when I served on the National Academy of Sciences committee that was charged with investigating the BP Deepwater Horizon accident. Our charge was to go up until the moment the accident happened, none of the recovery efforts. One of the things that we did is we went to an oil and gas company’s onshore command center for offshore drilling. I’ll say that again for those folks that are driving in the car or something: an onshore command center for offshore drilling. This is in Houston, normally, in a quiet office park. You have seven or eight computer screens in front of you, and the person is there just monitoring offshore drilling operations that are happening 500 miles offshore. I was just struck by that environment. The other thing that we did as part of that committee is we flew out to an offshore oil rig. So you could get a little bit of a contrast of what it looks like to be on the oil rig thinking about safety issues versus 400 miles away. And that started me thinking about this notion: in social psychology, there’s a whole body of research on construal level theory.
And construal level theory basically asks how psychologically distant things are from you. To put some flesh on the bones of what that concept means before your listeners fall asleep: if I’m in Houston watching drilling operations 400 miles away, that’s a very psychologically distant thing, whereas if I’m on the rig with the drill pipe and everything, it’s very psychologically close, very concrete. The research shows that things that are close up, we construe in very concrete terms; it’s very much the how we do things versus the why we do things. And things that are way off in the distance, like ethics and values, we construe at a much more abstract, conceptual level.
For example, core values, religious beliefs, and ethics are often construed at a very high, abstract level, as abstract principles. They’re construed abstractly because we want them to transcend time, so that we can apply them in different situations. Anyway, I was thinking about that notion of concreteness versus abstractness. Fast forward any number of years, and we finally have a research paper with about five or six studies where we show that if you construe a work context as psychologically distant, you view safety as less of an ethical, moral obligation.
And that’s in part driven by the reduced perception of harm. Sure. So, to put it in real practitioner terms: if I’m watching drilling operations happening 400 or 500 miles away, the realness of those people and their potential for harm just dissipates into the background. And then you add to that that I communicate with those folks through chats and other coarse communication modes, often not even video feeds. Again, these become faceless people, and I’m less likely to view safety as a moral obligation.
So, what are some strategies organizations can use to address that element? Because I could see that happening beyond the onshore command center you talked about. I imagine it can also happen at the C-suite level: the more removed you are from the front line, the further removed you can feel. What are some strategies organizations can use to try to mitigate that?
This episode of The Safety Guru podcast is brought to you by Propulo Consulting, the leading safety and safety culture advisory firm. Whether you are looking to assess your safety culture, develop strategies to level up your safety performance, introduce human performance capabilities, reenergize your BBS program, enhance supervisory safety capabilities, or introduce unique safety leadership training and talent solutions, Propulo has you covered. Visit us at propulo.com.
Yeah, I think you’re exactly right that there are any number of different ways in which psychological distance can be operationalized. In the studies, just to give you a sense of this, one of the ways we operationalized it was we had offshore drillers view aerial photos of their drilling rigs versus photos of the drill floor. There were no people in any of the photos, because we didn’t want to confound this with the notion of potential personal harm. We just randomly assigned people to the aerial photos, like a helicopter view, versus a drill-floor view, and we asked them to what extent these 25 safety behaviors are a moral, ethical responsibility. We found significant differences. But we also operationalized it by having a group of nurses tell us what city they worked in, and then we coded that into the subsequent survey. Either they were a manager in a hospital in the city in which they said they worked, or we randomly picked Miami as a place that’s far away from most everything, except I guess Fort Lauderdale. But we can control for that in the study. And then we said, imagine you’re an administrator in a hospital in Miami. Now think about the amount of potential harm that could happen and the extent to which these safety behaviors are moral and ethical obligations. And we find that if you’re thinking about being an administrator in a hospital that’s a thousand miles away, you think about it in a different way. Research has also shown that as you go up different management levels in an organization, you think about things in a more abstract way. So you’re right in the sense that leaders often construe things in a little bit more abstract way. There are a couple of implications of that. The first is that they see tighter connections between things. So there is this notion of, well, if you do things safely, then you’re also going to be highly productive. And I think that these two things can go together.
Safety and quality, for example, go together. Can go together. And I would say, over time, I think there is some truth to that. The leaders see these concepts as very abstract, which means they can see tighter relationships between them. But then the frontline managers are faced with a real, concrete decision this afternoon: we can either pause for three hours to try to get this part, or we can do a makeshift thing and be back up in 15 minutes. And they see those things competing. So what do managers need to do in terms of practical implications? I think first, they need to continually remind themselves of what the work really looks like on the front lines. And I don’t mean remind themselves as in remember when they did it 15 or 20 years ago. They need to get some exposure to how it is done now, in a much more dynamic, competitive, cost-pressured environment than maybe they faced 10 or 15 years ago when they did it. And part of it is a day-in, day-out, symbolic reminder of the harm that can potentially occur.
Really thinking about your employees and getting to know them. You can’t get to know everybody in your organization if you’re running a big organization. But boy, to the extent that you can, really get to know some people on the front line, the supervisors and the people who really are facing this harm, so they become real people and you know something about their families and their children. So that you think, if something bad happens, that’s Jim or Sue, not just some random person that I passed on the plant site at some point.
That element of personalization, I think, is important. I remember in the customer experience space, people would often say, put actual pictures of your customers in a call center, as an example, so you remember who you’re there for. I’ve seen some organizations in the safety space do similar things, where they put up actual pictures of team members doing the work and encourage more regular visits to frontline work, to listen, to understand how their work impacts what a perfect day looks like for them, so there’s more proximity.
There’s some research where they looked at healthcare professionals reading radiology images or something similar, a distant person just sitting in an office reading X-rays or looking at blood samples, a blood test. What they did is they randomly attached a photo of the patient to the file. And when they attached the photo of the patient, the read was more accurate, because it became a real person. So I think all of those things are really good. We actually opened and closed this research paper with the Canadian Iron Ring, which you may know something about. The Ceremony of the Iron Ring for Engineers is that when you become a licensed engineer in Canada, you go through this iron ring ceremony, and you wear a little ring on, I guess, your right-hand little finger, I think, to remind you of the ethical, moral responsibility of a professional engineer. There’s an apocryphal story about those rings being made from the bridge that collapsed in Montreal. It turns out that’s not really true; maybe once upon a time it was true, and they ran out of metal. But there’s that notion of a constant reminder that the decisions I make at the drafting table have downstream consequences. I think anything that you can do to make sure that that abstract notion is always salient, particularly around the potential for harm, would be beneficial.
If I can touch on the example that you shared before, in terms of the middle manager making a trade-off: I take 5 cents, maybe I take 7 cents, an extra 2 cents. What are some strategies to mitigate that? Because it sounds like the story in the news around the recent derailment: it sounds like it was, we took, we took, we took, until eventually the budget ran out and something went too far. Obviously, we don’t know yet the full conclusions, but the early signs seem to be that their budgets kept being cut until it was too much.
I think that’s a common story, actually, unfortunately. And I think it’s common in part because dynamic non-events are this abstract phenomenon that are hard to imagine, and it’s therefore easy to discount the likelihood that something bad is going to happen, versus a very concrete metric that you’re held accountable for every quarter, or even shorter, for delivering the product. So a couple of things come to mind. I wish I had this sorted out; this is my next 10 years of research. But the first thing that I would recommend is to understand the difference between what I would call real learning and superstitious learning. Real learning involves the reduction of uncertainty: you were missing information, there was some degree of uncertainty, and that uncertainty has been removed in some substantive way. That definition of real learning, I think people hear and say, well, yeah, that makes sense. But what is superstitious learning? Superstitious learning goes all the way back to Pavlov in psychology, and it’s defined as an incorrect pairing of a stimulus and a response. Okay? So when I take those five cents away from the dollar and nothing happens, I conclude that I have learned that I can spend 95 cents and nothing happens. And it’s like, no, you have not reduced any uncertainty in that equation. And it’s very difficult to do that, because those kinds of decisions, like cutting the training budget for some safety protocol, or cutting 10% of your tree trimmers at a utility company, are not going to manifest into demonstrable risk for sometimes many years. And by that point, all the middle managers are off in different jobs, and nobody remembers. So you can’t really connect the two. So first, I would want to ask: to what extent are you really learning? And do you understand what learning really means, as opposed to just getting lucky?
And so, a lot of times when you’re cutting these budgets, there’s a really long feedback cycle and it’s fuzzy. Nothing has happened, even though the risk is continuing to accrue, but you’re concluding that you’re learning that you don’t need to spend as much on safety and nothing will happen. So the first thing, I think, is for people to really grapple with this notion of: if I cut this budget, am I really learning anything, given the slow feedback cycles, the stochastic nature of the outcomes, the fuzziness of the criteria, etc.?
So, there are a lot of ways in which you’re not really learning. I think the second thing goes to this notion of a really strong safety culture throughout the organization. Another piece of research, done by my friend Dov Zohar with one of his colleagues, showed that if you have a strongly agreed-upon, strong safety culture at the top of the organization, it actually reduces the amount of discretion that middle managers enact with respect to safety. Safety becomes non-negotiable. And that loops us back to the beginning of our conversation around safety culture: you have a really strongly held view at the top of the organization that safety is an extremely important criterion. And that view is strongly held, symbolically reinforced, talked about and communicated about at the top of the organization, and invested in, so people see not only the words, but the actions behind the words, the money behind the words. If my job is designed to be safe and my manager is talking about safety, then all of a sudden it reduces the amount of perceived discretion that I have. And so I’m going to be less likely to take that 5 cents and move it from the dynamic non-event into the other criteria.
The two things that come to mind are really getting clear on what organizational learning means, and then forcing people to justify it: if you’re going to cut the budget this year and you don’t think it’s going to be risky, how do you know? You’ve got to give me the criteria, the data that you’re using, the assumptions you’re making. And then secondly, I think, just creating that really strong safety culture throughout the organization to reduce the amount of discretion that those frontline and middle managers perceive they have with respect to safety.
I think it’s an important point, because I’ve seen very mature organizations where, even at the top management team, when somebody says, I could save X amount of money in my budget, finance will say, well, what would be the impact on safety, to help them think it through. Because sometimes the impact is not just cutting the safety budget, not just cutting the training budget, not just taking the PPE out. Sometimes I hear examples now of, in 2008 we didn’t recruit for two years, and as a result we lost some learnings as people retired, because we didn’t create the next generation. We’re now 14 years later, and people are starting to realize the effect of a hiring freeze that happened in 2008. So it’s really about trying to think through what could go wrong from these pieces that are not necessarily a safety budget; this was just a recruiting budget, or promotion budgets, in an organization.
Yeah, that is a great series of questions that would go a long way to fleshing these out. And then, trying in reverse to connect some of those dots so you do learn from them, to say, oh, we did cut that training budget, and now, five or six years later, when we’ve got to expand operations, we don’t have people trained up to do it. This needs to be a lesson learned. We need to do an after-action review. We need to file some learnings with the senior managers so that we can act on it and continue to move forward positively.
Absolutely. So, Dave, thank you so much for sharing those examples. I think they’re very powerful examples of safety culture, the role of leaders, and how you really instill those right decisions, both in terms of the concept of the proximity you talked about in terms of the onshore or offshore locations, but also in terms of the role of leaders and the decisions that they’re making day in and day out.
Well, thank you for having me. I’m so happy to be here.
Thank you for listening to The Safety Guru on C-Suite Radio. Leave a legacy, distinguish yourself from the pack, grow your success. Capture the hearts and minds of your teams. Elevate your safety. Like every successful athlete, top leaders continuously invest in their safety leadership with an expert coach to boost safety performance. Begin your journey at execsafetycoach.com. Come back in two weeks for the next episode with your host, Eric Michrowski. This podcast is powered by Propulo Consulting.
The Safety Guru with Eric Michrowski
More Episodes: https://thesafetyculture.guru/
C-Suite Radio: https://c-suitenetwork.com/radio/shows/the-safety-guru/
Powered By Propulo Consulting: https://propulo.com/
Eric Michrowski: https://ericmichrowski.com
ABOUT THE GUEST
Dave Hofmann’s research focuses on organizational climate, leadership, organizational change, organizational design, and decision-making. He teaches courses in organizational behavior, leadership, and the complexities of middle management. Dr. Hofmann has served as associate dean for the full-time MBA program, area chair of organizational behavior, and senior associate dean of academic affairs. A specific focus of his research is the impact of leadership and organizational culture on safety and errors in organizations that operate in high-risk environments. He has edited two scholarly books on these topics, including “Errors in Organizations” with Michael Frese.
In recognition of his work’s applied implications, he received the American Psychological Association’s Decade of Behavior Research Award in 2006. He received a Fulbright Senior Scholar Award to study errors and safety issues in organizations at the University of Giessen in Germany, and a Robert Wood Johnson Foundation grant to investigate error management and organizational learning on nursing units. He has served on two National Research Council/National Academy of Engineering committees: the first investigated the causes of the BP Deepwater Horizon accident, and the second focused on how to improve safety culture in the offshore industry.
Dr. Hofmann has presented his research or conducted executive development sessions in Australia, Canada, France, Germany, Hong Kong, India, the Netherlands, Singapore, Spain, Switzerland, the UAE, and the U.K. He earned his PhD in industrial and organizational psychology from Pennsylvania State University, his master’s degree in industrial and organizational psychology from the University of Central Florida, and his bachelor’s degree in business administration from Furman University.
EXECUTIVE SAFETY COACHING
Like every successful athlete, top leaders continuously invest in their Safety Leadership with an expert coach to boost safety performance.
Safety Leadership coaching has been limited, expensive, and exclusive for too long.
Explore your journey with Executive Safety Coaching at https://www.execsafetycoach.com.