
Lessons from Aviation Safety: Human Factors with Dr. Suzanne Kearns


LISTEN TO THE EPISODE

ABOUT THE EPISODE

To err is human… but to learn and identify organizational ‘accidents waiting to happen’ is more critical than to lay blame. In this episode, we’re TAKING OFF with Dr. Suzanne Kearns, Associate Professor of Aviation at the University of Waterloo and Founding Director of the Waterloo Institute for Sustainable Aeronautics. Suzanne shares critical insights into Human Factors and Aviation Safety to help listeners explore how aviation has been able to make substantial improvements in safety and, more importantly, how it can help every organization improve safety performance.

READ THIS EPISODE

Real leaders leave a legacy. They capture the hearts and minds of their teams. Their origin story puts the safety and well-being of their people first. Great companies ubiquitously have safe yet productive operations. For those companies, safety is an investment, not a cost, for the C-suite. It’s a real topic of daily focus. This is The Safety Guru with your host, Eric Michrowski, a globally recognized Ops and Safety Guru, public speaker and author. Are you ready to leave a safety legacy? Your legacy success story begins now.

Hi, and welcome to the Safety Guru. Today I’m very excited to have with me Dr. Suzanne Kearns. She’s a professor of aviation, safety training methodologies and human factors at the University of Waterloo, where she explores a lot of topics around human limitations and how they contribute to accidents and incidents. An airplane and helicopter pilot, she’s also the founding director of the Waterloo Institute for Sustainable Aeronautics, which we’ll talk about shortly. So first, Suzanne, tell me a little bit about your story, your background and your passion for safety.

Thank you, Eric. Well, it’s a pleasure, first of all, to be here. And I can tell you that I didn’t have dreams or aspirations when I was young of being a professor. I just loved aviation. I think I was fascinated by flight in general. I sort of still do think it’s magical, even though I do understand the science behind it. There’s just something captivating about it. And so, I grew up flying airplanes and helicopters, starting when I was 15. I flew helicopters in North Bay, Ontario, doing some really fun flying in the bush, where you’d actually use chainsaws and build your own landing pads. It was quite rugged.

And then, at that time, because in Canada piloting hasn’t always been a university-level discipline (it’s more so a college-level discipline), I had just finished a college diploma and was looking for a university education. So, I actually went down to Embry-Riddle Aeronautical University in Daytona Beach, Florida, and finished my bachelor’s degree. And then at the end of that, two really big, life-altering things happened. A colleague on campus was really tragically killed in a training accident and, within a matter of months, 9/11 happened. It really shook the foundation of who I was and my dreams: the idea that the industry that you love, that you find inspiration and excitement and passion in, was used to cause so much pain and devastation and widespread hurt around the world.

It really did cause me to rethink what my path would be. And so, I went back and I earned my master’s degree in human factors, which is kind of like the science of why people make mistakes. I came back to Canada, and so that’s when I started as a professor and earned my PhD in education. 

Excellent. Well, thank you for coming here to share a little bit about your story and some of the background. And I think one of the pieces I’d like to touch on first is really the linkage between some of the principles of safety that we talk about in the aviation world versus what happens on the ground, because I think in many ways the aviation space is probably the most advanced when it comes to safety and really understanding the natural human limitations that we have, for sure.

Yeah. Well, I could tell you a little bit of the history of how we’ve gotten to where we are today in aviation with our understanding of safety. And I think what’s important to understand, if you look back to the 70s and 80s, is that there was this culture where if a pilot was to make a mistake and survive an accident, then they would be fired and removed from the industry. And everybody’s response was finger-pointing at that person: how could they have possibly made such a terrible mistake?

Which, of course, has devastating impacts, because not only is there the pain of the accident, but that individual is also going to be experiencing a lot of trauma from it. What we learned over time was that, number one, in the late 1970s and early 80s, there were a series of very high-profile aviation accidents that were primarily caused by pilot error. And it really challenged the industry to say: how is this possible? How could such a network of intelligent, dedicated people make such obvious mistakes?

Eastern Airlines, for one, is probably the most obvious example.

Yeah. It was just a faulty light bulb which caused so much focus of attention. They didn’t realize they had disengaged the autopilot, and they flew their aircraft into the ground. And so, these kinds of things challenged the industry. What happened next was a really amazing example of government, academia, and industry coming together to say: what can we do about this? And they created the first human factors training program, which is now called Crew Resource Management training, or CRM, meant to teach pilots about human limitations. But that’s only one part of it, because that still puts, I think, a lot of the focus of blame on the individual, and it doesn’t ask broader organizational questions.

Is it really that person’s fault if they have faulty equipment, or didn’t receive training, or have been on a schedule that’s impossible, where any human would be tired or exhausted? So, it also shifted at the same time. We have human factors, but we also have the organizational approach to safety. What this does is look at the entire organization, from the top all the way to the bottom, and make sure that everybody is identifying areas of risk and eliminating them before an accident happens.

So, it’s not just about the end pilot user. It’s about everybody that contributed to that accident or that flight on a particular day. And I think there are a lot of parallels and a lot of learnings that come out of that space that could definitely be translated into a lot of other environments. I know you’ve done some work on ground safety, I believe on the maintenance side of aviation. What are some of the parallels that you saw when you were translating principles from human factors to workers on the ground who could be exposed to hazards?

Absolutely. Well, I think what is universally true is that we’re all human beings. And so, the same types of limitations that one experiences as a pilot or a maintenance engineer or an airside worker are all the same basic issues, because they’re all about our natural bodies and our minds. When I’m explaining this to my students, I always say: if somebody makes a mistake, and you pull that person out and put any other random person in with the same type of background and experience, and it’s feasible that the new person might make that same mistake, then we really need to question whether it’s fair for us to be putting all of our blame on that individual.

We really need to look at the task, the environment and the situation. But what I did find in translating it is that you have to articulate this carefully, because it’s such an emotionally impactful and sometimes challenging issue. If you don’t articulate it correctly, it sounds like you’re questioning a person’s competency or questioning their commitment to their job, when in reality what you’re saying is: we’re all people, and our limitations can be scientifically predicted and tracked. So why don’t we learn all of that information and take it in before it leads to an accident?

But it does require us to make sure that the core message is wrapped in something that is true to the job role, using the right language and examples for that role.

That makes a lot of sense. So, tell me about the importance of education when it comes to safety. 

Yeah. Well, I’m a big believer in education; the focus of my world is trying to support the next generation and teach them as best I can to support the future of our industry. That being said, as much as I love teaching, and some of my most exciting and powerful experiences professionally have been in classrooms as a teacher, education is not always the best way to eliminate risk in an organization from a human factors perspective. If you change the task people are doing, the equipment they’re using, or the operational environment, the noise, temperature, distractions, a lot of those are, I think, universally easier ways to eliminate risk.

And sometimes I think it falls back to us using education as a default, because it’s too challenging or expensive to change some of those bigger structures. And so, we try to solve a problem by throwing a few hours of training at it. But I think that really does offload some of the responsibility onto the workers. And we have to question, and always be really careful: is that ethical? Is that fair? Or are we really putting our priority on the appearance that we’ve done something, rather than investing our best effort to actually reduce that risk?

I think that speaks to the importance of the hierarchy of controls, really, in terms of eliminating the hazard that could be present, as opposed to trying to train the individual to manage it.

Yes, exactly. And the reality is, we know from a human factors perspective that training is one of the tools in your toolbox that you can use to support big organizational change and improvement to make things safer. But it’s not the only thing, and sometimes it’s the more expensive one, the one that has more substantial ongoing costs over a longer period of time. You can imagine, for example, if we’re looking at cars: we all know that texting and driving is very dangerous, that nobody should ever do it. Think of how much energy and effort has gone into teaching teenagers, don’t do this, right?

It’s so dangerous, you should never do this. But if there were a way for the cell phone itself to just disengage while it’s in a car, for example, then that equipment shift eliminates the risk, right? And it’s something quite simple that would have a sweeping, widespread effect. Obviously, there are other implications in that example, but I think it’s a much more effective way to eliminate the risk of that one situation than putting the emphasis on the people through training.

This is similar to a lot of apps in cars that will typically stop working, or stop allowing themselves to be managed or controlled, when the vehicle is in motion. You could do the exact same thing with a texting device.

Exactly. And of course, it’s a simple example. But if you think of the parallels to aviation, it’s still very true that it’s such a heavily regulated industry, and so we’re always trying to provide the evidence that’s required to the regulators: I’ve completed 5 hours of training on this; you’ve demonstrated that you’ve taken that action. But I’ve had some really interesting talks with international regulators at the highest level about this hours metric for training, because in aviation it’s always based on hours: you’ve got to do 5 hours of this or 10 hours of that.

And I said, why hours? Everyone knows it’s not the hours that make an impact; it’s what happens during that time. It’s the experience and learning. And he said, okay, I’ll tell you a secret. We know that. But, he said, the reality is, put yourself in my shoes. If I’m an international regulator and our safety board has identified some sort of safety deficiency, the most obvious and direct thing I can do is throw a few hours of training at the problem, because it shows that we’ve made an effort to address it.

But, he said, even I know that it’s not going to fully eliminate that risk. For me, that was mind-blowing, because when you love aviation, you grow up aligning everything under these regulations, and when you come to learn that they’re made by people as well, and people aren’t perfect and are just doing the best they can under challenging situations, it does allow you to really refocus and, I think, question whether there’s an opportunity to do things even better.

Great. So, one of the topics you talked about earlier, when you were talking about human factors, was crew resource management and how that got cascaded. For listeners who aren’t familiar with crew resource management, and some of the elements of how human factors get trained or taught to pilots, can you give maybe a bit of a highlight as to what the core principles are? So people can think about how we could translate them to on-the-ground examples.

Yeah, absolutely. So, crew resource management, as it is today, is required annual training for almost all pilots in the civilian world, and it has a few core components. That includes things like workload management: in our world, it’s fly the plane first, so aviate, then navigate, then communicate. It’s task prioritization. So, workload management is number one. Situation awareness is number two, and situation awareness is sort of like, in your operational setting, your mental picture of everything around you. And people may be shocked,

but one of the most common categories of accidents is called controlled flight into terrain. It’s flying a perfectly good airplane into the ground, which is a result of a lack of situation awareness. And a very big one as well is communication and crew coordination: how you talk to and use the resources around you, including the technology, but also all the people in the aircraft, air traffic controllers, and other supporters on the ground. So those are some of the big categories, but it’s based on a very robust and deep interdisciplinary field of research, which maybe doesn’t mean a lot to people.

But I can tell you, when I’m teaching human factors, I don’t teach, like, a list of memorizations. New pilots will learn something called the I’M SAFE checklist, where before they go flying they should check illness, medication, stress, alcohol, fatigue and eating: am I okay across all these categories? That’s what most pilots know at the very beginning. But when I teach it at the university, it’s a foundation of your natural human limitations. So, it’s some psychology and thinking: how much information can you reasonably be expected to retain at any one point in time?

And when is any person going to start making mistakes? It’s your senses: all of your senses, how you take in information, how it can be tricked and distorted, and how you can’t always trust your senses. It’s anthropometry, which is the measurement of the human body, because in an aircraft all of the controls have to be within a certain reach for humans. And it’s the limitations of work: when anybody in the world would be expected to become tired and start making mistakes, whether due to a lack of sleep or just a prolonged period of mental or physical work.

And we also get into some issues around things like mental health and substance abuse, because those are also very human things that affect all of us in our population. There are a lot of other factors I’m probably missing, but that’s kind of how we build up the foundational building blocks. And if I have the students take away one thing, it’s that to err is human: you shouldn’t expect people to never make mistakes. It should be the exact opposite. You should expect that it’s 100% normal for even the most competent professionals to make mistakes.

And if you start from that foundation, then you can build up to say: where are those mistakes most likely to happen, and how can I manage to capture them before they have an impact?

This episode of The Safety Guru podcast is brought to you by Propulo Consulting, the leading safety and safety culture advisory firm. Whether you are looking to assess your safety culture, develop strategies to level up your safety performance, introduce human performance capabilities, re-energize your BBS program, enhance supervisory safety capabilities, or introduce unique safety leadership training and talent solutions, Propulo has you covered. Visit us at Propulo.com.

That’s excellent. So, you talked about human factors. Tell me a little bit about how that contrasts with safety management systems as it pertains to the aviation world.

Sure. Yeah. This is probably one of the biggest confusions in the aviation world: even career professionals sometimes don’t know the difference between human factors and safety management. Human factors, like I said already, is a scientific discipline about why people make mistakes. So, a little bit of psychology, ergonomics, physiology, all of these sorts of scientific foundations, and education; that’s a really big one as well. And that leads up to crew resource management, which is where we teach operators about all of these limitations and give them some strategies for how to avoid them, how to work together and how to avoid error.

And that’s sort of one category. The second category is the organizational factors associated with safety, which most people in aviation most commonly know through what we call Reason’s model. Reason’s model has layers of protection: there are, like, five squares you can imagine, but each one of them has holes in it, so you can think of them as layers of Swiss cheese. Each layer has its own holes, and each layer represents a level of the organization, from the highest-level senior management and then, as you work forward, the training manager,

and then the far piece is the actual operators, like the pilot. The concept is that those holes in the layers represent latent failures. They’re like accidents waiting to happen, whether it’s management not investing in training, or a maintenance engineer who has a poor practice and is making a mistake over and over, or whatever it happens to be; there are these opportunities for accidents. And it’s only when the holes in all those layers line up perfectly that an accident happens. So, the concept is that the accident itself is actually quite rare.

Instead of focusing all of our attention on the accident, which is what we had historically done (fire the crew), which doesn’t address all those holes, those risks are still in the organization. So, the concept of safety management systems, at its core, is the identification and elimination of those latent failures before they have an opportunity to line up and cause an accident. It’s a proactive rather than reactive approach to aviation safety.

Alright. So essentially reducing the probability of those holes lining up. 

Yeah. And human factors play in, because human factors can create those holes throughout the whole system, so that’s one of the ways we can reduce them. But human factors can’t address everything because, like I said, if there are high-level managerial decisions that are affecting every part of the operation and equipment, then no matter how hard a pilot at the end tries to do their best, tries to be as professional and safe as possible, they don’t have control over those other factors, and they will be influenced by them regardless.
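For readers who want to make the “holes lining up” idea concrete, here is a small illustrative sketch. It is not from the episode: the number of layers, the failure probabilities, and the Monte Carlo approach are all invented to illustrate how independent defense layers multiply down the odds of an accident.

```python
import random

def accident_probability(hole_probs, trials=100_000, seed=42):
    """Monte Carlo estimate of how often every defense layer fails at once.

    hole_probs gives, per layer, the chance that the layer's "hole" is open
    (i.e., its latent failure is present) on a given flight. An accident
    occurs only when all layers fail simultaneously.
    """
    rng = random.Random(seed)
    accidents = sum(
        all(rng.random() < p for p in hole_probs)
        for _ in range(trials)
    )
    return accidents / trials

# Four defense layers, each with a 10% chance of an open hole:
p_base = accident_probability([0.1, 0.1, 0.1, 0.1])

# Halving each layer's hole (better training, equipment, scheduling)
# shrinks the joint probability multiplicatively, not additively:
p_improved = accident_probability([0.05, 0.05, 0.05, 0.05])

# Analytically these are 0.1**4 = 1e-4 and 0.05**4 = 6.25e-6;
# the Monte Carlo estimates land near those values.
print(p_base, p_improved)
```

The point of the sketch matches the conversation: because the layers multiply, modest improvements at every level of the organization drive the joint probability of an accident down dramatically, far more than perfecting any single layer could.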

Excellent. So, we’d love to talk a little bit about the work you’ve done at Waterloo, which is really interesting. As I mentioned at the front end, you’re the founding director of the Waterloo Institute for Sustainable Aeronautics. Tell me a little bit about what this Institute does and what you’re trying to accomplish with all of this expertise that you’re bringing together.

Thank you. Well, I’m really excited, because our Institute will be launching on October 5, 2021, and it’s the product of years of work. But what really led to the Institute for me was when the pandemic hit. In my field, like so many others, so many people were out of work: alumni, friends and colleagues. Just tremendous devastation. And when I saw so many of my colleagues who were 100% focused on just the survival of their organizations through the pandemic, I started questioning: what can I do to support this sector that I care about?

I’m an academic at a university. I can’t necessarily impact business decisions, but I kept questioning what of value I could contribute during this time, reflecting on the big challenges that aviation was facing before the pandemic, which I sort of defined as widespread personnel shortages on an international scale and growing environmental emissions (if you remember, the flight-shaming movement around Greta Thunberg was really just growing when the pandemic hit), as well as the rapid evolution of technology. And when you think of those three things, they really align with the three pillars of sustainability: social, environmental and economic.

And at the same time, I also saw that other big aviation universities around the world were actively recruiting in areas where Waterloo already has tremendous world-leading experts: AI, cybersecurity and everything in between. And it really hit me that we could have a tremendous impact in supporting the sector, setting aviation and aerospace up for a more sustainable future after the pandemic, if I could mobilize the strengths already at the university and direct the powerhouse that is Waterloo towards these big challenges, which I knew were having a direct impact on the people in the industry that I cared about.

So that’s how WISA came to be. And really, what it is now: we have about 40 to 45 different professors, as well as their labs and grad students. We have a really distinguished advisory committee, with Commander Chris Hadfield as an honorary advisor, some amazing advisors internationally, and some industry partners coming on board. What it’s meant to do is form a bridge between aviation and aerospace and the university. So, if industry partners have challenges or problems, they can work with university professors to address them.

And the beautiful part of that is, in the process, they’re educating graduate students who then go on to support Canada’s knowledge economy and become future leaders. So, we’re just getting started, but we’ve got a lot of excitement about what comes next.

And when we talked earlier, you mentioned some examples of linking expertise. You talked about some examples of bringing engineering to the table, machine learning. Can you give maybe some examples of how these themes come together and the power that brings?

Yeah. So, what’s different, I think, about universities as opposed to industry is that industry problems are very multi-dimensional; lots of different skill sets come together to create that problem. But when you compare it to a university environment, professors have incredibly high levels of expertise in a very narrow area, and they live within pockets of their own discipline. So you have psychology in one pocket, engineering in another, and health sciences in another, for example. What we’re hoping to do with the Institute is break down those silos and allow a connection between the different disciplines on campus.

So as an example, I’m working with a few colleagues right now, and we’re looking at how pilots are trained because, as I mentioned earlier, there are distinct personnel shortages projected internationally. So how do we address that? I have one colleague in cognitive psychology looking at the process of how people take in information and learn. Another, in kinesiology, looks at hand-eye coordination and the development of those skills. Another is in optometry, so she’s looking at how your eyes move across your environment and take in information, and whether that can come together to be an indication of expertise.

And another is in engineering, working on a form of machine learning, artificial intelligence. If you could take all those data points into a computer, then when a new person comes in and flies, could the computer automatically assess their performance and automatically tell them where they’re strong and where they need to focus more to improve their skills, by comparing to a big database of others? And I think the really exciting part of that is, if we were able to do that effectively, you could then justify to the regulators moving more training out of aircraft flying in the real world and into simulators, and that has distinct environmental benefits: far fewer emissions.

And it’s also saving young people money, because training becomes far more customized to what they need. So, they’re paying for fewer hours, and simulators are usually cheaper than the airplane. So you’re hitting the economic improvement, the social improvement and the environmental improvement when you see this beautiful, magical mix of all these disciplines coming together to address a problem.

That’s excellent. And I think, when I hear your story, what you’re trying to drive at is that this multidisciplinary view really can be used in other sectors, in other environments, to really start meshing different levels of expertise to address safety challenges across the board.

100%. And I think safety is such a perfect illustration of this because, just like education, it’s not one thing. It’s not one discipline, right? You can’t create a perfectly safe system looking at it only from the perspective of psychology. It’s almost like you have pieces of the puzzle, and that’s one important piece of the puzzle, but until you identify and link those other pieces together, you don’t have the full picture.
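As a side note for readers, the multi-metric assessment idea Dr. Kearns describes can be sketched in miniature. This is a hypothetical illustration: the metric names, the reference values, and the simple z-score comparison are invented for the example and are not taken from the Waterloo research, which would use far richer data and learned models.

```python
import statistics

def assess_trainee(trainee, database):
    """Compare one trainee's metrics against a reference database.

    trainee:  dict mapping metric name -> the trainee's measured value
    database: dict mapping metric name -> list of values from experienced pilots
    Returns a dict of z-scores; values far from 0 flag areas to focus on.
    """
    report = {}
    for metric, value in trainee.items():
        ref = database[metric]
        mean = statistics.fmean(ref)
        spread = statistics.stdev(ref)
        report[metric] = (value - mean) / spread
    return report

# Invented reference data standing in for a "big database of others":
database = {
    "gaze_on_instruments_pct": [70, 75, 80, 72, 78],
    "landing_vertical_speed_fpm": [-150, -180, -140, -160, -170],
}
trainee = {"gaze_on_instruments_pct": 55, "landing_vertical_speed_fpm": -155}

report = assess_trainee(trainee, database)
# Here the trainee's instrument scan sits far below the reference group,
# while the landing vertical speed is well within it, so a training
# system would direct practice time toward the scan.
```

A real system would fuse many more signals (eye tracking, control inputs, simulator telemetry) and use a learned model rather than z-scores, but the shape of the idea is the same: compare a newcomer’s data to a large database and target the weak areas.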

I think that’s an incredibly important message in terms of that multidisciplinary view of things to drive things forward. Suzanne, thank you so much for the work you’re driving around safety in the aviation space. As somebody who used to fly a lot, I appreciate that part as well, but more importantly, thank you for coming on our podcast to share your story and some ideas. I think there are some really great examples and illustrations that people can take from what’s being done in the aviation space and translate into that learning organization, thinking about, as humans,

where are we going to make a mistake? Because it’s bound to happen. So, I really appreciate you coming to share your story.

Thank you very much. 

Thank you for listening to the Safety Guru on C-Suite Radio. Leave a legacy. Distinguish yourself from the pack. Grow your success. Capture the hearts and minds of your teams. Fuel your future. Come back in two weeks for the next episode, or listen to our sister show, The Ops Guru, with Eric Michrowski.

The Safety Guru with Eric Michrowski

More Episodes: https://thesafetyculture.guru/

C-Suite Radio: https://c-suitenetwork.com/radio/shows/the-safety-guru/

Powered By Propulo Consulting: https://propulo.com/

Eric Michrowski: https://ericmichrowski.com

ABOUT THE GUEST

Dr. Suzanne Kearns is an Associate Professor of Aviation at the University of Waterloo. She is an internationally recognized leader in aviation education research, earned airplane and helicopter pilot licenses at the age of 17, advanced aeronautical degrees from Embry-Riddle Aeronautical University and began working as an aviation professor upon graduation at the age of 24. In the 16 years since, she has taught and mentored thousands of aviation students, and written/co-authored six books printed in multiple translations (including Competency-Based Education in Aviation, Fundamentals of International Aviation, and Engaging the Next Generation of Aviation Professionals). She has received several awards for research and educational works, frequently delivers invited keynote addresses at international conferences, and holds leadership positions with several international aviation organizations. In 2021 she founded the Waterloo Institute for Sustainable Aeronautics (WISA), which she leads as its Director.

STAY CONNECTED

RELATED EPISODE

Taking your Safety to the Next Level: Integrating BBS and Human Performance with Dr. Josh Williams

Episode 30 - Taking your Safety to the Next Level: Integrating BBS and Human Performance with Dr. Josh Williams

LISTEN TO THE EPISODE

ABOUT THE EPISODE

Behavior-based safety, human performance, cognitive psychology… It can be overwhelming to consider so many competing safety approaches. On this week’s episode, Dr. Josh Williams returns to advocate for a well-rounded approach to safety. Josh shares practical HP tools for learning from workers’ first-hand experience and taking a proactive approach to preventing SIFs. Don’t scramble to improve your organization’s safety once it’s too late!

READ THIS EPISODE

Real leaders leave a legacy. They capture the hearts and minds of their teams. Their origin story puts the safety and well-being of their people first. Great companies ubiquitously have safe, productive operations. For those companies, safety is an investment, not a cost, for the C-Suite. It’s a real topic of daily focus. This is The Safety Guru with your host Eric Michrowski, a globally recognized Ops and Safety Guru, public speaker and author. Are you ready to leave a safety legacy? Your legacy success story begins now.

Hi and welcome to The Safety Guru. Today I’m very excited to welcome back to our show Dr. Josh Williams. Dr. Josh Williams is a recognized thought leader in safety and safety culture. He’s a winner of the Cambridge Center First Place National Prize for Behavioral Science, with well over 20 years working with organizations, helping them improve their safety, build strategies around safety culture and assess how they’re doing. It’s amazing to have you back on the show, Josh. You’re also a partner with Propulo Consulting and an incredible thought leader in this space.

So, Josh, why don’t we start out with a quick introduction. I know you’ve shared this story a little bit before in terms of how you got into the safety space and what captured your passion around it.

Well, I appreciate that, Eric. Thank you. And apologies to the listeners: I’ve got a little allergy attack happening here, so hopefully I don’t sound too awful. But I was in grad school getting a Ph.D. in industrial-organizational psychology, and I was a bit frustrated. It was very pie in the sky, theoretical. I don’t mean that disparagingly toward the field or the research, but, you know, I wanted real stuff. I wanted to get out in the world and do things.

And I was lucky enough to meet up with Dr. Scott Geller, who, in my mind, is really the fountainhead for safety culture in the space. And he was doing really cool stuff: going to, you know, above-ground mines, going to manufacturing facilities and doing things. And it was really fun. It was interesting. And I felt like I was making a difference. So that was kind of my introduction to the safety side. I had not even thought about safety as I got into grad school, but we were doing a lot of neat things.

And so, once I finished up, I signed on with this group and worked with them for a bit. It was really interesting to feel like you’re doing things to not only improve culture and communication and leadership, but hopefully keep bad things from happening to good people. So that’s kind of where it all started many years ago.

Yes, and we’re obviously talking today about how you keep those bad things from happening. Specifically, we’re talking about two concepts: human performance, and this other theme, which has really been an integration of themes around behavior-based safety and human performance. You’ve authored two quizzes. One is at humanperformanceleader.com, a quiz on how you’re doing from a human performance standpoint, with tactics to drive forward, as well as a mini self-assessment that’s available on Propulo.com.

Let’s start first in terms of understanding why we need to talk about human performance. We know 90 percent or so of incidents, occurrences of a safety infraction, happen because of some form of at-risk behavior. Behavior-based safety has had a huge impact. But tell me about some of the missing pieces and why human performance needs to be part of the equation.

Yeah, and let’s start by giving behavioral safety its due. Look, this has been around for decades. Injury rates have dropped twenty-eight percent over the last ten years, and part of that is due to behavioral safety in my mind. The National Safety Council had that estimate you mentioned: 90 percent of all injuries are due in part to at-risk behavior. And it’s a numbers game, a probability game. You know, it’s like going to Vegas. If you’re there for an hour,

that’s one thing. If you’re there for two weeks, odds are you’re going to lose money. It’s just probabilities, which behavioral science in part is founded on. If we can be safer more often, we’re reducing the probability of incidents. Tiger Woods just got in his third car wreck; I mean, there’s some risky behavior happening there. So, the challenge, though, Eric, to your point, is that, quote, 90 percent of all injuries are due in part to at-risk behavior.

That “in part” piece is important. That’s from the National Safety Council, and I actually think it’s higher than 90 percent. But the “in part” piece matters: systems matter. Behaviors are a function of environmental contingencies, which is just an academic way of saying we operate differently depending on the system we’re in. And give me a moment for a Randy Moss example, for the sports fans out there. He was in all kinds of trouble and all kinds of legal issues

for years and years with the Vikings and with the Raiders, who were a highly dysfunctional organization at the time. This guy spent his entire career in and out of trouble until he got traded to the Patriots. I’m not a Patriots fan, but they have tight systems; they’re a championship organization. And all of a sudden Randy Moss is a model citizen, literally doing all these things for the community. Maybe he had a midlife epiphany, or maybe he got into a better system and turned things around.

Now he’s on Fox News or Fox Sports or whatever as an analyst. But this guy, the same person in a different system, behaved totally differently. The system matters, and I think for too long on the behavioral side we got into these quotas and these checklists and these “did you do your cards” things and forgot the big picture, which is fixing systems to influence behavior. So that’s why the system matters. And I’ll just say two quick things on human performance, the rise of human performance and the integration of that with behaviors and mindsets.

First, fix the system. Second, quit blaming people when things go wrong. And then really, I think the HP side has been really good for safety and I think it’s helping a lot of people stay safe.

That’s excellent. But human performance is not a new thing. I remember when I started in the airline industry many, many moons ago, this was a common topic of conversation. Can you tell me a little bit about the history behind it and some of the themes that drove this re-emergence? Because now we’re talking about it a lot in the safety space.

Well, I mean, they were talking about it with fighter pilots in World War II: how do you change the cockpit, set it up to reduce error in stressful situations? Industrial psychologists like myself many years ago talked about leadership, selecting leaders and training leaders, but they also talked about setting up the planes so that you don’t unintentionally do something bad. So, this has been around for a century, and there have been various iterations, as you know, and certainly it was a big part of this many years ago.

And there have been other folks that have focused on the human performance side. But there’s been a re-emergence in my mind over the last maybe five years. And I think part of it is we got a little sideways on the behavioral side and didn’t always do it right. So, the bottom line, Eric, for me, the concern I have now, frankly, on the HP side, is that there’s a lot of theory, a lot of “just quit blaming people.” I’m all in on the theory, but it’s also getting segmented.

When I first started out, cognitive psychology was the thing, and there were all these cognitive consultants talking about ownership and personal reflection. But that sort of gave way to the behavioral people, who were saying, look, quit talking about your feelings, let’s do stuff. And you’ve got the human performance people saying quit talking about behavior, because you’re blaming people; just fix the system. And the truth is, all those things matter. If we’re not talking about cognition, we’re missing the feelings that affect what I’m doing.

And if we don’t talk about systems, we’re missing a big piece of it. If we don’t talk about behavior, we’re missing a big piece of it. So that’s what we call BBS 2.0, which is kind of behavioral safety and human performance with some cognitive elements, too. That’s why it’s important: if you don’t look at all three of these, you’re just incomplete in your efforts to get better. So, I think we need to be looking at all three of those.

It’s a really good point, because it shouldn’t be a battle of philosophies, and it shouldn’t be one thing or another. It should really be about how we battle injuries, how we improve safety, how we make a tangible difference. And I agree with you: all these things matter. The mindset you have around your level of safety ownership matters, the behaviors and how you shift those behaviors matter, and the system is hugely important overall.

So, tell me about some of the basic tenets that BBS 2.0 or human performance bring to the table.

There’s a bunch, and again, if you go to some of those quizzes, there’s more information there you can take a look at. But the main one, I think, is that we are human beings and we make mistakes. Whoever it was that said “to err is human,” it’s just so true. And the first point in my mind is that we are efficiency machines. Human beings are efficient. We look for the easiest, fastest, most comfortable, most convenient way to do things.

I mean, why do you speed on the highway? Why does McDonald’s exist? Because fast food gets me food fast. And so, we’ve got to understand that. That’s part of the “quit beating people up over stuff.” Look, you put yourself out in the field somewhere where it’s 100 degrees, there are six million things going on, your production schedules are ridiculous, you don’t have enough people, and then you start telling me to be mindful of my behavior because I sprained my ankle?

Are you kidding me? So, we just need a little more sensitivity. First, we’re naturally inclined to be risky. And second, the system encourages it. And that’s where I think we sort of missed the boat for a bit. I mean, time pressure, insufficient training. We’re doing all this computer-based training, but look, I need hands-on training. I need training specific to the job I’m doing.

But we’re throwing some computer thing at me that’s not helping. We don’t have enough people. Sometimes the conditions are difficult. You know, procedures may not make sense: you’ve got some blanket policy you slapped on there because somebody got hurt, but it doesn’t really apply. The bigger picture really is getting input from people doing the work. And that’s through close calls, through safety suggestions, through other means. And we’ll talk about some tools, hopefully, if we have time.

But the system is encouraging it, and human nature is encouraging it. And so, we really have to take a step back and look at how we improve our systems, how we improve our attitudes and mindsets, how we improve our leadership, how we improve our behaviors. Because that’s really when you start seeing serious improvement. There are two benefits there. One, you’re going to see better stability in performance, where you don’t have this:

“We had five recordables last month, even though we’d gone six straight months without one.” Yeah, because your systems aren’t very good. Or the scarier thing, which is SIF potential: all of a sudden there’s an explosion like BP where, you know, eleven or fourteen people get killed in the blink of an eye. They had given out a safety award the day before, but they had lots of things lined up wrong at the same time.

The Challenger explosion is another one. So be careful, because sometimes we have a false sense of security while our systems are poor, and then all of a sudden something really bad happens and it kills a bunch of people.

Yeah, I think it’s a really good point. You brought up this theme around tools. I think there are a lot of people talking about human performance from a branding standpoint, but they’re not talking about how you actually go out and do something with it. What are some of the tools that you can leverage? Can you maybe share some ideas around what somebody who buys into this element of human performance can do?

There seems to be a cultural component, and then there’s a tool-based component. Let’s maybe touch first on the tools, and then we can talk about how leaders can start shifting their approach to drive some of the cultural elements.

Yeah, right. And keep in mind, Deming said this years ago: don’t blame people for problems created by the system. So, when we start trying to fix the system, getting input from people that are on the job, doing the job, is our first order of business, or at least one of the first. So, a couple of tools. The first tool, and we’ll start at the top, is listening tours, where you’ve got executives and senior leaders spending more time out in the field actually talking to people. And look, some leaders are great at doing that.

They’ve got a good feel for what’s going on out in the field, they talk to people, they have good relationships, and that’s wonderful. And look, these people are busy. There’s a lot going on; they’ve got a lot of things on their plate. But carving out time to go out in the field and talk to people is smart business. It’s good for you; it’s good for everything. So, one tool, I’ll call it listening tours, where we have a little guide, and it’s not coming down as a leader saying you’re doing this right, you’re doing this wrong,

you need to do A, B, C and D. It’s really asking questions: What’s going on out here? What are you struggling with? Help me understand what you’re doing. It’s about listening. It’s about being curious about what people are doing. It’s about asking how they’re doing on and off the job. And we provide a little guide with four or five things just as a reminder. But it’s about getting leaders out in the field, better understanding what’s going on and trying to establish relationships.

The second tool I’m going to call a peer safety check. And this is unlike a behavioral safety card where you’re checking off a bunch of things: got it, didn’t get it, yes, no. There are no quotas with it; there are no names on it at all. It has questions like: What do you need to do this job more safely? What scares you about the job? How could somebody get hurt?

What do you need? If you’d been doing this job twenty-five years, what would you do differently to keep you and other people safe? Those kinds of questions. And the nice thing is we’re having better conversations with people because we’re asking them questions, and on the back end, we’re getting information we can use to make things better. Because if people are telling us, you know, we have a scaffolding issue over here, well, good, we can go fix it.

And if we do a good job of responding to concerns, fixing things and advertising improvements, it’s better for safety and it’s better for culture, because all of a sudden people realize leaders care and are doing stuff. So those would be the first two.

If I can add on this last one you just shared: to me, this is also an element of “I don’t necessarily know, quote unquote, the truth; there may be a safer way that I haven’t thought of.” I’m pushing thinking and critical decision-making to the frontline level, to reimagine how we could do this better, as opposed to pontificating: I know how to do it, I’ve got my checklist, and it’s either yes or no, with no alternative, better way to do it.

Absolutely, well said, absolutely.

So, you’re going to add a third one and I cut you off there.

Well, learning teams is the big one, where you get a group of folks and they go out and do something like a pre-incident analysis; I’m borrowing that from Todd Conklin, who uses different language. But the idea is, instead of just reacting once an incident has occurred, to be more system-focused and focus on serious injury and fatality potential down the road: send people out before an incident occurs to see where something could go wrong.

A peer safety check is generally for a particular job somebody is doing. With a learning team, you’ve got a group going out, walking around, doing a tour and asking what’s going on. They’re asking questions, they’re talking to people, they’re making notes. And there are some really good ideas out there, so many creative ideas about restructuring the work and the flow of the work. I mean, there are smart people out there, and if we talk to them and give them some voice and power, they have some great ideas for better, safer, more productive ways to do the work, so they can be more productive and make more money.

Everybody’s happy. So, give power to those learning teams. There are a bunch of others, three-way communication, timeouts, a whole laundry list, but those three in particular I like.

And so, what needs to change from a leadership standpoint? Where do you start from that leadership standpoint to make an impact, the whole burning platform thing?

It is big. And I’d be curious to get your thoughts on this too, Eric, but to me, the primary understanding leaders need is the need for change. I’ll say what I said earlier: you are not going to get stabilized performance, you’re not going to have predictability, because everyone’s so focused on the rates.

Are rates going up this month? They went down last month. And it’s going up and down and up and down, and I don’t know why, and all of a sudden we had a flurry of incidents. So, you’re going to get more stability in performance, less deviation around the mean, whatever those rates are, because by tightening up systems we get more predictability. The second one is SIF potential reduction, because look, you and I both, for many, many years, have seen these really bad things happen where all of a sudden a serious injury occurs.

I’ve got way too many stories of talking to people that have been involved in incidents, or safety leaders that have had to make those phone calls to people’s homes when somebody dies on the job. I mean, it’s sudden, it happens quick, and it catches everybody off guard. And then all of a sudden everyone scrambles and tries to make improvements. We need to reinforce with leaders: do it on the front end, before that really bad thing happens, because those dangers are out there, and in some of these places we work, there are so many things that can get you hurt seriously in a hurry.

So, the burning platform is the issue with leaders: look, you’ve got to understand, making money is good, and the safer we are, the more money we’re going to make anyway. This is not something separate that we’re dragging along; it’s embedded in who we are and how we operate. It’s like Paul O’Neill did all those years ago to improve safety: he came in and revolutionized how safety was looked at. It’s part of the character.

It’s not just something we do. And lo and behold, profits soared. Now, there are some things he did that I might do a bit differently, but he for sure came in with “safety is part of who we are, and we are not just doing this as a slogan.” So, I think understanding it, feeling it, having that personal ownership and creating that burning platform for leaders is step one. And then we start talking about tools and other things.

Yeah, I think that makes sense. There’s this whole element of the philosophy, quit blaming your employees, that needs to be thought through and shared with leaders for there to be some real, sizable impact, because it’s a different way of showing up. I think it makes a ton of sense, but there is a difference there. A lot of the debate I hear is because a lot of people are dogmatic about behavior-based safety, or about human performance, or, like you talked about, cognitive psychology applied to safety.

Do you need to say, let’s go do this thing, or do we just start infusing the thinking and the philosophy without necessarily even branding it?

That’s an interesting question, and I would argue that sometimes we do too much flavor-of-the-month, where there’s a bunch of fanfare and then something else comes along. So, my hope is that human performance elements are embedded naturally, within incident analysis for sure. I mean, we need to do a much better job of looking at system factors contributing to incidents, and also at the potential for future incidents. So that’s close-call reporting. Same thing with other mechanisms for getting employee concerns and safety suggestions; that stuff should be happening naturally.

So, you don’t need to buy a widget called Human Performance. Having said that, I think the training and education and tools are useful. What we do when we go in and work with folks, for instance someone who’s got a behavioral program that’s turned into a quota system or something, is that we don’t abandon behavior-based safety altogether. With BBS 2.0, we update the card to have more open-ended questions, to generate better conversations.

So, to me, we embed some of the human performance elements in systems that are already there. We’re dovetailing; we’re not scrapping something and creating something new. I think it depends on what the organization needs. Sometimes a human performance implementation is smart, or a BBS implementation is smart. Sometimes we’re just tweaking what you’ve already got.

And I like your approach around BBS 2.0 in terms of really integrating the behavior-based safety elements that work well and have been proven to drive down injuries with some of the elements of human performance, in terms of truly making an impact. And I think that also needs to be augmented by some leadership capabilities and leadership thinking, in terms of evolving how we approach safety. A great place to get some ideas around it is the quick self-assessment that you’ve created at humanperformanceleader.com.

So humanperformanceleader.com is a free self-assessment that doesn’t capture anything personal about you; it’s just for you to self-reflect on how you’re doing and how your organization is doing around it. Plus there’s the mini self-assessment that you’ve created at Propulo.com under the self-assessment page. So, Josh, I really appreciate your thinking on this. It’s just such an important topic, and I think the future of safety really is bringing a lot of these capabilities together to the table.

When you share a lot of these stories, it reminds me of so many things that even the quality movement has been talking about. There are so many similarities, just different names. But fundamentally, it’s the same core principles: go do listening tours or Gemba walks, speak to people, have an open mind to the idea that maybe there’s a safer, better, higher-quality way to get it done. And those who are closest to the work are the most likely to want to do more.

And it’s a proven way to tap into people’s discretionary effort. So, Josh, any closing thoughts on the topic of human performance and this integration between mindset, behavior and, most importantly, the system, since all of these things interact with each other and don’t exist in isolation?

Yeah, I guess my final thought is that leaders have a tough job, whether it’s a supervisor or a higher-level leader. There’s so much going on, and with COVID hitting, people are scrambling, with worries at home with kids and all this stuff; I mean, look, everyone’s scrambling. My final thought would be: if we can infuse some of these HP elements and do a better job of getting and using feedback from people doing the work, it benefits everybody. It makes life easier for those leaders, and it certainly makes life better for our employees.

So, to me, it’s a helpful approach, which is good not just for safety, but for everything.

Well, thank you so much, Josh. Always a pleasure having you on the show. Thank you for taking the time to come back and share some thoughts and insights around human performance. I really encourage people to start thinking about how they can include some of these principles and ideas in their safety programs. It’s been proven: look at performance in the airline industry, or performance in the nuclear industry, where these capabilities are deeply embedded.

It’s a proven toolkit, and looking at the system has been demonstrated to drive results. Thank you, Josh.

Well, thank you, Eric. I appreciate it.

Thank you.

Thank you for listening to The Safety Guru on C-Suite Radio. Leave a legacy, distinguish yourself from the pack, grow your success, capture the hearts and minds of your team. Fuel your future. Come back in two weeks for the next episode, or listen to our sister show with The Ops Guru, Eric Michrowski.

Please read more in Josh’s related blog about Human Performance (HP): https://www.propulo.com/blog/harnessing-the-power-of-human-performance-to-improve-safety-culture/

Please read more in Josh’s related blog about Behaviour Based Safety (BBS): https://www.propulo.com/blog/bbs-2-0-fueling-discretionary-effort-to-prevent-sifs/

Take the following self-assessment to gauge the current effectiveness of your Human Performance efforts: https://humanperformanceleader.com

Take the following mini-assessment to gauge the current effectiveness of your Behavior-Based Safety (BBS) process: https://www.propulo.com/selfassessment/

Additional online Self-Assessments are available at https://www.propulo.com/selfassessment/

The Safety Guru with Eric Michrowski

More Episodes: https://thesafetyculture.guru/

C-Suite Radio: https://c-suitenetwork.com/radio/shows/the-safety-guru/

Powered By Propulo Consulting: https://propulo.com/

Eric Michrowski: https://ericmichrowski.com

ABOUT THE GUEST

Dr. Josh Williams is a Partner with Propulo Consulting, a global management consulting firm delivering significant and sustainable improvements in organizational performance. For over 20 years Josh has partnered with clients around the world to drive increased discretionary effort and improved strategic execution. He’s the author of Keeping People Safe: The Human Dynamics of Injury Prevention and received the Cambridge Center National First Prize for his research on behavioral safety feedback.

STAY CONNECTED

RELATED EPISODES