
Soaring with The Blue Angels: Building a Robust Safety Culture with Scott “Intake” Kartvedt


LISTEN TO THE EPISODE: 

ABOUT THE EPISODE

Get ready for takeoff on The Safety Guru podcast! In this episode, we’re soaring to new heights alongside an experienced professional pilot, the stunt pilot from Top Gun: Maverick, Scott “Intake” Kartvedt. He shares the foundations of a robust safety culture, highlighting key strategies incorporated by the Blue Angels and the Navy. Gear up to elevate your organization to a top-tier safety culture. Don’t miss this flight!

READ THIS EPISODE

Real leaders leave a legacy. They capture the hearts and minds of their teams. Their origin story puts the safety and well-being of their people first. Great companies ubiquitously have safe yet productive operations. For those companies, safety is an investment, not a cost, for the C-suite. It’s a real topic of daily focus. This is The Safety Guru with your host, Eric Michrowski, a globally recognized ops and safety guru, public speaker, and author. Are you ready to leave a safety legacy? Your legacy’s success story begins now.

Hi and welcome to The Safety Guru. Today I’m very excited to have with me Captain Scott Kartvedt, I should call him Scott “Intake” Kartvedt. He is a former fighter pilot. He was with the Blue Angels as a commanding officer and also a stunt pilot in Top Gun Maverick. Scott, thank you so much for joining me. Quite an impressive background. Welcome to the show.

Eric, thanks for having me. It’s a pleasure to be on and I look forward to talking about safety and some of the challenges that we face as human beings in all workplaces. But it’s a topic that you just can’t beat the drum enough to keep our peers and our fellow workers and human beings safe.

Excellent. Well, let’s start with a little bit about your background, because it’s quite an impressive resume. I think it’s every boy’s dream growing up. So, tell me a little bit about your background, all the way into the Blue Angels.

Yeah, absolutely. So, like so many people my age, I saw the original Top Gun when it came out in May of 1986. And my best friend and I told all of our friends that we were going to be fighter pilots. And subsequently, we both went to college. I worked as an accountant for a while, about a year after school. And my best friend Bob called. He went through ROTC with the Air Force. And he said, hey, I got my pilot slot. I’m doing it. We said we were going to do it, and I’m doing it. I said, okay, I’ll do it too. I picked up the yellow pages and called the Navy recruiter, joined the Navy as a pilot, ended up going through flight training, was successful, selected jets, and was able to select F-18s. That was the start of becoming a fighter pilot. From that point, I was just cutting my teeth as a fighter pilot, using the weapon system, which was the F-18. I was forward deployed in Japan. During the Taiwan contingency operations, we were the China watchdog, the North Korea watchdog. I came back from Japan and was an F-18 flight instructor and also a landing signal officer.

For your listeners, the landing signal officer is the pilot that sits at the end of the aircraft carrier and is an aid, or a safety spotter, if you will, to ensure that the planes are coming in on glide path, on lineup, and at the proper angle of attack, or attitude of the aircraft, when they land on the ship. That was foundationally where I recognized the need for and the importance of safety. I went with the Marines to teach them how to land on aircraft carriers at Marine Corps Air Station El Toro. We subsequently moved to Miramar with the Marines. Then I was selected to become a member of the Navy’s flight demonstration team.

Tell me a little bit about the Navy and the discipline that comes with the Navy, and also on the aircraft carriers. What you describe, just landing a plane on an aircraft carrier, is very difficult. You’ve got an 18-year-old that’s starting. How do you create that discipline where, no matter where around the world you are deployed, you’ve got a consistent and safe operation?

Yeah, that’s a fascinating safety ownership culture that the Navy is exceptional at, because we take 4,000 sailors and we put them on the most lethal platform in the inventory, an aircraft carrier. And there are, let’s say, 50 airplanes, in addition to helicopters, E2s, which are propeller-driven airplanes, and Ospreys. So, it is a very high-risk environment. And you have young men and women who may have only had the good fortune of receiving maybe a GED; maybe academics wasn’t their thing, or they may have been high school dropouts. How do you create a culture where you instill ownership of each other and the ship and the airplanes in someone that is 18 years old and first deployed on an aircraft carrier? You have to give them ownership. You have to give them the authority and the responsibility to stop flight operations. The example I’ll give you, Eric, and this is true on all aircraft carriers: an 18-year-old who might be a plane captain is on the deck of the aircraft carrier, looks in their toolkit, and sees that they might be missing a tool.

Sure. We have to have a safety culture where they don’t immediately think, oh, I’m going to get in trouble, I need to hide the fact that I lost this tool and hope I find it later. We need them to raise their hand immediately. We actually have them put their hands up in an X in front of them, like a giant X, to verbally or nonverbally communicate to everybody else to stop. And as soon as you see somebody putting up the nonverbal X, because it’s loud on an aircraft carrier, it’s got to be nonverbal, everybody else does it. Then all operations stop, and we find out who stopped operations. We run over to, in this scenario, the 18-year-old, and we say, what happened? They say, oh, I lost a tool or a wrench, and I think it might be in that F-18 because that’s the airplane I was working on. You have to have a culture where you say thank you. Thank you for your courage and the integrity to stop operations. Because as soon as you punish that individual for their lack of responsibility, now you start hiding those small safety lapses that over time can build up into a catastrophic failure, loss of aircraft, or fatality.

We are very good in the Navy at providing authority, responsibility, and ownership at all levels, from the captain of the aircraft carrier to the 18-year-old, and instilling in them the ownership of the airplanes, the ship, and the people.

How do you drive that? Because it’s easy to say that. A lot of organizations talk about it. I lived in the aviation space, where that’s expected as well. But in a lot of other industries, there’s always a questioning element. If I stop work and say I’m not prepared, or if I made a mistake as part of it, and there’s a repercussion, which in some cases in business can be hundreds of thousands of dollars, it’s very tempting to go, let’s hide it, nobody will figure it out.

Right. One, you not only have to say it, you actually have to believe it. It might take a period of time. When you take command of a fighter squadron or a ship in the military, it’s only for a short period of time. You’re not a CEO for 5, 10, 15 years. So, you have a short period of time to establish your culture and instill your values and your belief structure, the integrity, the principles, and the character that you want to set for the organization. And it has to happen pretty rapidly. So, you not only have to say it, you actually have to live it. An example that I will give really quickly, and this was the year that we won the Safety S in the F-18 squadron that I had command of: we went 486 days over the course of two deployments with no alcohol-related incidents. The same safety ownership culture that we had on the ship and in the air wing, I wanted to instill off duty, so that when we were in port, we were still taking care of each other in a safe environment. I said, look, if you’re going to go out and have a hootenanny and do a little bit of drinking, one, we have to watch out for each other.

Two, don’t drink and drive. Don’t drink and drive. Don’t drink and drive. Don’t drink and drive. And if you get a cab, I will pay for it personally out of my pocket. Nongovernment money, my money. You bring the receipt in, and I will immediately stroke you a check, or Venmo you, in this case. And sure enough, one of our sailors came in on a Monday, handed me his receipt for 50 bucks, and I Venmoed him the money. I immediately stopped operations, called everybody together, and honored him for doing the right thing, which was taking the cab. But I also had to back it up with my actions and do what I said I was going to do, to prove to them that it wasn’t me really seeing if they were drinking. I didn’t care about that. I wanted them to live their lives, but I had to back it up with action. I really think that… And I’m not patting myself on the back, because it took 250 of us to earn that Safety S, but it was that culture of living and doing what we said to take care of each other. Once you have that culture, then somebody new shows up, an 18-year-old who just checks into the unit, and that’s the culture that they…

And then it can live on until another leader comes in and either makes it even better or for some reason erodes that culture.

I agree. And how does training come into the equation? How do standards and expectations complement this? Because there’s more to it than just saying, these are the values, and I need to stop work.

Yeah. So, let’s pivot to the Blue Angels a little bit, because they have the highest standards of any organization I have ever been a part of. And they hold each other accountable to those standards. It’s really as simple as the pens, the autograph pen or the paperwork pen that we keep in our blue suits, having to be in a very specific pocket. When we talk to people on the crowd line, we can’t wear our sunglasses. We have to make eye contact. There are little small things like that that they don’t necessarily tell you right up front, but it costs you $5 if you fail to meet the standard. When you first join the team, it costs you $50, $60, $70 a day. But then you learn exactly what the standards are. It takes a very short period of time to realize that what they’re teaching us is discipline and attention to the minute detail. Not only for the pilots, because we need that attention to detail when we’re flying, but our mechanics need attention to detail when they’re working on planes. The supply corps needs attention to detail when they’re ordering the right parts.

Our administrative department needs attention to detail when they’re submitting the paperwork so our sailors get paid. Everybody has to have that attention to detail and that service, the customer service to each other, and hold each other to that standard. With that comes the debrief, Eric. You have to be able to debrief somebody when they don’t meet the standard. One, you have to have the standard set: this is what we expect. Then, if somebody doesn’t achieve it and there’s a gap between the expectation and the performance, you have to be able to debrief that. With most human beings, and I talk a lot about this when I consult with companies, there’s an ego problem there, where people perceive the debrief as some form of punishment, and they get defensive about having failed to meet expectations. On the Blue Angels, I realized that somebody wasn’t punishing me when they debriefed me, or telling me that I was incapable. In fact, it’s the exact opposite. When someone takes the time to debrief you to the standard, they’re actually telling you that they believe you have the capability to achieve the standard or exceed the standard.

And once you realize that when you’re being debriefed, it’s because somebody believes in you and they know that you can perform at a higher level, then you can’t get debriefed often enough. You crave that feedback to improve and accelerate your performance.

Very similar to the concept of radical candor as well: if I care about you and I believe in your potential, that delivers feedback differently. But you’re absolutely correct. Many times I’ve seen conversations, even between senior executives, where they’re giving feedback on how to improve, and then the other person is trying to justify themselves, as opposed to just saying, you’re not losing your job, it’s not impacting your performance bonus, these are just tips and ideas on how you can get better. What you describe is really key. It’s really how you have the conversation so you get to the optimal version of yourself.

Yeah, absolutely. In that, when somebody is debriefing you, the only really appropriate response is, thank you. It’s our human behavior to want to defend ourselves. Eric, if you were debriefing me on something, I would want to hear you, and then I would want to defend why I made the decision that I made, or explain to you what happened. That just takes time and gets into what we would call a circular conversation, because now I’m defending myself. Or I could just say thank you, take your input, and make myself better. If the feedback didn’t fit the scenario, then I know that, but I don’t necessarily need to explain it. I just need to take your input, recognize that you believe in me, let go of my ego, and then choose to incorporate it if I believe it will help me improve, or not. I hate to make it that simple, but the ego piece is significant for sure. Once you let go of the ego, then you can really, really, really accelerate your performance.

You said something a few minutes ago that really caught my attention. When you talked about setting high standards in the Blue Angels, I expected that if you didn’t do something, there would be some form of punishment. Instead, it’s the $5, which is often a ha-ha joke, but still sends the message. Tell me a little bit about how it’s done, because I’ve seen this where, as an example, every time you were late for a meeting, it was a buck a minute for your delay, and it went to charity. So, it wasn’t profit for somebody, but it sent a message very quickly, as opposed to chastising somebody for being five minutes late and embarrassing them. It was just a donation jar, but it drove the message very quickly.

Yeah, I think it does drive it quickly. The $5, when you are hemorrhaging money to learn this, is a behavioral tool, right? The carrot and the stick. It’s a little bit more of a stick model. Our money went to squadron social functions. The debriefs are never a personal attack; it is just professional development. But it’s interesting, that dollar for being late to a meeting. This was another great thing that I learned in the Blue Angels. The briefs always started on time. The debriefs started on time. When the time started, that’s when the meeting started. The idea was to respect each other. There were 16 officers on the team that were at every brief and debrief. If you waited one minute for one person to show up, you really just wasted 15 minutes, because 15 people showed up on time. I took that philosophy into F-18 command, and I would set up operational meetings that we had consistently every week: safety meetings, operational meetings, and maintenance meetings. We would always start them on time. I made the department heads that work for me crazy initially, because they said, well, not everybody can be there.

I said, well, if we wait until everybody can be there, it’s going to be a month from now. They can send a representative, which builds a depth of leadership and provides training. I said, just because they can’t be there, that’s okay. But they need to at least send a representative. All you have to do is start on time once or twice, and then the person walking in late will realize that when you say you’re starting on time, you actually mean it. But as soon as you say, hey, let’s wait for everybody, you’re wasting the time of the people that were on time at the expense of the person that was tardy. So, whether it’s a dollar or just, hey, time hack, it’s 09:30 and we are starting, that gets the point across, and the whole command or organization will pick up on it.

This episode of The Safety Guru podcast is brought to you by Propulo Consulting. The leading safety and safety culture advisory firm. Whether you are looking to assess your safety culture, develop strategies to level up your safety performance, introduce human performance capabilities, reenergize your BBS program, enhance supervisory safety capabilities, or introduce unique safety leadership training and talent solutions, Propulo has you covered. Visit us at propulo.com.

In the Navy, as well as the Blue Angels, training is a huge component of onboarding. What’s the rule of thumb around training? When is it enough? Is there such a thing as too much, or not enough? Because a lot of organizations struggle with training as a cost, right? And they’re trying to minimize the cost of that investment. Not the case in the Navy, not the case in the aviation space. So, tell me a little bit more about that.

Yeah, it’s interesting. I think, and not necessarily even high-risk organizations, but let’s talk about companies that do high-power-line or high-voltage electrical work, railroads, the airlines: large organizations that have to train to maintain a level of safety. I’m sure you’ve talked a lot about the normalization of deviation on your podcast, right? So, there’s the balance between training toward perfection in safety and the expense. At what point does management, or the people that are responsible to the shareholders, say, well, nothing has happened, so we are training well enough? And now you are prioritizing shareholder return on investment over safety. That’s when you really need to start listening to the people that are actually doing the work and ask them what they need, to find out where the safety gaps are, because they will tell you for sure. So, it’s a really fine leadership balance. Can you really be overtrained? Probably not. The Navy SEALs would say, absolutely not, you can’t be trained enough. But at some point, you actually have to stop training and operate. Even in operations, though, there is an opportunity to learn, and to take what you’re learning from the operation and wrap it back into the training, so that you can minimize the risk while also improving the performance of the organization.

When I think of training, one of the things that to me is apparent, particularly in aviation compared to what I see in a lot of businesses, is that people often see training as a one-time thing. It’s onboarding: I give you initial training, essentially. What I see in aviation, and I’m assuming the Navy is exactly the same, if not even more so, is continuous training. So even if there was a near miss, as an example, if it gets to a certain threshold, you’re going to run through simulations that recreate what happened to somebody else at some point. Tell me a little bit about that, because that refresher piece to me is really key to focus and learning, but also to not getting complacent.

Sure. What’s interesting about that is that commercial airlines have to train their pilots; they come through a training center for simulator training every nine months. I think most non-aviation people would be blown away to know that their pilots go through training every nine months, two days of training, to go through what we would call non-normal, non-routine scenarios and even extreme scenarios. So that in the event it ever actually happened, they would have some muscle memory, some procedural recall to overcome the amygdala hijack and the startle effect, because you have to override the fight, flight, or freeze response. Aviation is great at that. Continuous training is really important. It drives the point home. There’s always something that you can learn from it. I think that aviation philosophy is spreading. I know that the health industry and surgical units are taking on board the idea of aviation briefs, debriefs, and checklists to ensure things are done correctly. I know that there are a lot of industries that do that. But think about straight-up corporate America. I’ll just take some finance organization as a hypothetical, right? Maybe not high risk, but they do continual training where they are talking about diversity, equity, and inclusion, sexual harassment, and those things that have to be continually brought back to the forefront of the mind, to ensure that people are not necessarily continually thinking about it, but are trained to a level of awareness that is important.

I think the concept of continual training, as long as it’s refreshed, is an important piece, because you can’t just play the same video as a year ago; nobody will pay attention, and it actually has a negative effect. But using real-world examples in your scenarios, which breeds transparency and lends credibility, where everybody can learn from something that happened in your organization, that’s the best way to continue the training for any organization.

Phenomenal topics. I think the element of training you describe, at least for high-risk roles, is important in terms of doing that refresh. Then the other element is the scenarios: working through scenarios where something went wrong, as opposed to just getting an email saying, hey, so-and-so had this issue, and this is how they dealt with it. Making it recurrent training and walking through different scenarios, I think, is key.

Eric, I have found that facilitated experiential training, even if it’s scenario-based, trumps computer-based training, and certainly emails, all day, every day. People will generally respond to that in-person, facilitated, roundtable experiential training, because now they’re learning from each other and sharing their stories. Once you get people sharing their stories, we’re good at that in aviation, right? There I was… But there’s a tremendous amount of learning that takes place in those “there I was” types of scenarios.

You touched on something briefly a few minutes ago around safe today, not tomorrow. Tell me a little bit more, because I think that is something many organizations struggle with. Because in safety, often there’s an absence of a leading indicator that tells you when deviation is starting to be normalized in the process. Tell me briefly about what you mean by safe today, not tomorrow.

Yeah. So, you could have a level of training that is degraded due to budgetary constraints. And at the end of a quarterly result, you could say, well, our safety record is still 100%, and we reduced our budget; therefore, we’re training to the proper level. And so, maybe we could cut a little bit more and save some more money on training to help our bottom line. I have worked with companies where I have seen that happen. And you can hear the rumblings among the workers that are actually performing the high-risk jobs. As soon as that happens, you know that you have a gap, and you need to listen to them to find out what they need. The answer that this is good enough, or it hasn’t happened, therefore we justify the budget cut to the training department or to learning and development, that’s a normalization of deviation. Just like the space shuttle: the rocket booster had had an O-ring leak 14 times, but it had never exploded. Therefore, the risk of explosion was minimized when, in fact, that was not the case. Just because you flip the coin 10 times and it lands on heads doesn’t mean it’s going to land on heads the 11th time.

The risk is the same on the 15th launch. And that’s when the O-ring failed, even though there were people screaming about that problem. I’m sure you’ve analyzed that a lot. But with that normalization of deviation, you have to step back and make sure that you’re not falling into the cognitive bias traps: plan continuation bias, overconfidence bias, expectation bias, where it’s worked before, therefore it will continue working. I think as leaders, we have to step back and go, okay, where is our risk? Have we cut back too far? What’s the risk to the operation? And if you want to know where the risk to the operation is, go talk to the operators. They’ll tell you exactly where the risk to the operation is.

I think it’s a really important point, because it’s not that you can’t save money, but you’ve got to save money in the right places. So, it’s not that you have to be the highest-cost operator, but the flip side is that the lowest-cost operator isn’t necessarily the answer. Because I’ve heard somebody say, well, in this particular industry, the lowest-cost operator is the safest. And I’m like, but that’s a correlation; that doesn’t mean it’s causality. It just means maybe they’ve got very good operational discipline and are good at it. They may be lower cost because of that operational discipline, and they’re tighter on safety. But you can also arrive at the lowest cost through cost-cutting, and we know what goes horribly wrong with that.

Yeah, absolutely. That’s causality: they try to tie two things together that aren’t actually related. And on that piece, I would tell the leaders that are listening to the podcast to go to those same operators and say, where can we cut costs? What do you recommend? Where’s the excess? They’ll tell you. They’ll tell you what they need, and they’ll tell you what they don’t need, if the leader is willing to listen anyway.

So, tell me about your book, Full Throttle: From the Blue Angels to Hollywood Stunt Pilot. Tell me a little bit about why somebody should pick it up.

That book. Yeah. Well, I appreciate the book plug. I have had a very fortunate career, as we have talked about here. When I got asked to fly as a stunt pilot in Maverick, the most common question was, how did you get to do that? And over the course of my career: how did you get to fly F-18s? How did you get to fly for the Blue Angels? How did you get to go on five combat tours? How did you get to stand up the first stealth fighter squadron in the Navy? How did you get to fly for Maverick? I got asked that enough that I had to boil it down to really three things. I say yes to opportunity, because saying yes opens doors; I am not afraid to learn from my errors; and I ask for help. I talk about embracing failure. It’s really about embracing mistakes and failures, letting go of your ego, and being willing to learn. On that same NASA subject, since we were talking about the normalization of deviation with the Challenger: I actually applied to NASA once, and all my friends said, Intake, you’re never going to be an astronaut.

You’re not a test pilot. You don’t have an engineering degree. You’re an accountant. It’s never going to happen. I said, well, let me put it this way. NASA is never going to call me out of the blue and offer me a position to be an astronaut. So, I have nothing to lose. All they can do is bring good news by saying, you’ve been selected to be an astronaut. Because if they say, no, you’re not going to be an astronaut, I’m already not an astronaut.

Right. So that’s been my philosophy. And then my dad was really the inspiration. He was a submariner in the Navy. And he always said, Scott, your stories about naval aviation are just outrageous. You should write a book. My dad is still with us. He turns 86 this year, but he’s still got the rage. I thought, you know what? I am going to put pen to paper, and I’m going to tell my journey, from having watched Top Gun as an 18-year-old in 1986 to flying as a stunt pilot in the sequel 33 years later, and share that journey. And hopefully, people of all ages will find it inspirational, but also maybe take a tool and a life lesson from the book as well.

Excellent. Well, Scott, thank you so much for coming on the show, sharing your experience from the Navy, from aircraft carriers, Blue Angels, to now being a commercial pilot, and your recent book. I really appreciate the time you took with us. These are really great insights in terms of building a good discipline from a very early stage. Thank you.

Appreciate it, Eric. Thanks for having me on.

Thank you.

Thank you for listening to The Safety Guru on C-Suite Radio. Leave a legacy. Distinguish yourself from the pack. Grow your success. Capture the hearts and minds of your teams. Elevate your safety. Like every successful athlete, top leaders continuously invest in their safety leadership with an expert coach to boost safety performance. Begin your journey at execsafetycoach.com. Come back in two weeks for the next episode with your host, Eric Michrowski. This podcast is powered by Propulo Consulting.

The Safety Guru with Eric Michrowski

More Episodes: https://thesafetyculture.guru/

C-Suite Radio: https://c-suitenetwork.com/radio/shows/the-safety-guru/

Powered By Propulo Consulting: https://propulo.com/

Eric Michrowski: https://ericmichrowski.com

ABOUT THE GUEST

Scott Kartvedt was the Navy’s first Commanding Officer of the only F-35C stealth strike fighter squadron in the US inventory, Strike Fighter Squadron ONE ZERO ONE, based at Eglin AFB, Florida. He also commanded an F/A-18 Hornet squadron during two combat deployments to Afghanistan in support of Operation ENDURING FREEDOM. While he led the 250 Sailors of VFA-83, the unit was awarded the 2009 Commander, Naval Air Forces Aviation Battle Efficiency Award, the CAPT Michael J. Estocin Award as the Navy’s Strike Fighter Squadron of the Year, and the 2010 CNO Safety Award.

Scott is currently a professional pilot and on the Board of Directors for the Blue Angels Foundation. He is an instructor and evaluator for United Airlines in Denver, Colorado, the number 5 pilot for the Patriot Jet Team, the only civilian jet demonstration team in North America, and was a stunt pilot in Top Gun: Maverick. He is also the Founding Partner of High-Performance Climb, a privately held consulting company. Scott shares his executive leadership, risk management, and safety mitigation experience gained during extensive combat operations through inspirational keynotes and workshops with clients worldwide.

For more information:

https://scottkartvedt.com

https://www.blueangelsfoundation.org


STAY CONNECTED

RELATED EPISODE

EXECUTIVE SAFETY COACHING

Like every successful athlete, top leaders continuously invest in their Safety Leadership with an expert coach to boost safety performance.

Safety Leadership coaching has been limited, expensive, and exclusive for too long.

As part of Propulo Consulting’s subscription-based executive membership, our coaching partnership is tailored for top business executives that are motivated to improve safety leadership and commitment.
Unlock your full potential with the only Executive Safety Coaching for Ops & HSE leaders available on the market.
Explore your journey with Executive Safety Coaching at https://www.execsafetycoach.com.

Lessons from Aviation Safety: Human Factors with Dr. Suzanne Kearns


LISTEN TO THE EPISODE: 

ABOUT THE EPISODE

To err is human… but to learn and identify organizational ‘accidents waiting to happen’ is more critical than to lay blame. In this episode, we’re TAKING OFF with Dr. Suzanne Kearns, Associate Professor of Aviation at the University of Waterloo and Founding Director at the Waterloo Institute for Sustainable Aeronautics. Suzanne shares critical insights in Human Factors and Aviation Safety to help listeners explore how aviation has been able to make substantial improvements in safety and, more importantly, how it can help every organization improve safety performance.

READ THIS EPISODE

Real leaders leave a legacy. They capture the hearts and minds of their teams. Their origin story puts the safety and well-being of their people first. Great companies ubiquitously have safe yet productive operations. For those companies, safety is an investment, not a cost for the C-suite. It’s a real topic of daily focus. This is The Safety Guru with your host, Eric Michrowski. A globally recognized ops and safety guru, public speaker, and author. Are you ready to leave a safety legacy? Your legacy’s success story begins now.

Hi, and welcome to The Safety Guru. Today I’m very excited to have with me Dr. Suzanne Kearns. She’s a professor of aviation at the University of Waterloo, specializing in safety training methodologies and human factors, where she explores a lot of topics around human limitations and how they contribute to accidents and incidents. A former airplane and helicopter pilot, she’s also the founding director of the Waterloo Institute for Sustainable Aeronautics, which we’ll talk about shortly. So first, Suzanne, tell me a little bit about your story, your background, and your passion for safety.

Thank you, Eric. Well, it’s a pleasure, first of all, to be here. And I can tell you that I didn’t have dreams or aspirations when I was young of being a professor. I just loved aviation. I was fascinated by flight in general. I still do think it’s magical, even though I understand the science behind it. There’s just something captivating about it. And so, I grew up flying airplanes and helicopters, starting when I was 15. I flew helicopters in North Bay, Ontario, doing some really fun flying in the bush, where you actually used chainsaws and built your own landing pads. It was quite rugged.

And at that time, because in Canada piloting hasn’t always been a university-level discipline (it’s more of a college-level discipline), I had just finished a college diploma and was looking for a university education. So, I actually went down to Embry-Riddle Aeronautical University in Daytona Beach, Florida, and finished my bachelor’s degree. And at the end of that, two really big, life-altering things happened. A colleague on campus was tragically killed in a training accident and, within a matter of months, 9/11 happened. It really shook the foundation of who I was and my dreams: the idea that the industry you love, and that you find inspiration and excitement and passion in, was used to cause so much pain and devastation and widespread hurt around the world.

It really did cause me to rethink what my path would be. And so, I went back and earned my master’s degree in human factors, which is kind of like the science of why people make mistakes. I came back to Canada, and that’s when I started as a professor and earned my PhD in education.

Excellent. Well, thank you for coming here to share a little bit about your story and some of your background. One of the pieces I’d like to touch on first is the linkage between some of the principles of safety that we talk about in the aviation world versus what happens on the ground, because I think in many ways the aviation space is probably the most advanced when it comes to safety and really understanding the natural human limitations that we have.

Yeah. Well, I can tell you a little bit of the history of how we’ve gotten to where we are today in aviation with our understanding of safety. If you look back to the 70s and 80s, there was this culture where if a pilot made a mistake and survived an accident, they would be fired and removed from the industry. Everybody’s response was finger-pointing at that person: how could they have possibly made such a terrible mistake?

Which, of course, has devastating impacts, because not only is there the pain of the accident, but that individual is also going to experience a lot of trauma from it. What we learned over time was that, number one, in the late 1970s and early 80s there was a series of very high-profile aviation accidents that were primarily caused by pilot error. And it really challenged the industry to ask: how is this possible? How could such an extensive network of intelligent, dedicated people make such obvious mistakes?

Eastern Airlines, for one, is probably the most obvious example.

Yeah. It was just a faulty light bulb, which consumed so much of their attention that they didn’t realize they had disengaged the autopilot, and they flew their aircraft into the ground. And so, these kinds of things challenged the industry. What happened next was a really amazing example of government, academia, and industry coming together to say, what can we do about this? They created the first human factors training program, which is now called Crew Resource Management training, or CRM, meant to teach pilots about human limitations. But that’s only one part of it, because that still puts a lot of the focus of blame on the individual, and it doesn’t ask broader organizational questions.

Is it really that person’s fault if they have faulty equipment, or didn’t receive training, or have been on a schedule that’s so impossible any human would be tired or exhausted? So, the focus also shifted at the same time. We have human factors, but we also have the organizational approach to safety, which looks at the entire organization from the top all the way to the bottom, making sure that everybody is identifying areas of risk and eliminating them before an accident happens.

So, it’s not just about the pilot, the end user. It’s about everybody who contributed to that accident or that flight on a particular day. And I think there are a lot of parallels and a lot of learnings from that space that could definitely be translated into other environments. I know you’ve done some work on ground safety, I believe on the maintenance side of aviation. What are some of the parallels that you saw when you were translating principles from human factors to workers on the ground who could be exposed to hazards?

Absolutely. Well, I think what is universally true is that we’re all human beings. The same types of limitations that one experiences as a pilot apply to a maintenance engineer or an airside worker. These are all the same basic issues, because they’re all about our natural bodies and minds. When I’m explaining this to my students, I always say: if somebody makes a mistake, and you pull that person out and put any other random person in with the same type of background and experience, and it’s feasible that the new person might make that same mistake, then we really need to question whether it’s fair to put all of our blame on that individual.

We really need to look at the task, the environment, and the situation. But what I did find in translating it is that you have to articulate it carefully, because this is such an emotionally impactful and sometimes challenging issue. If you don’t articulate it correctly, it sounds like you’re questioning a person’s competency or questioning their commitment to their job, when in reality what you’re saying is that we’re all people, and our limitations can be scientifically predicted and tracked. So why don’t we learn all of that information and take it in before it leads to an accident?

But it does require us to make sure that core message is wrapped in something that is true to the job role, using the right language and examples for that role.

That makes a lot of sense. So, tell me about the importance of education when it comes to safety. 

Yeah. Well, I’m a big believer in education; the focus of my world is in trying to support the next generation and teach them as best I can to support the future of our industry. That being said, as much as I love teaching, and some of my most exciting and powerful professional experiences have been in classrooms as a teacher, education is not always the best way to eliminate risk in an organization from a human factors perspective. If you change the task that people are doing, the equipment that they’re using, or the operational environment (noise, temperature, distractions), a lot of those things are, I think, universally easier ways to eliminate risk.

And sometimes I think we fall back to using education as a default when it’s too challenging or expensive to change some of those bigger structures, and so we try to solve a problem by throwing a few hours of training at it. But I think that really does offload some of the responsibility onto the workers. And we have to question, and always be really careful about, whether that is ethical and fair, or whether we’re really putting our priority on the appearance that we’ve done something rather than investing our best effort to actually reduce that risk.

I think that’s an important point about the hierarchy of controls, really: eliminating the hazard that could be present as opposed to trying to train the individual to manage it.

Yes. Exactly. And the reality is, we know from a human factors perspective that training is one of the tools in your toolbox that you can use to support big organizational change and improvement to make things safer. But it’s not the only thing, and sometimes it’s the most expensive one, with substantial ongoing costs over a long period of time. You can imagine, for example, if we’re looking at cars: we all know that texting and driving is very dangerous and that nobody should ever do it. But think of how much energy and effort has gone into teaching teenagers not to do it.

It’s so dangerous, you should never do it. But if there were a way for the cell phone itself to just disengage while it’s in a car, for example, then that equipment shift eliminates the risk. Something quite simple like that could have a sweeping, widespread effect. Obviously, there are other implications in that example, but I think it’s a much more effective way to eliminate the risk of that one situation than putting the emphasis on the people through training.

This is similar to a lot of apps in cars that will typically stop working, or stop allowing themselves to be managed or controlled, when the vehicle is in motion. You could do the exact same thing with a texting device.
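To make the contrast with training-based fixes concrete, here is a minimal, purely hypothetical sketch of that kind of engineering control in Python: the device itself refuses the risky action rather than relying on the user’s discipline. The function name and speed threshold are invented for illustration.

```python
# Hypothetical engineering control: block texting whenever the device
# detects it is in a moving vehicle, instead of training users not to text.

MOTION_THRESHOLD_KMH = 8.0  # below this, GPS jitter is treated as "not moving"

def texting_allowed(speed_kmh: float, paired_to_vehicle: bool) -> bool:
    """Return True only when texting poses no driving risk."""
    if not paired_to_vehicle:
        return True  # not in a car at all, e.g. walking with the phone
    return speed_kmh < MOTION_THRESHOLD_KMH

# The control removes the hazard regardless of how well-trained the user is:
print(texting_allowed(0.0, True))    # parked car
print(texting_allowed(60.0, True))   # highway speed
```

The point of the sketch is the hierarchy-of-controls idea from the conversation: a few lines of enforcement in the equipment substitute for unbounded hours of behavioural training.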

Exactly. And of course it’s a simple example, but if you think of the parallels to aviation, it’s still very true that it’s such a heavily regulated industry, and so we’re always trying to provide the evidence the regulators require: I’ve completed 5 hours of training on this, you’ve demonstrated that you’ve taken that action. But I’ve had some really interesting talks with international regulators at the highest level about this hours metric for training, because in aviation it’s always based on hours: you have to do 5 hours of this or 10 hours of that.

And I said, why hours? Everyone knows it’s not the hours that make an impact; it’s what happens during that time, the experience and learning. And he said, okay, I’ll tell you a secret. We know that. But, he said, the reality is, put yourself in my shoes. If I’m an international regulator and our safety board has identified some sort of safety deficiency, the most obvious and direct thing that I can do is to throw a few hours of training at the problem, because it shows that we’ve made an effort to address it.

But, he said, even I know that it’s not going to fully eliminate that risk. For me, that was mind-blowing, because when you love aviation, you grow up aligning everything under these regulations, and when you come to learn that they’re made by people as well, and people aren’t perfect and are just doing the best they can in challenging situations, it does allow you to refocus and, I think, question whether there’s an opportunity to do things even better.

Great. One of the topics you talked about earlier, when you were discussing human factors, was crew resource management and how that got cascaded. For listeners who aren’t familiar with crew resource management, and with how human factors get trained or taught to pilots, can you give a bit of a highlight of the core principles, so people can think about how we could translate them to on-the-ground examples?

Yeah. Absolutely. So, crew resource management, as it is today, is required annual training for almost all pilots in the civilian world, and it has a few core components. That includes things like workload management: in our world it’s fly the plane first, so aviate, then navigate, then communicate. It’s task prioritization. So, workload management is number one. Situation awareness is number two, and situation awareness, in your operational setting, is your mental picture of everything around you. And people may be shocked.

But one of the most common categories of accidents is called controlled flight into terrain. That’s flying a perfectly good airplane into the ground, which is a result of a lack of situation awareness. And another very big one is communication and crew coordination: how you talk to and use the resources around you, including the technology, but also all the people in the aircraft, air traffic controllers, and other supporters on the ground. So those are some of the big categories, but it’s based on a very robust and deep interdisciplinary field of research, which maybe doesn’t mean a lot to people.

But I can tell you that when I’m teaching human factors, I don’t teach, like, a list of memorizations. New pilots will learn something called the IMSAFE checklist, where before they go flying they should check illness, medication, stress, alcohol, fatigue, and eating: am I okay in all these categories? That’s what most pilots know at the very beginning. But when I teach it at the university, it’s built on a foundation of your natural human limitations. So, it’s some psychology and thinking: how much information can you reasonably be expected to retain at any one point in time?

And when is any person going to start making mistakes? It’s your senses: how you take in information, how your senses can be tricked and distorted, and how you can’t always trust them. It’s anthropometry, which is the measurement of the human body, because in an aircraft all of the controls have to be within a certain reach for humans. And it’s the limitations of work: when anybody in the world would be expected to become tired and start making mistakes, whether due to a lack of sleep or just a prolonged period of mental or physical work.

And we also get into some issues around things like mental health and substance abuse, because those are also very human things that affect our whole population. There are a lot of other factors I’m probably missing, but that’s how we build up the foundational building blocks. And if I have the students take away one thing, it’s that to err is human: you shouldn’t expect people to never make mistakes. It should be the exact opposite. You should expect that it’s 100% normal for even the most competent professionals to make mistakes.

And if you start from that foundation, then you can build up to say: where are those mistakes most likely to happen? And how can I manage to capture them before they have an impact?

This episode of The Safety Guru podcast is brought to you by Propulo Consulting, the leading safety and safety culture advisory firm. Whether you are looking to assess your safety culture, develop strategies to level up your safety performance, introduce human performance capabilities, re-energize your BBS program, enhance supervisory safety capabilities, or introduce unique safety leadership training and talent solutions, Propulo has you covered. Visit us at Propulo.com.

That’s excellent. So, you’ve talked about human factors. How does that contrast with safety management systems as they pertain to the aviation world?

Sure. Yeah. This is probably one of the biggest confusions in the aviation world: even career professionals sometimes don’t know the difference between human factors and safety management. Human factors, like I said already, is a scientific discipline about why people make mistakes. It draws on a little bit of psychology, ergonomics, physiology, all of these scientific foundations, and education is a really big piece as well. And that leads up to crew resource management, which is where we teach operators about all of these limitations and give them some strategies for how to avoid them, how to work together, and how to avoid error.

And that’s one category. The second category is the organizational factors associated with safety, which most people in aviation most commonly know through what we call Reason’s model. Reason’s model has layers of protection: you can imagine five squares, but each one of them has holes in it, so you can think of them as layers of Swiss cheese. Each layer has its own holes, and each layer represents a level of the organization, from the highest-level senior management, and then, as you work forward, the training managers.

And the far layer is the actual operators, like the pilot. The concept is that the holes in the layers represent latent failures. They’re like accidents waiting to happen, whether it’s management not investing in training, or a maintenance engineer who has a poor practice and is making a mistake over and over, or whatever it happens to be: these are opportunities for accidents. And it’s only when the holes in all those layers line up perfectly that an accident happens. So, the concept is that the accident itself is actually quite rare.

Instead of focusing all of our attention on the accident, which is what we had historically done (fire the crew), which doesn’t address all those holes, because those risks are still in the organization, the concept of safety management systems at its core is the identification and elimination of those latent failures before they have an opportunity to line up and cause an accident. So, it’s a proactive rather than reactive approach to aviation safety.

Alright. So essentially reducing the probability of those holes lining up. 

Yeah. And human factors play in because human factors can create those holes throughout the whole system, so that’s one of the ways we can reduce them. But human factors can’t address everything, because, like I said, if there are high-level managerial decisions that are affecting every part of the operation and the equipment, then no matter how hard the pilot at the end tries to do their best, tries to be as professional and safe as possible, they don’t have control over those other factors and will be influenced by them regardless.
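For readers who like numbers, the “holes lining up” intuition can be sketched with a toy probability model in Python. If each defensive layer fails independently with some small probability, an accident requires every layer to fail at once, so shrinking any one hole shrinks the overall risk multiplicatively. The layer names and probabilities below are invented purely for illustration, and real-world failures are rarely independent.

```python
# Toy model of Reason's "Swiss cheese" layers: an accident needs the
# holes in every layer to line up simultaneously.
from math import prod

layers = {
    "senior management": 0.05,  # chance this layer's defences fail (illustrative)
    "training":          0.04,
    "maintenance":       0.03,
    "flight crew":       0.02,
}

p_accident = prod(layers.values())  # all holes line up at once
print(f"P(accident) = {p_accident:.2e}")

# Eliminating one latent failure (shrinking one hole) cuts the product directly:
layers["training"] = 0.01
print(f"After fixing the training hole: {prod(layers.values()):.2e}")
```

Even though each individual hole is fairly common in this toy example, the joint event is rare, which is exactly why the accident itself is a poor signal and why safety management systems hunt for the latent failures instead.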

Excellent. So, I’d love to talk a little bit about the work you’ve done at Waterloo, which is really interesting. As I mentioned on the front end, you’re the founding director of the Waterloo Institute for Sustainable Aeronautics. Tell me a little bit about what this Institute does and what you’re trying to accomplish by bringing all of this expertise together.

Thank you. Well, I’m really excited, because our Institute will be launching on October 5, 2021, and it’s the product of years of work. What really led to the Institute for me was when the pandemic hit. In my field, like so many others, so many people were out of work: alumni, friends, and colleagues. Just tremendous devastation. And when I saw so many of my colleagues who were 100% focused on just the survival of their organizations through the pandemic, I started questioning: what can I do to support this sector that I care about?

I’m an academic at a university. I can’t necessarily impact business decisions, but I kept questioning what I could contribute that would be of value during this time, and reflecting on the big challenges that aviation was facing before the pandemic, which I sort of defined as widespread personnel shortages on an international scale, growing environmental emissions (if you remember, the Greta Thunberg flight-shaming movement was really just growing when the pandemic hit), as well as the rapid evolution of technology. And when you think of those three things, they really align with the three pillars of sustainability: social, environmental, and economic.

At the same time, I also saw that other big aviation universities around the world were actively recruiting in areas where Waterloo already has world-leading experts. They were looking for experts in AI, cybersecurity, and everything in between. And it really hit me that we could have a tremendous impact in setting aviation and aerospace up for a more sustainable future after the pandemic if I could mobilize the strengths already at the university and direct the powerhouse that is Waterloo towards these big challenges, which I knew would have a direct impact on the people in the industry that I cared about.

So that’s how WISA came to be. As it stands now, we have about 40 to 45 different professors, as well as their labs and grad students. We have a really distinguished advisory committee, with Commander Chris Hadfield as an honorary advisor, some amazing international advisors, and some industry partners who are coming on board. And really what it’s meant to do is form a bridge between aviation and aerospace and the university. If industry partners have challenges or problems, they can work with university professors to address them.

And the beautiful part of that is that, in the process, they’re educating graduate students who then go on to support Canada’s knowledge economy and become future leaders. So, we’re just getting started, but we’ve got a lot of excitement about what comes next.

And when we talked earlier, you mentioned some examples of linking expertise: bringing engineering to the table, machine learning. Can you give some examples of how these themes come together and the power that brings?

Yeah. What’s different, I think, about universities as opposed to industry is that industry problems are very multi-dimensional; lots of different skill sets come together to create the problem. But when you compare that to a university environment, professors have incredibly high levels of expertise in very narrow areas, and they live within pockets of their own discipline. You have psychology in one pocket, engineering in another, and health sciences in another, for example. What we’re hoping to do with the Institute is break down those silos and allow a connection between the different disciplines on campus.

So as an example, I’m working with a few colleagues right now, and we’re looking at how pilots are trained, because, as I mentioned earlier, there are distinct personnel shortages projected internationally. So how do we address that? I have one colleague in cognitive psychology looking at the process of how people take in information and learn. Another, in kinesiology, looks at hand-eye coordination and the development of those skills. Another is in optometry, so she’s looking at how your eyes move across your environment and take in information, and whether that can be an indication of expertise.

And another, in engineering, works on a form of machine learning, artificial intelligence. So, if you could feed all those data points into a computer, then when a new person comes in and flies, could the computer automatically assess their performance and tell them where they’re strong and where they need to focus to improve their skills, by comparing them to a big database of others? And I think the really exciting part is that if we were able to do that effectively, you could then justify to the regulators moving more training out of aircraft flying in the real world and into simulators, and that has distinct environmental benefits: far fewer emissions.

It also saves young people money, because training becomes far more customized to what they need, so they’re paying for fewer hours, and simulators are usually cheaper than the airplane. So you’re hitting the economic, the social, and the environmental improvement when you see this beautiful, magical mix of all these disciplines coming together to address a problem.
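As a rough illustration of the automated-assessment idea described above, here is a hypothetical Python sketch that compares a new trainee’s metrics against a reference pool of other pilots using simple z-scores and lists the weakest areas first. The metric names, scores, and method are assumptions for illustration only, not the Institute’s actual approach.

```python
# Hypothetical sketch: rank a trainee's flight metrics against a reference
# database, weakest areas first, so training can be customized to them.
from statistics import mean, stdev

reference = {  # toy per-metric scores from prior trainees
    "altitude_hold": [82, 75, 90, 88, 79, 85],
    "glidepath":     [70, 74, 68, 80, 77, 72],
    "radio_calls":   [90, 92, 88, 95, 91, 89],
}

def assess(trainee: dict) -> list:
    """Return (metric, z-score) pairs sorted weakest-first vs the reference pool."""
    zs = {}
    for metric, scores in reference.items():
        mu, sigma = mean(scores), stdev(scores)
        zs[metric] = (trainee[metric] - mu) / sigma
    return sorted(zs.items(), key=lambda kv: kv[1])

trainee = {"altitude_hold": 84, "glidepath": 62, "radio_calls": 93}
for metric, z in assess(trainee):
    print(f"{metric}: z = {z:+.2f}")
```

The design choice mirrors the conversation: rather than prescribing a flat number of hours, the comparison surfaces where this individual actually deviates from the pool, so simulator time can target the weak metrics.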

That’s excellent. And when I hear your story, I think what you’re trying to drive with this multidisciplinary view really can be used in other sectors and other environments, to start meshing different levels of expertise to address safety challenges across the board.

100%.

And I think safety is such a perfect illustration of this, because, just like education, it’s not one thing. It’s not one discipline, right? You can’t create a perfectly safe system looking only at it from the perspective of psychology. It’s almost like that’s one important piece of the puzzle, but until you identify and link the other pieces together, you don’t have the full picture.

I think that’s an incredibly important message in terms of that multidisciplinary view of driving things forward. Suzanne, thank you so much for the work you’re driving around safety in the aviation space. As somebody who used to fly a lot, I appreciate that part as well, but more importantly, thank you for coming on our podcast to share your story and some ideas. I think there are some really great examples and illustrations that people can take from what’s being done in the aviation space and translate into that learning organization, thinking about us as humans.

Where are we going to make a mistake? Because it’s bound to happen. So, I really appreciate you coming to share your story.

Thank you very much. 

Thank you for listening to The Safety Guru on C-Suite Radio. Leave a legacy. Distinguish yourself from the pack. Grow your success. Capture the hearts and minds of your teams. Fuel your future. Come back in two weeks for the next episode, or listen to our sister show with the Ops Guru, Eric Michrowski.

The Safety Guru with Eric Michrowski

More Episodes: https://thesafetyculture.guru/

C-Suite Radio: https://c-suitenetwork.com/radio/shows/the-safety-guru/

Powered By Propulo Consulting: https://propulo.com/

Eric Michrowski: https://ericmichrowski.com

ABOUT THE GUEST

Dr. Suzanne Kearns is an Associate Professor of Aviation at the University of Waterloo and an internationally recognized leader in aviation education research. She earned airplane and helicopter pilot licenses at the age of 17, earned advanced aeronautical degrees from Embry-Riddle Aeronautical University, and began working as an aviation professor upon graduation at the age of 24. In the 16 years since, she has taught and mentored thousands of aviation students and written or co-authored six books printed in multiple translations (including Competency-Based Education in Aviation, Fundamentals of International Aviation, and Engaging the Next Generation of Aviation Professionals). She has received several awards for her research and educational works, frequently delivers invited keynote addresses at international conferences, and holds leadership positions with several international aviation organizations. In 2021 she founded the Waterloo Institute for Sustainable Aeronautics (WISA), which she leads as its Director.
