Lessons from Aviation Safety: Human Factors with Dr. Suzanne Kearns
LISTEN TO THE EPISODE:
ABOUT THE EPISODE
To err is human… but to learn and identify organizational ‘accidents waiting to happen’ is more critical than to lay blame. In this episode, we’re TAKING OFF with Dr. Suzanne Kearns, Associate Professor of Aviation at the University of Waterloo and Founding Director of the Waterloo Institute for Sustainable Aeronautics. Suzanne shares critical insights into Human Factors and Aviation Safety to help listeners explore how aviation has been able to make substantial improvements in safety and, more importantly, how it can help every organization improve safety performance.
READ THIS EPISODE
Real leaders leave a legacy. They capture the hearts and minds of their teams. Their origin story puts the safety and well-being of their people first. Great companies ubiquitously have safe yet productive operations. For those companies, safety is an investment, not a cost, for the C-suite. It’s a real topic of daily focus. This is The Safety Guru with your host, Eric Michrowski, a globally recognized Ops and safety guru, public speaker, and author. Are you ready to leave a safety legacy? Your legacy success story begins now.
Hi, and welcome to The Safety Guru. Today, I’m very excited to have with me Dr. Suzanne Kearns. She’s a professor of aviation at the University of Waterloo, focused on safety, training methodologies, and human factors, where she explores a lot of topics around human limitations and how they contribute to accidents and incidents. A former airplane and helicopter pilot, she’s also the founding director of the Waterloo Institute for Sustainable Aeronautics, which we’ll talk about shortly. So first, Suzanne, tell me a little bit about your story, your background, and your passion for safety.
Thank you, Eric. Well, it’s a pleasure, first of all, to be here. And I can tell you that I didn’t have dreams or aspirations when I was young of being a professor. I just loved aviation. I think I was fascinated by flight in general. I sort of still do think it’s magical, even though I do understand the science behind it. There’s just something captivating about it. And so, I grew up flying airplanes and helicopters, starting when I was 15. I flew helicopters out of North Bay, Ontario, doing some really fun flying in the bush, where you’d actually use chainsaws and build your own landing pads. Quite rugged.
And then, at that time, because in Canada piloting hasn’t always been a university-level discipline, it’s more so a college-level discipline, I had just finished a college diploma and was looking for a university education. So, I actually went down to Embry-Riddle Aeronautical University in Daytona Beach, Florida, and finished my bachelor’s degree. At the end of that, two really big, life-altering things happened. A colleague on campus was tragically killed in a training accident, and simultaneously, within a matter of months, 9/11 happened. It really shook the foundation of who I was and my dreams, the idea that the industry that you love, that you find inspiration and excitement and passion in, was used to cause so much pain and devastation and widespread hurt around the world.
It really did cause me to rethink what my path would be. And so, I went back and earned my master’s degree in human factors, which is kind of like the science of why people make mistakes. I came back to Canada, and that’s when I started as a professor and earned my PhD in education.
Excellent. Well, thank you for coming here to share a little bit about your story and some of the background. And I think one of the pieces I’d like to touch on first is really the linkage between some of the principles of safety that we talk about in the aviation world versus what happens on the ground, because I think in many ways the aviation space is probably the most advanced when it comes to safety and really understanding the human and natural limitations that we have.
Yeah. Well, I can tell you a little bit of the history of how we’ve gotten to where we are today in aviation with our understanding of safety. And I think what’s important to understand, if you look back to the 70s and 80s, is that there was this culture where if a pilot was to make a mistake and survive an accident, they would be fired and removed from the industry. Everybody’s response is finger-pointing at that person: how could they have possibly made such a terrible mistake?
Which, of course, has devastating impacts, because not only is there the pain of the accident, but that individual is also going to experience a lot of trauma from it. What we learned over time was that, number one, in the late 1970s and early 80s, there were a series of very high-profile aviation accidents that were primarily caused by pilot error. And it really challenged the industry to say, how is this possible? How could such a network of intelligent, dedicated people make such obvious mistakes?
Eastern Airlines, for one, is probably the most obvious example.
Yeah. A faulty light bulb captured so much of the crew’s attention that they didn’t realize they had disengaged the autopilot, and they flew their aircraft into the ground. These kinds of things challenged the industry. So, what happened was a really amazing example of government, academia, and industry coming together to say, what can we do about this? And they created the first human factors training program, now called Crew Resource Management training, or CRM, meant to teach pilots about human limitations. But that’s only one part of it, because that still puts, I think, a lot of the focus of blame on the individual, and it doesn’t ask broader organizational questions.
Is it really that person’s fault if they have faulty equipment, or didn’t receive training, or have been on a schedule that’s impossible, where any human would be tired or exhausted? So, the thinking also shifted at the same time. We have human factors, but we also have the organizational approach to safety. What this does is look at the entire organization, from the top all the way to the bottom, and make sure that everybody is identifying areas of risk and eliminating them before an accident happens.
So, it’s not just about the pilot at the end. It’s about everybody who contributed to that accident or that flight on a particular day. And I think there are a lot of parallels and a lot of learnings that come out of that space that could definitely be translated into a lot of other environments. I know you’ve done some work on ground safety, I believe on the maintenance side of aviation. What are some of the parallels that you saw when you were translating principles from human factors to workers on the ground who could be exposed to hazards?
Absolutely. Well, I think what is universally true is that we’re all human beings. And so, the same types of limitations that one experiences as a pilot apply to a maintenance engineer or an airside worker. These are all the same basic issues because they’re all about our natural bodies and minds. So, when I’m explaining this to my students, I always say: if somebody makes a mistake, and you pull that person out and put any other random person in with the same type of background and experience, and it’s feasible that the new person might make that same mistake, then we really need to question whether it’s fair to place all of our blame on that individual.
We really need to look at the task, the environment, and the situation. But what I did find in translating it is that you have to articulate it carefully. This is such an emotionally impactful and sometimes challenging issue, because if you don’t articulate it correctly, it sounds like you’re questioning a person’s competency or questioning their commitment to their job, when in reality, what you’re saying is that we’re all people, and our limitations can be scientifically predicted and tracked. So why don’t we learn all of that information and take it in before it leads to an accident?
But it does require us to make sure that core message is wrapped in something that is true to the job role, using the right language and examples for that role.
That makes a lot of sense. So, tell me about the importance of education when it comes to safety.
Yeah. Well, I’m a big believer in education. The focus of my world is in trying to support the next generation and teach them as best I can to support the future of our industry. That being said, as much as I love teaching, and I think some of my most exciting and powerful experiences professionally have been in classrooms as a teacher, education is not always the best way to eliminate risk in an organization from a human factors perspective. If you change the task people are doing, if you change the equipment they’re using, or the operational environment, the noise, temperature, distractions, a lot of those things are, I think, universally easier ways to eliminate risk.
And sometimes I think we fall back on education as a default because it’s too challenging or expensive to change some of those bigger structures. And so, we try to solve a problem by throwing a few hours of training at it. But I think that really does offload some of the responsibility onto the workers. And we have to question, and always be really careful: is that ethical? Is that fair? Or are we putting our priority on the appearance of having done something rather than investing our best effort to actually reduce the risk?
I think that’s an important point about the hierarchy of controls, really: eliminating the hazard that could be present as opposed to trying to train the individual to manage it.
Yes, exactly. And the reality is, we know from a human factors perspective that training is one of the tools in your toolbox that you can use to support big organizational change and improvement to make things safer. But it’s not the only thing. And sometimes it’s the more expensive one, with more substantial ongoing costs over a longer period of time. You can imagine, for example, if we’re looking at cars: we all know that texting and driving is very dangerous, that nobody should ever do it. But think of how much energy and effort has gone into teaching teenagers, don’t do this, right?
It’s so dangerous, you should never do this. But if there were a way for the cell phone itself to just disengage while it’s in a moving car, for example, then that equipment shift eliminates the risk, right? Something quite simple that has a sweeping, widespread effect. Obviously, there are other implications in that example, but I think it’s a much more effective way to eliminate the risk of that one situation than putting the emphasis on the people through training.
This is similar to a lot of options in cars that will typically stop working, or stop allowing themselves to be managed or controlled, when the vehicle is in motion. You could do the exact same thing with a texting device.
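To make that engineering-control idea concrete, here is a minimal sketch of a speed-based messaging interlock. The threshold, speed source, and function names are illustrative assumptions, not a real phone or vehicle API.

```python
# A minimal sketch of the equipment-level control discussed above:
# disable messaging once the device is moving at driving speed.
# The cutoff value and speed source are assumptions for illustration.

WALKING_SPEED_KMH = 8.0  # assumed cutoff between walking and driving

def messaging_allowed(speed_kmh: float) -> bool:
    """Engineering control: remove the hazard instead of training around it."""
    return speed_kmh < WALKING_SPEED_KMH

assert messaging_allowed(0.0)        # parked: texting available
assert not messaging_allowed(60.0)   # driving: texting disabled
```

The point of the sketch is the hierarchy of controls: the safeguard lives in the equipment, so it does not depend on every individual’s training holding up under distraction.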
Exactly. And of course, it’s a simple example. But if you think of the parallels to aviation, it’s still very true that it’s such a heavily regulated industry, and so we’re always trying to provide the evidence that’s required by the regulators: I’ve completed 5 hours of training on this; you’ve demonstrated that you’ve taken that action. But I’ve had some really interesting talks with international regulators at the highest level about this hours metric for training, because in aviation it’s always based on hours. You have to do 5 hours of this or 10 hours of that.
And I said, why hours? Because everyone knows it’s not the hours that make an impact. It’s what happens during that time. It’s the experience and learning. And he said, okay, I’ll tell you a secret. We know that. But, he said, the reality is, put yourself in my shoes. If I’m an international regulator and our safety board has identified some sort of safety deficiency, the most obvious and direct thing that I can do is throw a few hours of training at the problem, because it shows that we’ve made an effort to address it.
But, he said, even I know that it’s not going to fully eliminate that risk. For me, that was mind-blowing, because when you love aviation, you grow up aligning everything under these regulations, and when you come to learn that they’re made by people as well, and people aren’t perfect and are just doing the best they can under challenging situations, it does allow you to refocus and, I think, question whether there’s an opportunity to do things even better.
Great. So, one of the topics you touched on earlier when you were talking about human factors was crew resource management and how that got cascaded. For listeners who aren’t familiar with crew resource management, and with how human factors get trained or taught to pilots, can you give a bit of a highlight as to what the core principles are, so people can think about how to translate them to on-the-ground examples?
Yeah, absolutely. So, crew resource management, as it is today, is required annual training for almost all pilots in the civilian world, and it has a few core components. That includes things like workload management. In our world, it’s fly the plane first: aviate, then navigate, then communicate. It’s task prioritization. So, workload management is number one. Situation awareness is number two, and situation awareness is, in your operational setting, your mental picture of everything around you. And people may be shocked.
But one of the most common categories of accidents is called controlled flight into terrain. That’s flying a perfectly good airplane into the ground, which is the result of a lack of situation awareness. And then there’s another very big one as well: communication and crew coordination. How you talk to and use the resources around you, including the technology, but also all the people in the aircraft, air traffic controllers, and other supporters on the ground. So those are some of the big categories, but it’s based on a very robust and deep interdisciplinary field of research, which maybe doesn’t mean a lot to people.
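As a concrete illustration of the “aviate, navigate, communicate” workload-management rule, here is a minimal sketch. The task names and structure are hypothetical, chosen only to show the fixed priority ordering.

```python
# A minimal sketch of CRM workload management: under high workload,
# tasks are serviced in the fixed order aviate > navigate > communicate.
# The task tuples below are illustrative assumptions.

PRIORITY = {"aviate": 0, "navigate": 1, "communicate": 2}

def next_task(pending):
    """Pick the pending task whose category has the highest priority."""
    return min(pending, key=lambda task: PRIORITY[task[0]])

pending = [
    ("communicate", "respond to ATC"),
    ("aviate", "correct bank angle"),
    ("navigate", "confirm next waypoint"),
]
print(next_task(pending))  # ('aviate', 'correct bank angle'): fly the plane first
```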
But I can tell you, when I’m teaching human factors, I don’t teach, like, a list of memorizations. New pilots will learn something called the I’M SAFE checklist, where before they go flying, they should check illness, medication, stress, alcohol, fatigue, and eating: am I okay in all of these categories? So that’s what most pilots know at the very beginning. But when I teach it at the university, it’s built on a foundation of your natural human limitations. So, it’s some psychology and thinking: how much information can you reasonably be expected to retain at any one point in time?
And when is any person going to start making mistakes? It’s your senses: all of your senses, how you take in information, how it can be tricked and distorted, and how you can’t always trust them. It’s anthropometry, which is the measurement of the human body, because in an aircraft, all of the controls have to be within a certain reach of humans. And it’s the limitations of work: when would anybody in the world be expected to become tired and start making mistakes, whether it’s due to a lack of sleep or just a prolonged period of mental or physical work?
And we also get into some issues around things like mental health and substance abuse, because those are also very human things that affect all of us in our population. There are a lot of other factors I’m probably missing, but that’s kind of how we build it up from the foundational building blocks. And if I have the students take away one thing, it’s that to err is human, that you shouldn’t expect people to never make mistakes. It should be the exact opposite. You should expect that it’s 100% normal for even the most competent professionals to make mistakes.
And if you start from that foundation, then you can build up to say, where are those mistakes most likely to happen? And how can I manage to capture them before they have an impact?
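For readers who want the I’M SAFE checklist mentioned above in a more structural form, here is a minimal sketch. The field names follow the mnemonic; the class and its go/no-go rule are illustrative assumptions, not a regulatory artifact.

```python
from dataclasses import dataclass, fields

# A minimal sketch of the I'M SAFE pre-flight self-check:
# Illness, Medication, Stress, Alcohol, Fatigue, Eating.
# The structure and go/no-go rule are illustrative assumptions.

@dataclass
class ImSafeCheck:
    illness: bool      # free of symptoms that degrade performance?
    medication: bool   # free of impairing medication?
    stress: bool       # stress at a manageable level?
    alcohol: bool      # clear of alcohol per applicable minimums?
    fatigue: bool      # adequately rested?
    eating: bool       # nourished and hydrated?

    def fit_to_fly(self) -> bool:
        """Any single 'no' is a no-go; human limitations fail item by item."""
        return all(getattr(self, f.name) for f in fields(self))

check = ImSafeCheck(True, True, True, True, False, True)
print(check.fit_to_fly())  # False: the fatigue item alone grounds the flight
```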
This episode of The Safety Guru podcast is brought to you by Propulo Consulting, the leading safety and safety culture advisory firm. Whether you are looking to assess your safety culture, develop strategies to level up your safety performance, introduce human performance capabilities, re-energize your BBS program, enhance supervisory safety capabilities, or introduce unique safety leadership training and talent solutions, Propulo has you covered. Visit us at Propulo.com.
That’s excellent. So, tell me a little bit more: you talked about human factors. How does that contrast with safety management systems as they pertain to the aviation world?
Sure. Yeah. This is probably one of the biggest confusions in the aviation world; even career professionals sometimes don’t know the difference between human factors and safety management. Human factors, like I said already, is a scientific discipline about why people make mistakes. So, a little bit of psychology, ergonomics, physiology, all of these scientific foundations, and education, that’s a really big one as well. And that leads up to crew resource management, which is where we teach operators about all of these limitations and give them strategies for how to avoid them, how to work together, and how to avoid error.
And that’s sort of in one category. The second category is the organizational factors associated with safety, which most people in aviation know most commonly through what we call Reason’s model. Reason’s model has layers of protection. So, there are, like, five squares you can imagine, but each one of them has holes in it, so you can think of them as layers of Swiss cheese. Each layer has its own holes, and each layer represents part of the organization, from the highest-level senior management and then, as you work forward, through roles like the training manager.
And the far layer is the actual operators, like the pilots. The concept is that the holes in the layers represent latent failures. They’re like accidents waiting to happen, whether it’s management not investing in training, or a maintenance engineer who has a poor practice and is making a mistake over and over, or whatever it happens to be; these are opportunities for accidents. And it’s only when the holes in all those layers line up perfectly that an accident happens. So, the concept is that the accident itself is actually quite rare.
Instead of focusing all of our attention on the accident, which is what we had historically done, fire the crew, that doesn’t address all those holes; those risks are still in the organization. So, the concept of safety management systems, at its core, is the identification and elimination of those latent failures before they have an opportunity to line up and cause an accident. It’s a proactive rather than reactive approach to aviation safety.
Alright. So essentially reducing the probability of those holes lining up.
Yeah. And human factors play in because human limitations can create those holes throughout the whole system, so that’s one of the ways we can reduce them. But human factors can’t address everything, because, like I said, if there are high-level managerial decisions that are affecting every part of the operation and the equipment, then no matter how hard the pilot at the end tries to do their best, tries to be as professional and safe as possible, they don’t have control over those other factors; they will be influenced by them regardless.
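To make the “holes lining up” idea concrete, here is a minimal sketch of Reason’s model as arithmetic. The layer names and per-layer hole probabilities are invented numbers for illustration; the point is only that the accident probability is the product of the layer failure rates, so closing holes in any single layer cuts it.

```python
from math import prod

# A minimal sketch of Reason's "Swiss cheese" model. Each defensive layer
# has some assumed probability of containing a hole (a latent failure)
# at the moment a hazard passes through. An accident requires a hole in
# EVERY layer, so the accident probability is the product across layers.
HOLE_PROBS = {
    "senior management": 0.05,   # hypothetical values throughout
    "training":          0.10,
    "procedures":        0.02,
    "maintenance":       0.08,
    "flight crew":       0.05,
}

def accident_probability(hole_probs):
    """Holes must line up across all layers for the hazard to get through."""
    return prod(hole_probs.values())

print(f"baseline accident probability: {accident_probability(HOLE_PROBS):.1e}")

# Safety management: find and close latent failures in any one layer.
HOLE_PROBS["training"] = 0.02
print(f"after closing training holes:  {accident_probability(HOLE_PROBS):.1e}")
```

The multiplicative structure is why the accident itself is rare, and why proactively shrinking holes in any layer, not just retraining the crew at the last layer, reduces risk.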
Excellent. So, I’d love to talk a little bit about the work you’ve done at Waterloo, which is really interesting. As I mentioned at the front end, you’re the founding director of the Waterloo Institute for Sustainable Aeronautics. Tell me a little bit about what this institute does and what you’re trying to accomplish with all of this linked expertise that you’re bringing together.
Thank you. Well, I’m really excited because our institute will be launching on October 5, 2021, and it’s the product of years of work. But what really led to the institute for me was when the pandemic hit. In my field, like so many others, so many people were out of work, alumni, friends, and colleagues, just tremendous devastation. And when I saw so many of my colleagues 100% focused on just the survival of their organizations through the pandemic, I started questioning, what can I do to support this sector that I care about?
I’m an academic. I’m at a university. I can’t necessarily impact business decisions, but I kept questioning what I could contribute of value during this time, and reflecting on the big challenges that aviation was facing before the pandemic, which I sort of defined as widespread personnel shortages on an international scale, growing environmental emissions (if you remember, the Greta Thunberg flight-shaming movement was really just growing when the pandemic hit), and the rapid evolution of technology. And when you think of those three things, they really align with the three pillars of sustainability: social, environmental, and economic.
And at the same time, I saw that other big aviation universities around the world were actively recruiting in areas where Waterloo already has tremendous world-leading experts. They’re looking for experts in AI, in cybersecurity, and everything in between. And it really hit me that we could have a tremendous impact in supporting the sector, setting aviation and aerospace up for a more sustainable future after the pandemic, if I could mobilize the strengths already at the university and direct the powerhouse that is Waterloo toward these big challenges that I knew were having a direct impact on the people in the industry that I cared about.
So that’s how WISA came to be. Really, what it is now: we have about 40 to 45 different professors, as well as their labs and grad students. We have a really distinguished advisory committee, with Commander Chris Hadfield as an honorary advisor, some amazing international advisors, and some industry partners who are coming on board. And really, what it’s meant to do is form a bridge between aviation and aerospace and the university. So, if industry partners have challenges or problems, they can work with university professors to address them.
And the beautiful part is that, in the process, they’re educating graduate students who then go on to support Canada’s knowledge economy and become future leaders. So, we’re just getting started, but we’ve got a lot of excitement about what comes next.
And when we talked earlier, you mentioned some examples of linking expertise. You talked about bringing engineering to the table, machine learning. Can you give some examples of how these themes come together and the power that brings?
Yeah. What’s different, I think, about universities as opposed to industry is that industry problems are very multi-dimensional; lots of different skill sets come together to create the problem. But when you compare that to a university environment, professors have incredibly high levels of expertise in a very narrow area, and they live within pockets of their own discipline. So, you have psychology in one pocket, engineering in another, and health sciences in another, for example. What we’re hoping to do with the institute is break down those silos and allow a connection between the different disciplines on campus.
So as an example, I’m working with a few colleagues right now, and we’re looking at how pilots are trained, because, as I mentioned earlier, there are distinct personnel shortages projected internationally. So how do we address that? I have one colleague in cognitive psychology looking at the process of how people take in information and learn. Another in kinesiology, who looks at hand-eye coordination and the development of those skills. Another in optometry, who is looking at how your eyes move across your environment and take in information, and whether that can come together as an indication of expertise.
And another in engineering, who works on a form of machine learning, artificial intelligence. So, if you could feed all those data points into a computer, then when a new person comes in and flies, could the computer automatically assess their performance and automatically tell them where they’re strong and where they need to focus to improve their skills, by comparing them to a big database of others? And I think the really exciting part is, if we were able to do that effectively, you could then justify to the regulators moving more training out of aircraft flying in the real world and into simulators, and that has distinct environmental benefits: far fewer emissions.
And it’s also saving young people money, because training becomes far more customized to what they need, so they’re paying for fewer hours, and simulators are usually cheaper than the airplane. So, you’re hitting the economic, the social, and the environmental improvement when you see this beautiful, magical mix of all these disciplines coming together to address a problem.
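As a rough illustration of the performance-assessment idea Suzanne describes, here is a minimal sketch that compares one flight’s metrics to a database of prior flights using a nearest-neighbour baseline. The metric names, data, and method are illustrative assumptions, not the actual WISA system.

```python
import numpy as np

# Hypothetical per-flight metrics: glidepath deviation, airspeed variance,
# control-input smoothness, gaze-scan coverage. All names are assumptions.
METRICS = ["glidepath_dev", "airspeed_var", "input_smoothness", "scan_coverage"]

def focus_areas(new_flight: np.ndarray, database: np.ndarray, k: int = 5):
    """Compare a new flight to its k most similar flights in the database
    and report how far it deviates from those peers on each metric."""
    dists = np.linalg.norm(database - new_flight, axis=1)
    peers = database[np.argsort(dists)[:k]]
    return dict(zip(METRICS, new_flight - peers.mean(axis=0)))

rng = np.random.default_rng(0)
database = rng.normal(size=(200, 4))   # 200 prior flights, 4 metrics each
new_flight = rng.normal(size=4)
# Large absolute deviations suggest where the student should focus next.
print(focus_areas(new_flight, database))
```

A real system would need validated metrics and far richer models, but even this shape shows how pooled flight data could turn raw recordings into individualized feedback.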
That’s excellent. And I think when I hear your story, what you’re trying to drive with this multidisciplinary view is something that really can be used in other sectors, in other environments, to start meshing different levels of expertise to address safety challenges across the board.
100%. And I think safety is such a perfect illustration of this, because, just like education, it’s not one thing. It’s not one discipline, right? You can’t create a perfectly safe system looking only from the perspective of psychology. It’s almost like that’s one important piece of the puzzle, but until you identify and link the other pieces together, you don’t have the full picture.
I think that’s an incredibly important message in terms of that multidisciplinary view of things to drive things forward. Suzanne, thank you so much for the work that you’re driving around safety in the aviation space. As somebody who used to fly a lot, I appreciate that part as well, but more importantly, thank you for coming on our podcast to share your story and some ideas. I think there are some really great examples and illustrations that people can take from what’s being done in the aviation space and translate to that learning organization, thinking about, as humans, where are we going to make a mistake? Because it’s bound to happen. So, I really appreciate you coming to share your story.
Thank you very much.
Thank you for listening to The Safety Guru on C-Suite Radio. Leave a legacy. Distinguish yourself from the pack. Grow your success. Capture the hearts and minds of your teams. Fuel your future. Come back in two weeks for the next episode, or listen to our sister show with the Ops Guru, Eric Michrowski.
The Safety Guru with Eric Michrowski
More Episodes: https://thesafetyculture.guru/
C-Suite Radio: https://c-suitenetwork.com/radio/shows/the-safety-guru/
Powered By Propulo Consulting: https://propulo.com/
Eric Michrowski: https://ericmichrowski.com
ABOUT THE GUEST
Dr. Suzanne Kearns is an Associate Professor of Aviation at the University of Waterloo. She is an internationally recognized leader in aviation education research. She earned airplane and helicopter pilot licenses at the age of 17 and advanced aeronautical degrees from Embry-Riddle Aeronautical University, and began working as an aviation professor upon graduation at the age of 24. In the 16 years since, she has taught and mentored thousands of aviation students and written or co-authored six books printed in multiple translations (including Competency-Based Education in Aviation, Fundamentals of International Aviation, and Engaging the Next Generation of Aviation Professionals). She has received several awards for her research and educational works, frequently delivers invited keynote addresses at international conferences, and holds leadership positions with several international aviation organizations. In 2021 she founded the Waterloo Institute for Sustainable Aeronautics (WISA), which she leads as its Director.