How do you know your aviation organization is safe? Experts say answering this question is actually a fool’s errand.
“Not only is this an impossible question to answer,” said Dr. Tony Kern, CEO of Convergent Performance, “but it’s possibly a dangerous question to ask.”
Kern explained that an aviation organization can never really know if it’s truly safe. In fact, claiming to know that you’re safe might actually indicate that there’s a safety problem. One characteristic of a highly reliable or safe organization is a preoccupation with failure. To say you’re safe essentially means you’re “done,” which is the antithesis of continuous improvement and safety.
Culture of Professionalism
Jeff Wofford, an aviation director in the Southeast, said safety isn't a "thing" in and of itself. It's the result of many qualities and actions. How do we add up those qualities and actions to equal safety?
“Safety is determined by the culture that exists within your organization,” said Wofford. “It’s the psychological and systematic approach to all operations, not just flying the airplane.”
Many operators point to their safety management system (SMS), quality management system (QMS) or third-party audit credentials as proof of their safety. While these tools are valuable, Kern said people often focus on the wrong pillars, and he recommends that leaders focus on safety promotion.
“The people are the system,” declared Kern. “When you get lost in process, you are almost, by definition, losing focus on the people.”
Kern said rewards and recognition for even the smallest positive actions – such as a person asking for a second opinion, conducting a thorough walkaround, or pushing a flight back by an hour in order to take off in better weather – are important.
“Recognize these things to really expand safety promotion. That’s the link between an SMS and realizing that the people are the process,” said Kern.
Cultures are made up of mini-cultures. Each flight crew, each ground handling shift and every scheduling team is its own mini-culture.
Promote the Positive, Root Out the Negative
Consider the Gulfstream GIV accident at Massachusetts' Bedford-Hanscom Field in 2014. The pilots, who had flown together for more than 10 years and had a combined 29,000 hours of flight time, failed to remove the gust lock before takeoff, then failed to perform rejected-takeoff procedures in a timely manner. All seven people on board perished.
Flight data recorder and cockpit voice recorder information, reviewed as part of the NTSB’s investigation into the accident, revealed that the pilots habitually cut corners in their preflight checks. In fact, the pilots conducted preflight control checks on only two of 175 flights reviewed.
The NTSB found that the “flight crew’s omission of a flight control check before the accident takeoff indicates intentional, habitual noncompliance with standard operating procedures.”
The flight department operating the aircraft held an IS-BAO Stage 2 registration. Then-NTSB board member Robert Sumwalt cited the old axiom, "You can fool the auditors, but you can never fool yourself."
“Cultures need to be able to promote the positive, but also need to root out the negative, regardless of their stature or experience level,” said Kern.
Safety can – as evidenced by the 2014 GIV accident – be as simple as using a checklist on every flight. Consistently performing this small task would prevent a tremendous number of accidents, Wofford said, adding that these small steps are really just pieces of professionalism.
“Professionalism is simply doing things the right way, not falling victim to shortcuts or pressures,” said Wofford. “It’s a mindset. It means conducting yourself with integrity.”
NBAA’s Safety Committee has identified professionalism as a focus area of any SMS, describing aviation professionalism as “the pursuit of excellence through discipline, ethical behavior and continuous improvement.” To learn more, see nbaa.org/professionalism.
Safety Manager as Leader
Experts say top leadership must value the role of safety manager, ensuring it’s a long-term, consistent effort, regardless of who is in the position. The person serving as safety manager must consistently push the message that the people are the system. This creates a self-sustaining and self-improving system.
Safety managers must have appropriate training and be comfortable managing up. They also need to know what safety means in their organization and should understand that safety is a fiduciary responsibility.
Wofford said that while some organizations rotate people through the safety manager position or assign the tasks to the "new guy," a safety manager should be someone who is trained for and genuinely interested in the job.
Safety managers must have the authority to go to top leadership without conferring with the director of aviation and should also have the authority to reward – and even discipline – when necessary.
"You have to check your ego at the door as head of the organization, but by the same token, the safety manager has to be someone you can trust," added Wofford. "If the safety manager is a 'manager' in title only, you're just wasting your time."
In addition to ensuring your safety manager has the authority to complete their tasks effectively, top management, or the accountable executive, must actually participate in the organization’s safety efforts.
“The accountable executive must show sincere and genuine concern about safety risk management, not just in ‘being safe’ and avoiding accidents,” said Sonnie Bates, CEO of Wyvern, Ltd.
Evaluating Your Safety
Kern said there is a very simple way to evaluate your organization’s culture – talk to new people. Today’s high turnover and growth rates are a huge opportunity to instill safety into your company’s culture and evaluate its health.
Kern suggested top leadership use this little trick:
Say to all new employees, “Sometime in your first six months you’re going to see something that’s not quite right, whether an instance of noncompliance, FOD that someone walks past, or something similar. These things happen everywhere. When you see it, how you respond to that will tell you whether you will be successful here and whether we will be a successful organization.
“This is so important to me, that when you see it, I want you to come back and tell me, ‘I remember what you told me – and I saw it.’ I don’t need names or details. I just want to know you saw it.”
If the leader hasn’t heard from new employees in six months, Kern recommended that the leader go back to each person and ask, “Did you forget what I said, or did you turn a blind eye?”
Kern called this “the indoctrinate and vaccinate method.”
Bates suggested accountable executives should not ask the safety manager and other employees, "Are we safe?" Instead, they should ask: "How are we managing risk? What can we do to improve?"
"We get so focused on formal SMSes and audit standards that we don't look at just operating safer," said Wofford. He said leaders should consider this thought, attributed to Dr. Amy Grubb, a clinical psychologist with the FBI: "Culture is the story that an organization's employees tell about the organization."
“We need to focus more on the culture of an organization,” declared Wofford, “and we’ll get safety as one of the results.”
Safety Toolkit Essentials
An SMS or flight operations quality assurance (FOQA) program and other safety initiatives are tools used to help create a safe organization, but they are not, in and of themselves, evidence of safety.
Certainly, a strong SMS lays out the policies and procedures that can lead to a safe culture, and one key policy that feeds a safety culture is a hazard reporting policy.
“Top leadership needs to make it abundantly clear that this is a human endeavor and mistakes will be made,” said Sonnie Bates, CEO of Wyvern, Ltd. “But the accountable executive has to insist that errors and deviations be reported.”
A reporting culture must also include a “just culture,” which ensures there is no punitive action or retribution against those who report errors or safety hazards.
“A just culture leads to intelligence that allows an organization to make informed decisions,” said Bates. “Those decisions are made within your organization’s risk tolerance.”
Company procedures should also require careful analysis of the reported hazards. Bates explained that human instinct can lead to skipping the analysis and going straight to a corrective action. He warned that this approach creates a new risk.
“Without the analysis, we determine a corrective action based on biases,” Bates explained, suggesting instead that organizations should use a team approach to risk assessment. This can minimize the influence of bias.
Bates added that a common misconception in business aviation is that if you use a flight risk assessment tool (FRAT) before a flight, you’re safe. In reality, a FRAT is just one tool within an SMS – it is not a guarantee of safety.