It isn't the automation, it is the Pilot Monitoring, CRM and Culture

Seggy

I noticed that a few folks in the Southwest thread were talking about how the FAA is now going to require us to shoot only instrument approaches in the wake of the Southwest (and Atlas) aircraft landing at the wrong airport over the last few months. I know that this was 'just' talk, but I want to point out a few things.

Before these aircraft landed at the wrong airport, we saw Colgan, Air France, and Asiana get their aircraft into different stalls. It is interesting that we are seeing a 'grouping' of extremely similar causes of accidents/incidents in a relatively short period of time. Is that a coincidence? I don't think so.

Out of the Colgan and Air France accidents, a big push was made to be less reliant on the automation and 'just fly' the airplane. Now we have seen two incidents where aircraft landed at the wrong airport, in visual conditions, conditions that may lead us to 'just fly' the aircraft. I am not sure how the automation was set up in the Atlas and Southwest cockpits, but if the FAA wants to 'push' us to do only instrument approaches, that is in direct contradiction to what they just put out concerning the Colgan and Air France lessons learned. Of course you back up visual approaches with your automation, but your eyes need to be more outside the cockpit if you are 'just flying' the aircraft. You cross-check with what you have inside, but in the last month even that cross-check has failed twice... in a big way.

So do we need more or less automation? Does the FAA need to micromanage how we operate the airplane? If so how do they do it? Or is it something else?

Looking at the Colgan and Air France accident reports, it is clear that there was a lack of strong pilot monitoring (PM) and a breakdown in CRM. I am not familiar with the culture at Air France or Atlas, but look at the culture of Southwest, Asiana, and Colgan at the time of their accidents/incidents. It doesn't paint a very favorable picture. Southwest had a culture of rushing, Asiana a culture of not speaking up, and Colgan a vindictive culture if one made any type of mistake.

See the similarities here? It has very little to do with the actual mechanics of operating the airplane. It has everything to do with how the crew interacted with each other and the situation they were in.

The human factors folks at the NTSB and FAA need to focus on the roles of the PM, CRM, and actual culture of the companies when they investigate these incidents/accidents. They can't get caught up in micromanaging how we get the aircraft on the ground. Whether it be a visual or instrument approach, honestly, that really isn't that important. If you have strong pilot monitors, great CRM, and great cultures, getting the aircraft on the ground safely will happen VERY well.
 

Then the question becomes, what builds strong pilot monitors, great CRM practices, and great cultures?
 
What about the CAL 737 runway excursion in DEN?

On July 13, 2010, the NTSB published that the probable cause of this accident was "the captain's cessation of right rudder input, which was needed to maintain directional control of the airplane, about 4 seconds before the excursion, when the airplane encountered a strong and gusty crosswind that exceeded the captain's training and experience."

http://en.wikipedia.org/wiki/Continental_Airlines_Flight_1404
 
I already said it.

Proper CRM and threat-and-error training that emphasizes the role of the pilot monitoring, and a commitment to a just safety culture.

As in, how would this be proposed to X airline, and how would they implement that training (example syllabus/process maybe).

What are the threats? What are the probable errors? How are PM duties to be emphasized? What are the pilots' responsibilities for a commitment to a just safety culture?
 
As in, how would this be proposed to X airline, and how would they implement that training (example syllabus/process maybe).

You have human factor experts come in and with management work to develop a training syllabus that would work for that airline.

What are the threats? What are the probable errors? How are PM duties to be emphasized? What are the pilots' responsibilities for a commitment to a just safety culture?

You talk about all of this in the training.
 
Focusing on the PM is second to focusing on the PF. A close second, but second nonetheless.

I posted a question earlier: how many single-pilot, non-crew aircraft have had wrong (unintended) airport landings, that we know of? I don't know the answer offhand, or even if such a record can be or is kept.

What I do know is that in these incidents of wrong-airport landings, the PF was fairly thoroughly convinced he/she was headed to the correct airport, or at least did not possess enough doubt at the time to go around before touching down. Why is this? Insofar as the PM, we need to know whether the PM was:

1. Convinced of the surety of the airport they were headed to, just like the PF;
2. Not sure whether the airport in question was correct, but was convinced in some way along the way, enough to ride with the PF to touchdown; or
3. Never sure the airport was correct, but didn't speak up or otherwise make his lack of surety known.

While the PM is important, I would look at the PF first, and that is why I bring up the single-pilot example. In the case of a crew, is that PM helping, neutral, or sucking SA from the PF? What was convincing the PF that he was headed to the correct place? Would said PF have committed the same error had he been flying single-pilot, with no PM input, positive or negative?

I believe that while the outcome of these wrong-airport landings was the same (landing at the wrong airport), the factors leading to each one will probably be somewhat different, once the three questions above are answered.
 
@MikeD, those questions really don't make sense in the crew environment, as the PF isn't always the person 'in charge'. Hence a focus on a strong CRM concept and a good company culture is a better way to look at it.
 
You have human factor experts come in and with management work to develop a training syllabus that would work for that airline.

You talk about all of this in the training.

You can't give some examples from experience?
 
As in, how would this be proposed to X airline, and how would they implement that training (example syllabus/process maybe).

US Airways was forced to do exactly this after they experienced a series of hull losses in a short time frame.

I only know this because I benefited from the outcome as we started the E170 program at the last shop.

It really changed, for the better, the foundation of duties and task management in normal and abnormal situations, decreasing task saturation and the resultant errors, whether working an abnormal or operating in a task-saturated environment.

What are the threats? What are the probable errors? How are PM duties to be emphasized? What are the pilots' responsibilities for a commitment to a just safety culture?

The first three are answered by what @Seggy suggested in his post. Quite frankly, the details are way beyond the scope of what can be covered in an internet thread. However, if you have any friends at Airways, grab a copy of their normal and abnormal policies and procedures, operating philosophy, automation philosophy, and checklist instructions.

To answer your final question: the pilots have to buy in, just like with SOPs. It's not worth the paper it's written on if no one uses it in the real world. See: "how we do it on the line versus in the sim."
 
@MikeD, those questions really don't make sense in the crew environment, as the PF isn't always the person 'in charge'. Hence a focus on a strong CRM concept and a good company culture is a better way to look at it.

Of course. But the PF is the one taking the plane to where it's going; hence "Pilot Flying." So those questions do make sense in terms of role, if not necessarily in terms of responsibility.

Why is the PF committing his error of commission? Why is the PM committing his error of omission?

Doesn't matter who is in charge necessarily; it matters who is driving the train, and who should be backing that person up at the time.
 