AF447 Vanity Fair article

Actually, he missed many aspects of the factual history of the accident. Several of his "facts" are actually wrong. He read it looking for a conclusion, kind of like many do with religious texts. He has some good information and is an excellent writer, but I do not think his conclusions were accurate. He missed it.
Expand please.
 

What's inaccurate in the article?
 
Chirp.....Chirp......

Come on, @seagull , you can't drop a bomb like this without explaining........

Of course you could be trolling and I just bit the hook.......time will tell
 
I'll second killbilly. Whatever it is that he saw we all missed has always been worth my time to wait for.
My comment was not meant to be an attack on @seagull or anyone else for that matter. Perhaps I was a bit overzealous in search of more information......:oops:
 
Well... I will say this, and I don't know the guy personally, it's not like him to leave us hanging when he knows we've all missed something.

So have a bunch of other guys I've talked to with experience in the 'Bus and Boeing.
 
I seem to remember @seagull posting in one of the many af447 threads a long and well thought out post that presented a bit of a different view about that accident, but either my search skills or my memory has let me down.
Personally, I wasn't as impressed with that article as everyone else is. A bit too melodramatic, and definitely a one-sided perspective on the causes and factors of that accident. But hey, it's Vanity Fair and not an NTSB report.
 
Wasn't just the NTSB, it was both Boeing and Airbus. In Boeing's case it was before the A300 accident. AA were warned and did not listen. Not to say that there isn't a lot of good info in the AAMP program, because there is. It's just that they did not understand what the implications were for some of what they taught, specifically full opposite rudder deflections.
I'm not sure that anyone outside those who have read and really understand, and can apply Part 25 (read: engineers) really understood that.

I do believe this was one of the guys the NTSB came down on for the AAMP. A 90-degree flip in sim sessions caused by wake turbulence + rudder for recovery applies to F-4 Phantoms. Not A300s.
Nevertheless. "I am so sorry, we made you this way."
 
FWIW, seagull has a link in his (her?) "signature" to a long article on their blog giving their opinion on the accident. You have to scroll down a ways, or use Ctrl-F and type in 447 to find it.
 
TY, did not realize that.
 
Shem said:
Posted on May 6, 2014 by Shem Malmquist
The author shot this photo flying inverted over Hong Kong

“The problem with pilots today is that they’re not doing enough hand flying!” How often has this been said in recent years? It is a refrain that seems to be the current “hypothesis du jour” of many in the aviation industry in an effort to explain many of the incidents and accidents that have occurred. It appears that the entire industry, regulators and the various alphabet organizations have jumped onto this bandwagon: “advocate more hand flying” and safety will be improved. Closely following this, and a bit curiously, is the notion that all pilots need to do is “pay attention to detail” more — that if the accident or incident is not a result of “stick and rudder” skills, it is a consequence of just not paying attention. Has anyone stopped to think about these things?

Stick and Rudder

I have a bias. As can be seen in the photo I took over Hong Kong (above) I love hand flying. I have a love/hate relationship with autopilots and autothrottles, in that they can be great but it is not nearly as fun for me. If conditions allow for it, I will hand-fly the airplane up to FL250 (25,000 feet) and I also turn off the automation fairly early during the arrival. I do it because flying the airplane is fun! There is no question that hand flying is good, and the more you do the better you will get at it. However, how does this relate to the recent air carrier accidents?

A review of all of the accidents involving jet transports in the last five years fails to find any that would clearly have been averted if the pilots had done more hand-flying. If the hand-flying aspect was present, it was not in isolation. There are several where the aspect of hand-flying has been raised, but even those are missing the point. Most recently in the news highlighting the call for “more hand-flying” have been Air France (AF) 447 and Asiana 214 in San Francisco. However, a review of AF 447 and the preliminary data on Asiana, with the removal of hindsight bias, makes the “easy fix” of more hand-flying much less clear.

Contrary to popular opinion, and based on the data that has been provided, the accidents themselves do not appear to reflect a lack of hand-flying skills per se. In other words, the accident could have happened to someone who handles the aircraft exceptionally well. What I am seeing in these accidents is not a lack of flying skills, but rather a lack of understanding of what the automation is providing. In other words, the pilots' expectations of the automation are different from what the automation will actually do. This aspect is not as well trained. While pilots are trained in the basics of the automation, there are enough gaps in the training that the “What’s it doing now?” comment (typically stated in a humorous way, but meant seriously) is very common.

If a pilot does not know why the automation is doing something there is (of course) the possibility that the computer is doing something it should not be doing. In reality that is a very low probability — the rate of actual problems with the hardware and/or software is very, very low. It is much more probable that the pilot does not understand the automation. Therefore, a statement (or even a thought) along the lines of “What’s it doing now?” should serve as a red flag that the pilot does not fully understand the automated system. Perhaps they are operating at too high a level of automation for the situation. Regardless, if the disconnect continues for more than a moment or two, they should seriously consider moving to a less automated mode until things are sorted out.



At all times, the pilot should be certain what to expect next. Unfortunately, absent significant system knowledge, most of the current systems installed in aircraft do not do a very good job of telegraphing to the flight crew what they plan to do after the current action. Sometimes it is not a large factor, such as the way the aircraft adjusts to a different-than-expected wind model when it transitions from one phase of flight to another. Other times it can be a major factor that leads, at the very least, to a violation of a regulation or, at worst, to an accident when a crew expected the aircraft to capture a certain vertical path or airspeed.

Pilot Knowledge

Putting the automation aside, let’s take a look at the hand-flying aspects. In all the advocating of hand-flying there has not been much of an effort to ensure that all pilots have reviewed or received recurrent training on the fundamental aerodynamics. You may find that surprising. You might ask, “Don’t pilots get trained for that early on?” Well, the answer to that is a definite “maybe”. While most pilots might get it at some point early in their careers, there are some who have only learned enough to pass the written test, their formal training being generally fairly superficial. Sure, they know how the aircraft responds to their inputs and have a rudimentary concept of “why”, but that leaves out what would happen at the rarer “corner points” of the envelope that are often the scenario where accidents occur. Regardless of their background, a comprehensive review of aerodynamics should be included in the training programs. We do not expect pilots' hand-flying skills or flight procedures to stay sharp absent regular training, so why is a review of aerodynamics left out of the equation?

While it is only rarely that a comprehensive baseline understanding of aerodynamics is important, when it is important, it can be very important! In addition, if we are to expect pilots to use their “hand-flying” skills to get out of sticky situations, perhaps they should be allowed to practice those skills in situations where they might need them. AF 447’s crew found themselves hand-flying the aircraft in an altitude regime where full-time use of the autopilot is required, in a degraded flight control mode rarely practiced or demonstrated at any altitude.

The combination of the altitude and flight control regimes was something that they would never have seen in any training, even assuming the simulator could adequately replicate the conditions (a topic unto itself!). Is it reasonable to assume that they would be able to have the skills to hand-fly the airplane with those issues, while flying at night through the upper levels of turbulent thunderstorms? It is easy to judge them in hindsight, but the scenario was far different than the comfort of judging their actions after the fact from an office chair.



Attention to Detail

Thinking more on the combination of advocating hand-flying and “attention to detail”, one has to ask, “How can you pay attention to detail if you are hand-flying?” The industry originally moved towards the idea of more automation to allow for better decision making with fewer pilots in the cockpit. The concept was that we could operate very large aircraft (which used to require crews of three or more pilots) with just two pilots, as the automation would free up the pilots from all that hand-flying, so they could concentrate on decision making. In that process, there is no question that the pendulum went pretty far to the point of mandating high levels of automation use, and that needed to be pulled back, but it now appears that we are trying to go the opposite way, while admonishing “ok, hand-fly, but you’re still just as responsible to monitor!”

More to the point, though, we simply cannot just command somebody to pay attention more. In the Just Culture algorithm, we divide actions that led to an adverse outcome into error, at-risk and reckless. We define an error as something that is an unintentional act, and, by definition, if that leads to a bad outcome, it is a “system problem.” By “system”, we mean that we need to redesign the policies, procedures, equipment, displays, etc. to fix it. At-risk is an action that is a result of a person intentionally not following a policy or procedure, but doing it to work around something in order to facilitate getting the job done. This is a combination of an individual and system issue, but also still largely a system problem, as the system was not designed correctly if someone is having to “bend the rules” to get the job done. Reckless is just a disregard of the rules for no justifiable reason.

Coming back to this issue of “attention to detail”, where would this fall in the Just Culture matrix? It would be hard to argue that any lapse due to lack of attention to detail would constitute an intentional act to just disregard procedures. Perhaps in some cases (but not many) it could fall into the at-risk area, but most of the time, these would clearly be an error. The fix for errors is not to just tell someone “don’t do that!” I am reminded of the Bob Newhart video (linked here) where he plays the role of a psychologist who tells his patient to “just stop it” when she is expressing problems. If we are to correct these issues it will be through system and procedural design, not just telling someone to “stop it”.

The “system” should be designed with the knowledge of human error and cognitive limitations by people who have an expertise in the field. Too many times we see people put in positions and given titles with no real expertise. A person who has a combination of formal academic training in human factors, coupled with experience in the field, would be ideal, as that person could integrate the academic knowledge with the “real world” to truly create systems that could capture error. As an alternative, two people, one with academic knowledge, and one with extensive operational experience, could be coupled together to attack these types of problems.

Relying on people who do not possess the knowledge and experience is how we created these flawed systems in the first place. We, as an industry, can do better.

Easier for some to read.
 
The conclusions the author arrives at are also supported by the Asiana airlines accident @ KSFO. The situations are eerily similar.:eek2:
 
I will come back to the specific errors as I don't have time to write it out at the moment, but will say that his reporting of the timing of events, the amount of control input and when, are at the very least misleading. The reporting of the salience of various cues is not accurate. If you read the actual report and look at the actual data from the DFDR and CVR you will see it. His characterization of the crew interactions is not accurate, as well as the implication that there was something wrong with when the Capt chose to go back (easy to see in hindsight, but not a surprising decision in reality), and the "inexperience" of the F/O, who had more flying time than many RJ captains out there.

In general, though, the premise of this article is wrong, as it was not really an issue of a "weak pilot" but more an issue of mental models, high-gain control response (startle, an alternate control law that is essentially never trained, and no familiarity with handling qualities at that altitude with low dynamic pressure), plus a lot of emotional aspects. The latter was due to inexperience, but the stick and rudder skills were not a factor here, per se. It is apparent that nobody recognized the aerodynamic stall conditions. Much more experienced crews than this have fallen to a similar fate.

One thing to keep in mind is that from the time of autopilot disconnect to hitting the water was just 4 minutes. Not a lot of time to sort out a number of conflicting cues and undo a flawed mental model (confirmation and expectation bias). Again, read my article.

http://airlinesafety.wordpress.com/2014/04/21/the-role-of-cognitive-bias-in-aircraft-accidents/

It is easy to view this through the lens of "hindsight bias". Errors are easy to see in retrospect. The combination of factors was a bad one. Proper training in the full envelope of the aircraft in all flight control modes and altitude regimes would have likely prevented this. So would have better crew training in airborne weather radar. Startle response training would also have been beneficial.

I found it interesting that the most significant factor in this accident, confirmation bias, is also what led to Langewiesche ignoring so many of the facts to come up with the simple "weak pilot caused it" approach in the article itself.
 