flyover
Doug Taylor said: So whenever the check airman says, "Using your systems knowledge, work out this problem," he's actually doing you a favor, helping you think outside of the box.
Of course using system knowledge and thinking outside of the box got a lot of people in a lot of trouble in the past. Which is why nowadays the emphasis is on flying the airplane, following procedure, isolating the problem and flying the airplane.
On another thread: Putting the go/no-go decision into the hands of automation is not something I'm interested in. Don't misinterpret automation as some magic box that has the big picture, knows how to fly an airplane, and understands the percentages of risk involved in aborting or continuing the takeoff roll.
Of course that's exactly what magic boxes do best. The whole history of aborts and outcomes can be analyzed and the best course of action deduced for a given scenario. All with no time or emotional pressure. Then when the box recognizes the scenario, the correct response comes out while the pilot is still mid "what the.......".
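To make the idea concrete, here's a toy sketch of that "analyze offline, decide instantly" split. Everything in it is invented for illustration (the scenario names, the outcome scores, the `build_policy` helper); the point is only that the expensive analysis happens on the ground, and the runtime decision becomes a simple lookup:

```python
# Toy sketch, not real operating data: score historical outcomes per
# (scenario, action) pair, keep the best action per scenario, and the
# runtime "decision" collapses to a dictionary lookup.
from collections import defaultdict

# Hypothetical historical records: (scenario, action_taken, outcome_score)
# where a higher score means the event ended better.
history = [
    ("engine_fire_below_v1", "abort", 0.95),
    ("engine_fire_below_v1", "continue", 0.40),
    ("tire_burst_above_v1", "abort", 0.30),
    ("tire_burst_above_v1", "continue", 0.85),
]

def build_policy(records):
    """Average the outcome score for each (scenario, action), then
    keep the best-scoring action for every scenario."""
    totals = defaultdict(lambda: [0.0, 0])
    for scenario, action, score in records:
        entry = totals[(scenario, action)]
        entry[0] += score
        entry[1] += 1
    best = {}
    for (scenario, action), (total, count) in totals.items():
        avg = total / count
        if scenario not in best or avg > best[scenario][1]:
            best[scenario] = (action, avg)
    return {scenario: action for scenario, (action, _) in best.items()}

policy = build_policy(history)
# At runtime: recognize the scenario, emit the precomputed response.
print(policy["tire_burst_above_v1"])  # continue
```

No claim that a real system would be this simple; the sketch just shows why the box has no deliberation to do in the moment.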
Automation is a box programmed by engineers who may or may not understand the dynamics of flight, and it will execute responses programmed by engineers who may or may not understand them either.
Again, it's a false argument if you talk about the failings of automated systems and not the dismal record of line pilots in this area. And I'm not faulting the pilots. It's impossible to make a rational decision in the split-second you have on an abort. Not enough time, not enough information, not a fast enough processor and inevitably emotion comes in. The good news is this one can be pretty easily solved by future technology.
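For a sense of why a machine wins on speed here, consider a deliberately oversimplified go/no-go rule keyed to the decision speed V1. The threshold logic and failure categories below are placeholders I made up for illustration, not anyone's real abort criteria:

```python
# Hypothetical go/no-go rule sketch. Below V1 an abort is an option;
# at or above V1 the historically safer choice is usually to continue
# unless the airplane cannot fly at all. All categories are invented.

def go_no_go(speed_kt: float, v1_kt: float, failure: str) -> str:
    catastrophic = {"uncontrollable_fire", "loss_of_directional_control"}
    if speed_kt < v1_kt:
        # Before V1, abort for any significant failure.
        return "abort" if failure != "none" else "continue"
    # At or above V1, continue unless the failure is catastrophic.
    return "abort" if failure in catastrophic else "continue"

print(go_no_go(70, 140, "bird_strike"))   # abort
print(go_no_go(145, 140, "bird_strike"))  # continue
```

A rule this crude evaluates in microseconds with no adrenaline involved, which is the whole argument: the hard part (deciding what the rule should be) gets done beforehand, with time and data, not on the runway.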
Not only is this old technology, very old, but it's partially French. So not a good example. Remember the Airbus crash? Why would a plane fly so low and not want to land? Adding power or increasing back pressure isn't consistent with an airplane in the landing phase, so let's go ahead and override the pilot's response consistent with a safe landing.
WHAP!
Or you're accelerating through about 70 knots and there's a flock of Canadian geese not paying attention and an antelope traversing the runway during takeoff roll in a DC-9. Do you continue? Do you abort? What would the automation do with what it can't see? What if the automation *could* see it, what would it do? If you hit the animals, had an engine fire and aborted, would it command evacuation? Or does the captain determine if it requires evacuation?
Don't see any problems with designing systems to handle all those scenarios. Certainly no problem with automation seeing obstructions ahead (see self-driving car link above). Are you saying captains make consistently correct evacuation decisions? I don't think the data would support that conclusion.
Relax, Doug. This is about after your career and after I'm dead. So you can take some solace in that. If I'm wrong, be kind; remember, I'll be dead.