DropTank
Well-Known Member
Oh boy guys...alright, we'll do this. I meant that the automation causes accidents that probably wouldn't have occurred otherwise. You know, based on the context of what we're talking about. Like plowing straight ahead into a pedestrian, or zigging when it should have zagged and turning into a big fireball. And before you ask, yes, I'm using hyperbole. But it seems like something like this happens more frequently than it should. Now scale that up to aviation and you've got issues.

It might work perfectly 99% of the time, but as always, it's that pesky 1% that's the problem. How often does your autopilot do something wacky and you just shut it off and hand-fly? I'd say once a month for me, at least. Works great until it doesn't, and then it's human intervention time.

I know we're all excited to live in that Star Trek & Jetsons future, but again, the existence of the technology is a big difference from it being ready for practical application in the real world. This is a puff piece for Ameriflight and Natilus designed to get people buzzing and generate a couple days of morning news segments. Look how long it took the FAA to warm up to GPS for navigation. You think this is right around the corner?
Look rookie, you're acting like your paragraph is some amazing new revelation of information that only you have.
To explain this to you, I'll apparently have to explain automation, statistics, and very basic engineering just to get to the point where you're laughably wrong.
Move along, you're out of your league.