RCU Forums - Ama should have left faa alone!
04-07-2015, 04:33 PM
#106
804
Senior Member
Join Date: Sep 2005
Location: sheridan, IN
Posts: 1,167

Originally Posted by franklin_m
Don,

Excuse the long post. But since you asked what I'd do, I wanted to provide a full and complete answer.

Every successful safety management system (i.e. "programming") I've seen, whether personal safety, process safety, or aviation safety, has shared several key features.

First, a person with formal safety training who reports directly to the organizational head on safety matters. In industry, they tend to be CSPs (Certified Safety Professionals); in the military, they are graduates of formal aviation safety schools (my experience).

Second, unambiguous operational and safety rules. I'm not talking wishy-washy stuff like setbacks that range from zero to some number (and are waiverable at very low levels of the organization); I'm talking about genuine, risk-managed hard distances that are not easily waiverable. Yes, waivers exist, but they're rare and approved over someone's signature – an act that puts someone on paper as being accountable for the decision. It's amazing how that alone changes how people think about safety.

Third, the organizations use leading vice lagging metrics. Lagging metrics are injuries and insurance payments. Leading metrics would be non-injury mishaps, near misses, equipment failures, rule violations that don't result in mishaps, etc. Perhaps focused tracking of crashes of certain aircraft types (large/fast for example). These allow you to do trend analysis as well.

Fourth, accountability. The effective programs I've seen or been part of ensure that enforcement is firm, fair, and - most importantly - consistent. I'm not saying you crush people for minor violations, but you do document and track them – all of them. Why? Those are called leading metrics! The reality is that, absent accountability for not following rules, the culture develops what is called "Normalization of Deviance." Simply put, it means the organization develops a culture that views rules like ice cream, easily melted. The distinction between big rules and small rules becomes blurred, and, worse yet, that distinction is in the eye of the beholder. It's chaos. If you want to see what happens in cultures like that, research the Texas City Refinery explosion, Shuttle Challenger, Shuttle Columbia, Three Mile Island, Chernobyl, and a host of others.

Fifth, a vibrant and honest safety communication system. The strongest cultures are very good at admitting mistakes, including those painful “pilot error” events. Only by being honest with ourselves can we really drive our risks down. It probably means fewer safety articles in MA written by English majors and more with hard technical content and first person stories about lessons learned.

Now, I'm sure there are some who will read this and react immediately to what they perceive as unrealistic, draconian, too many "rules," etc. Each is entitled to their own opinion. While what I've described above sounds complex, it's really not all that onerous. It's highly scalable and very easy to implement – if the organization has the will to do it.

My interest in this is that sooner or later we're going to have an encounter with a passenger-carrying aircraft, or an injury to a spectator, that makes the news unlike anything we've seen before. As much as we like to think otherwise, there are some AMA members who are not nearly so disciplined about following AMA rules as the commenters here. When that happens, the media and the regulators will dig into the details of the "programming" and discover there's a lot of writing, few hard limits, and little if any enforcement. Furthermore, they'll find that we have no data to prove we're as good as we say. Sure, some will argue that the absence of prior incidents shows we're safe; unfortunately, under modern safety theory, those interested in extracting money from AMA or using the incident to shut down flying sites will merely say we were lucky.

On the other hand, if we have a more professional safety management system ("programming") with the features I've described above, we'd likely prevent such an event in the first place by "trapping" the rogue AMA member's lesser violations; and if it's not an AMA member, we could point to such a system as proof of our safety, with mounds of data to back it up -- thus preserving everyone's ability to fly.
Or, how to turn a fun, safe hobby into a mass of finger-pointing, safety-nazi-programmed drones and kill it from within. No thanks; I'd rather let it die its own natural death.