David Brooks, in his Friday column in The New York Times ("Drilling for Certainty"), highlights six important ideas that engineers should think about:
- People have trouble imagining how small failings can combine to produce catastrophic disasters. Typically it is the interplay between minor events that leads to unanticipated systemic crashes.
- People have a tendency to get acclimated to risk. When things seem to be going well, people unconsciously adjust their definition of acceptable risk. (Two weeks prior to the Deepwater Horizon accident, and after 40 years without a major offshore drilling release, we were poised as a nation for a major expansion in offshore drilling.)
- People have a tendency to place elaborate faith in backup systems and safety devices. More pedestrians die in crosswalks than while jaywalking. We end up with a false sense of security.
- We have a tendency to match complicated technical systems with complicated governing structures. Does anyone understand the management structure on the Deepwater Horizon during actual drilling operations or the management structure and responsibilities associated with the response efforts?
- People tend to spread good news and hide bad news. A culture of silence can settle over everyone from front-line workers who don't want to lose their jobs to executives who don't want to hurt profits.
- People in the same field begin to think alike, whether they are in oversight roles or not. Groupthink creeps into the decision-making process at alarming levels.
A future mandate for the engineering community needs to focus on a better understanding of the linkages and interfaces among the mechanical world, complexity in the context of catastrophic events, and human psychology.