So close and yet so far… It’s that time of term when my coffee cup is empty for the third time, it’s 2am, and my fingers are protesting at the thought of another paragraph.
This time though, I’m almost relishing it, because at the conclusion of this blog post, the completion of just one assignment is standing between me and graduation.
The assignment I completed 10 minutes ago was to do with Mount Erebus, my favourite accident (if it’s not too off to call an event where 257 people died a ‘favourite’ anything). It’s a case study full of mysterious optical illusions, conspiracy, betrayal, far-off places, miscommunication, shady characters, a family torn apart, and at least one very well-spoken fellow who isn’t afraid to call another fellow’s work an orchestrated litany of lies. I honestly don’t know why someone hasn’t made a fantastic drama/action movie out of it yet.
The sad part of it is that while all these dramatic and interesting things happened, there was a much more pervasive, subtle and dangerous contributor to the Mt Erebus disaster: the organisational and safety culture within Air New Zealand at the time. I used the Human Factors Analysis and Classification System (HFACS) to have a look at the disaster, and it seems that the organisational climate tended towards poor communication and little responsibility for actions, while the organisational processes were poorly designed, documented and coordinated.
I’m not sure how these general attitudes started, but they tend to manifest as a top-down culture. One thing leads to another: an important detail is miscommunicated because everyone should know what’s going on… the computer doesn’t alert anybody to an error… there’s odd weather that is conducive to whiteout… and BAM, plane meets mountain.
Before I started this course, I thought of human factors as a much smaller category than it is. It includes so much, as most things come down to people and their reactions, decisions, designs and work. I dove into the first assignment and got carried away in finding out all these new ideas and tools and studies… you know you’re somewhat of a geek when you’re up past midnight getting excited about theories of distributed cognition analysis.
I even had a bit of an ‘a-ha’ moment when my brain opened up to something I knew in theory but never really ‘got’ or understood how to find in real life: the theory behind HFACS suggests that something has to happen at every level – there’s apparently no such thing as an unsafe act without an organisational influence that somehow connects to it, and there are tools to help investigators find that link and put their finger on exactly what it is. Having such an in-depth look at HFACS did, however, remind me to carefully evaluate and know the limitations and gaps in the tools that I use, because so far, nothing is perfect.
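If I were to sketch that ‘every level has to be touched’ idea in code – purely my own toy illustration here, not any official HFACS tool, and with completely made-up findings – it might look something like this:

```python
# Toy sketch (my own illustration, not an official HFACS tool): the four
# HFACS levels, ordered from organisational influences down to unsafe acts.
HFACS_LEVELS = [
    "Organisational Influences",
    "Unsafe Supervision",
    "Preconditions for Unsafe Acts",
    "Unsafe Acts",
]

def missing_levels(findings):
    """Return the HFACS levels that an investigation's findings don't touch.

    `findings` maps a level name to a list of identified factors. Since the
    theory says every unsafe act traces back through all four levels, an
    analysis with gaps here is probably incomplete.
    """
    return [level for level in HFACS_LEVELS if not findings.get(level)]

# Hypothetical, heavily simplified findings, for illustration only.
findings = {
    "Organisational Influences": ["poor communication culture"],
    "Unsafe Supervision": [],
    "Preconditions for Unsafe Acts": ["whiteout conditions"],
    "Unsafe Acts": ["descent below minimum safe altitude"],
}

print(missing_levels(findings))  # → ['Unsafe Supervision']
```

The empty list at the supervision level is the sort of gap the framework tells an investigator to go digging into, rather than stopping at the unsafe act itself.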
Going into the future as a professional following this course (and this program), I will remember to look at the people in the equations, and not just in the sense of ‘did they do their job correctly or not’, but to look at their circumstances, training, rosters, management, culture and more. Even further than that, I will remember that human choice within a system can be a priceless, efficient quality that saves the day, just as it can produce accidents.
But before I can think too much about being a professional, I have to finish this program… I guess I’ll post when I know what it feels like to be one of the very first Bachelor of Accident Forensics graduates!