I don’t like risk matrices.
I have written blog posts, given speeches, and written comments on several draft consensus standards pointing out the flaws in using risk matrices in EHS decision-making. I continue to be frustrated by the insistence some registration auditors place on having them – even though there is no requirement in either ISO 14001 or OHSAS 18001 mandating their use.
Just last month, a registration auditor expressed his disapproval of a client’s aspect evaluation procedure because, as he put it,
“Where are the ranking numbers?”
So I was quite pleased when I stumbled upon the following YouTube video that set out in an explicit and graphic way why the majority of risk matrices are flawed.
There are three common problems associated with using a risk matrix –
1. As set out in this video, most numeric-based risk ranking tables are not based on a valid statistical approach and result in a biased analysis of the potential risks associated with the items being analyzed. Many times, the results do not even pass a “common sense” test when they are reviewed after the number-crunching is complete (i.e. “Does this result make sense based on what we know about our operation?”).
2. There is the temptation to use a risk matrix simply because there is insufficient information to do “a real analysis.” Rather than developing real data, numbers are simply assigned to educated guesses. The inevitable result is GIGO (Garbage In = Garbage Out).
3. Risk ranking tables are used to compare items that can’t meaningfully be compared on a single scale – in effect, ranking apples against oranges.
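To make the first problem concrete, here is a minimal sketch (the 5×5 matrix, probability bins, and dollar-loss bins are all hypothetical, not taken from any standard) of the kind of ranking reversal that multiplying ordinal likelihood and severity scores can produce: the risk with by far the larger expected loss receives the *lower* matrix score.

```python
def bucket(value, edges):
    """Return a 1-based ordinal score: which bin of `edges` the value falls in."""
    score = 1
    for edge in edges:
        if value >= edge:
            score += 1
    return score

# Hypothetical bin boundaries for a 5x5 matrix.
PROB_EDGES = [0.2, 0.4, 0.6, 0.8]    # annual likelihood -> scores 1..5
LOSS_EDGES = [1e4, 1e5, 1e6, 1e7]    # severity in dollars -> scores 1..5

def matrix_score(prob, loss):
    """Typical risk-matrix ranking: likelihood score x severity score."""
    return bucket(prob, PROB_EDGES) * bucket(loss, LOSS_EDGES)

# Risk A: rare but very costly.  Risk B: moderate on both axes.
risk_a = (0.19, 9_900_000)    # expected loss = 0.19 * 9.9M = $1,881,000
risk_b = (0.45,   110_000)    # expected loss = 0.45 * 110k =    $49,500

score_a = matrix_score(*risk_a)   # scores (1, 4) -> 4
score_b = matrix_score(*risk_b)   # scores (3, 3) -> 9

# Risk B outranks Risk A on the matrix (9 vs. 4), even though Risk A's
# expected loss is roughly 38 times larger - the matrix flunks the
# "common sense" test described above.
```

Both risks sit just inside their bins, which is exactly where coarse ordinal scales break down: a small change in the underlying numbers can flip a risk across a bin boundary and reverse the ranking.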
So, can a risk matrix ever be used?
Sometimes, if certain conditions are met.
Want to find out more? Click here to read what they are in the latest EHS Management System Update Newsletter – Apples or Oranges – Which is Better?
Want to subscribe to this newsletter? Use the sign-up box below.
© ENLAR Compliance Services, Inc. (2013)