After some recent dialogue with colleagues, but with some caution, I am returning to aspects of EOD Psychology. I have spoken at length with some leading medical doctors about how diagnostic decisions are made prior to and during complex surgery, and I have continued to devour what I think might be relevant literature.
Let me explain my caution first. When I trained as an EOD operator I received no instruction in dealing with cognitive biases, whether my own or those of the people I engaged with. When I commanded an EOD unit I had no real concept or understanding of the cognitive biases displayed by my teams. But I think I could tell a good operator from a bad one, and retrospectively I think it was those who were best able to make decisions under stress who stood out. Without realising it, I think I was identifying those who had techniques for coping with cognitive biases. Poor operators were ones whose cognitive biases overwhelmed them to a point of confusion.
In the (many!) years since I moved on from operational duties I have worked with a significant number of bomb squads and EOD operators around the world. I have also studied, on an amateur level, aspects of psychology that I felt were relevant. I have written some posts about this activity on this website before – you can find them by following the “EOD Psychology” tab on the sidebar to the right.
For what it is worth, I still consider myself very much an amateur in this field. But some of the lessons I have learned apply not only in the EOD world but in broader life and business, especially in complex projects.
One post that got some interesting personal feedback was the identification of techniques that EOD operators could use to “force” them past cognitive biases. I proposed what I called a pre-mortem technique to force a more objective analytical approach in certain planned EOD operations. So, under some pressure to come up with more, here’s a second technique which may have some utility. I’m thick skinned, so if you think this is nonsense, let me know. I’m fairly certain that at its worst, it can do no harm…. here goes.
I think that lessons can be learned from most EOD operations, but that most EOD operators are intrinsically poor at learning those lessons, due to cognitive biases. EOD operators (and frankly this applies in many other fields) are humans who need to force themselves to better identify “decision quality” from “outcome quality” and clearly differentiate between the two.
So, to give this context, ask yourself this question: in your last period of operational activity, which was the operation where you made the best decision? Think hard on that now before reading on….
Now… I’m willing to bet that many of you are now thinking about an operation that went well. But here is your mistake – you are probably thinking about the “outcome” of your decision, not the decision itself. It is really tricky to identify decision quality subjectively. But I genuinely think it is a skill that one can learn, and dare I say it also comes with age (I’m making a case for grey beards here!). So here is the technique I propose. It will not interrupt operational activity, but in after-action thinking it might help you train your brain to think more about decision quality:
After every operation, have a think and identify the best decision you made on that operation and the worst. Try to do that consistently. It works well for major business projects too, I think. You will probably find it tricky to start with, and only identify trivial decisions, but it will come as you “force” your brain to address its biases. After a while you will start to identify those decisions you make that have a “quality” that is perhaps unrelated to the outcome quality. You will build a personal awareness of those decisions you find easy and those decisions you find tricky. Self-awareness is the key. You might start to see patterns. I hope you will, and you can use your consequent understanding to make more good decisions and fewer poor decisions, notwithstanding the outcome of the operation. A “good enough” operation is not one where all your decisions will be satisfactory – use the opportunity! I would even recommend including these questions in post-operational reports, with a specific box for each. I would recommend instructors on training courses ask these of their students after a training task. I think it will encourage self-awareness, encourage a focus on decision making, and might even provide help to others in your unit.
Good luck. Tell me if it is nonsense. I welcome dialogue either directly at the email address top right or through the online comment section.
On a different point, I was talking to a well-respected neurosurgeon about decision making and he recommended two books on the subject. I was gratified that the two books he mentioned were ones I have found very helpful in thinking about this subject. So on the basis of his recommendation, not just mine, here they are:
1. Thinking, Fast and Slow, by Daniel Kahneman
2. Sources of Power: How People Make Decisions, by Gary Klein
This article is reprinted with permission from the author, Roger Davies.
For more great reading go to Standing Well Back