Zoryna O’Donnell takes a hard look at how decisions are made during change.
“The only constant in life is change,” said Heraclitus over two thousand years ago, yet we still often find change rather stressful and difficult to deal with.
This is because our brains are incredibly uncomfortable with ambiguity, uncertainty and change. They have evolved to operate like super-fast and energy-efficient “prediction machines” with the sole purpose of keeping us alive and safe, and they do it by relying on a plethora of shortcuts and patterns.
Admittedly, this modus operandi of the brain was helpful to our prehistoric ancestors, who lived in a reasonably stable world alongside powerful predators that would happily devour them at the earliest opportunity, hence it stuck with us. However, it is less helpful in the modern VUCA world, with its ever-increasing volatility, uncertainty, complexity and ambiguity, where an unprecedented degree of flexibility and adaptability is required from organisations and individuals in order to succeed. This flexibility and adaptability comes with a high price tag in terms of rising stress levels and deficiencies in our decision-making – at both organisational and individual levels.
If you are interested in learning more, Hilary Scarlett, author of Neuroscience for Organizational Change, gives some useful insights into how our brains react to change (and how best to handle it during times of organisational change) in her short video on YouTube.
Whether we are stressed or not, the ability of leaders to make sound decisions is essential for any organisation, and especially so during the periods of change. Yet decision-making during change can be particularly difficult because:
a) there is a need for swift decisions in a situation of uncertainty (a “threat state” neurologically);
b) biases have a constant impact on decision-making (and, according to research undertaken by Susan Fiske, leaders are more prone to biases and stereotyping than your average Joe Bloggs); and last but not least,
c) emotions also have a significant impact on our decisions.
In his book Thinking, Fast and Slow, the Nobel Prize winner Daniel Kahneman described the following two thinking and decision-making systems:
System 1 thinking
- Uses the amygdala which, along with many other functions, processes emotions.
- Fast, automatic, reflexive.
- Information dealt with “below the surface” and without us being consciously aware of it (i.e. reacting).
- Not prone to doubt, suppresses ambiguity and relies on shortcuts.
System 2 thinking
- Uses the lateral prefrontal cortex (LPFC) which is involved in many higher cognitive processes such as working memory, goals, planning and self-control.
- Slower, deliberate and takes more effort.
- Information dealt with consciously, by way of deliberate thinking (i.e. responding).
- Prone to doubt and procrastination.
System 1 thinking and decision-making tends to take over during the stressful times of change, when our brains go into an “autopilot” mode with its unconscious shortcuts and biases.
Robert Cialdini in his book Influence: Science and Practice lists the following six shortcuts typically used by our brain as a guide to what is a good decision:
- Reciprocation (we have a strong sense that if someone does us a favour, we should repay this favour);
- Commitment and consistency (we are more likely to keep our commitments if we have put them in writing or if we have made those commitments in front of other people);
- Social proof (or “people who are like me” pattern – we often decide what is “correct” behaviour based on what other people do, particularly those who are like us);
- Liking and rapport (the more we like people, the more we are likely to say “yes” to them);
- Scarcity (opportunities and objects feel more desirable and valuable to us the less they are available);
- Expert and authority (if someone is perceived to be an expert, we are more likely to listen to their opinion; if someone has authority over us, we are more likely to comply).
We all need to be aware of these shortcuts and their potential implications for our decision-making, especially during the periods of organisational change.
Like shortcuts, biases (hidden beliefs) are automatic and we are not consciously aware of them. We are all biased but, as mentioned earlier, leaders are particularly prone to biases and, unfortunately, intelligence does not reduce bias. In fact, intelligence might make biases even more ingrained.
There are over 180 identified forms of cognitive bias – you can find all of them in the Cognitive Bias Codex. Some of these biases developed for expediency, others for self-protection. We need to remember that, left unchecked, cognitive biases can affect both our thinking and our actions. They can prompt us to use information from the wrong sources, seek to confirm our existing beliefs, or remember events differently from the way they actually happened.
Main biases to be aware of during the periods of change include:
- Ingroup and Outgroup bias (“us” and “them” bias – we feel safe with people who we think are on our side and are suspicious and frightened by those who are not);
- Survivorship bias (our tendency to focus on the winners in a particular area and try to learn from them, while completely forgetting about the losers who employed the same strategy);
- Confirmation bias (“my side” bias – our tendency to search for and favour information that confirms our beliefs while simultaneously ignoring or devaluing information that contradicts our beliefs);
- Sunk-cost bias (“throwing good money after bad” – it is hard for us to give up on something in which we have already invested a lot of time, money and other resources);
- Temporal value discounting (“better £1 today than £2 next month” – the reward processes in the striatum part of our brains tend to discount the value of future rewards, that is why some “quick wins” are so important for the success of change programmes);
- Projection bias (our tendency to assume that people think and see the world in the same way as us);
- Anchoring effect (the first piece of information we are given has an impact on how we perceive all subsequent information);
- Illusion of control (we tend to overestimate our influence over external events);
- Loss aversion (our tendency to strongly prefer avoiding losses over acquiring gains);
- Availability heuristic (our tendency to assume that the ideas and examples which come to mind most easily are also the most important or prevalent).
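The “better £1 today than £2 next month” intuition behind temporal value discounting is often modelled with hyperbolic discounting, where the subjective value of a reward is V = A / (1 + kD), with A the amount, D the delay and k an individual discount rate. A minimal sketch (the discount rate of 0.05 per day is an illustrative assumption, not a figure from this article):

```python
def discounted_value(amount, delay_days, k=0.05):
    """Hyperbolic discounting (Mazur's model): V = A / (1 + k * D).

    k is an illustrative per-day discount rate, not an empirical value.
    """
    return amount / (1 + k * delay_days)

# Subjective value of £1 now vs £2 in 30 days:
now = discounted_value(1, 0)     # 1.0 - no delay, no discount
later = discounted_value(2, 30)  # 0.8 - the future £2 "feels" smaller
# now > later, so the immediate £1 wins - which is why visible
# "quick wins" help sustain support for a change programme.
```

The steeply falling value of delayed rewards is the reason the article stresses early, tangible wins rather than distant promised benefits.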
In her book Neuroscience for Organizational Change, Hilary Scarlett offers the following tips for overcoming bias:
- Build awareness of the fact that our brains lead us to be biased (and that there is no need to feel guilty about it);
- Don’t rely on self-monitoring of biases. Biases are largely unconscious, therefore we are more likely to notice biases in others than in ourselves;
- Be open to challenges and curious about different views and ways of thinking;
- Put plans in place to prevent biases from colouring decisions (clear decision-making guidelines will help);
- Be aware of the limitations of your own thinking and set aside some time for reflection;
- Biases tend to creep in when goals and processes are unclear, so set up clear and fair procedures to guide behaviour;
- Be aware of your emotions, but stick to the evidence;
- Prime yourself and other decision makers not to be biased;
- Since we rarely notice our own biases, appoint somebody to play “Devil’s Advocate” and challenge the assumptions that are being made.
And remember: decision-making takes its toll on our energy and self-control, so it is worth considering:
- Whether or not “big” decisions are suitable for meetings, and limiting the number of “big” decisions on the agenda;
- Timing your decision-making meetings earlier in the working day;
- Potential trade-offs in terms of speed vs accuracy, and whether decisions should be taken immediately or whether it would be helpful to “sleep on them”;
- Systems 1 and 2 thinking and whether decisions are being swayed by System 1;
- How you can mitigate the impact of decision-making shortcuts and biases.
Now that you have read this article, you have a better chance of making sound decisions to aid change processes in your organisation.
For expert support please contact Zoryna [email protected]
Please get in touch or book a call. We’d love to chat.
Cialdini, R.B. (2001) Influence: Science and Practice, Allyn and Bacon, Needham Heights, MA
Fiske, S.T. (1993) Controlling other people: the impact of power on stereotyping, American Psychologist, 48 (6), pp. 621-628.
Kahneman, D. (2012) Thinking, Fast and Slow, Penguin Books, London
Scarlett, H. (2016) Neuroscience for Organizational Change: An evidence-based practical guide to managing change, Kogan Page, London-Philadelphia-New Delhi.
West, R.F., Meserve, R.J. and Stanovich, K.E. (2012) Cognitive sophistication does not attenuate the bias blind spot, Journal of Personality and Social Psychology, 103 (3), pp. 506-519.