We know how it goes: a day spent reacting to other people’s emails or “quick chats” while our own to-do list remains untouched at the end. Or losing ourselves in the first item on our list so that other, often more important, tasks don’t get a look in. Yet another day of not living up to the potential we know is inside us. Sports psychologist and coach Tim Gallwey summed this up with the equation:

Performance = Potential − Interference
Interference, in turn, is made up of external and internal factors. We often prefer to blame the external ones, remaining blind to the many ways we indulge in self-sabotage. So what else is going on inside us that holds us back? The heuristics, or thinking shortcuts, described in Tversky and Kahneman’s landmark 1974 article play important roles here, especially when we are unaware of how much we use them. The previous article described how one of them, representativeness, plays out in a scientific workplace; this one covers the other two: availability and anchoring. Could greater awareness of these make our research lives a little more efficient and less hard to handle?

Availability

The availability heuristic, Kahneman and Tversky explain, is our tendency to base our thinking on the information most readily available to us at the time, while overlooking our wider experience. We are heavily influenced, for example, by the recency of an experience. If our experiment just failed or our grant was rejected, we can begin catastrophising, especially if we are tired or run down. On the worst days we may even slide into: “I’d better find another career!”

Perspective is the antidote. Pausing to consider the wider picture can play a huge role in smoothing out the unavoidable ups and downs of research life. How did we feel last week when we got an interesting result? Or when we got the offer letter for our current job or studentship? Whatever challenge we are facing, how did we overcome a similar one previously? And how big a problem is this anyway in the big scheme of our lives?

The availability heuristic may also contribute to imposter syndrome in several ways. Our own painful memories of struggles, rejections and self-doubt are far better known to us than those of other people, so they play a far greater role in our awareness. We are hard-wired to compare ourselves with others, but much of the time we only notice their successes.
So we base our comparison on totally unequal datasets: our fears and doubts versus their achievements. What an absurd thing for a scientist to do, but we’ve all been there! Even among other people, we tend to pay more attention to outstanding successes than to those who are struggling just like us. It is easy to get the impression that we are in a minority being left behind, but a more careful look around often reveals a large majority of peers striving and struggling just as we are.

Negativity bias

This problem is compounded by the fact that negative emotions and memories are often more easily accessible to us than positive ones, probably a reflection of how we evolved (more on negativity bias in a future post). So we often remember paper and grant rejections more strongly than acceptances or awards, fuelling an expectation of rejection as we write a new one; even at an unconscious level, this may manifest as procrastination. Similarly, we base conclusions about research culture on a few bad actors while overlooking the large majority of good colleagues around us.

The screen in front of us

We see the availability heuristic at work in electronic communications too. Screens just centimetres from our faces increasingly grab our attention in ways that a real-life colleague across the room does not. We often feel compelled to react to an email instead of staying proactive about our own list of priorities for the day. “Our email inbox is a great task list of other people’s priorities,” says leadership expert Conor Neill, and the immediacy of the screen in front of our eyes helps channel us down this spiral. Meanwhile the algorithms of the distraction economy intentionally predict and exploit what we will react to most, grabbing our attention and distracting us from the task at hand.
Some social media platforms in particular do this by creating an illusion of divisiveness that blinds us to the collaborative and supportive interactions all around us, which far outnumber the arguments they push at us.

Anchoring

The anchoring bias is our tendency to start from the first piece of information available to us, or from the status quo. Kahneman and Tversky demonstrated its hold over us by showing that even prior exposure to a randomly generated number can influence how we think. It plays out in interviews, where decisions are typically made in the first few minutes; we (or our interviewers) spend the rest of the time confirming that decision, and it takes something dramatic, such as a major faux pas or a spark of genius, to change it. A future article will consider just how hard it is to change our minds, and why.

Similarly, if we think we’ve found the ‘perfect’ job, it is particularly hard to remain open-minded towards others who report a negative experience there, and few of us actively seek them out. Once we become excited about it, confirmation bias is mostly in charge, and we may live to regret it.

Anchoring also stops us from challenging current working models in our science as much as would be useful. And we continue to accept workplace practices that have long outlived their usefulness, such as the extraordinary inefficiencies of scientific publishing in a world of instant and free communication. Other examples include sticking too rigidly to our original project when new opportunities present themselves; if Alexander Fleming had done this, the world would have waited much longer for antibiotics. We struggle to let go of our original hypothesis when the data argue against it. And we may stay in a job we don’t like for far too long because we have lost the wider perspective of what the alternatives could be.

What can we do about it?

As outlined in Series 1 and 2, so many of our problems come from misunderstanding our emotions.
Emotions channel our limited supplies of energy and attention to specific areas of our lives so that we can bring all our resources to dealing with challenging issues. This can be extremely useful, bringing us the motivation to get the job done, the joy and excitement when things go well, or the sadness that forces us to slow down and reflect when they don’t. But our emotions evolved for a very different environment, not for the complexity of modern society with its unpredictable chaos.

Chaos is scary. We don’t like to think about how much randomness there is in our lives, or how little of it we can understand or predict. We feel more secure if we believe we know more than we do, and we easily slip into treating our heuristics as conclusions rather than hypotheses. When things don’t work out we resort to blaming “other people”, but while some people do let us down, there is usually little we can do to change them. So we lock ourselves into a circle of confirmation bias and disappointment. The alternative is to accept the outcome as soon and as painlessly as we can, adjust our expectations (our models) and so raise our chances of success next time.

For big decisions, Kahneman suggested deliberately exploring our thoughts on multiple days, when we are in different moods (to ‘sleep on it’), so we can be more confident that these decisions hold regardless of each day’s emotions. Similarly, we can minimise anchoring by deliberately recalling the facts in a different order. We can purposely look for perspective, celebrate our successes so we remember them better, and move tasks from our email inbox to our to-do list, where they have to compete with our own priorities. We can have the humility to accept that we may be wrong, and perhaps even enjoy the learning opportunity of becoming ‘less wrong’. But doing so requires an environment of psychological safety, which explains why that is so important for learning.
So, can we use objective, rational thinking to make better decisions and lessen our anguish when things don’t turn out as hoped? Yes, to some degree. We all have days that are harder, when we are tired, overstretched or run down. But we can always try to recognise that our models are just hypotheses, that many of our plans have a high chance of being knocked off course by events beyond our control, and still have the courage to go ahead with our best guess at the right course of action.
And just like in research, occasionally luck will go our way.
Author: Professor Michael Coleman (University of Cambridge), Neuroscientist and Academic Coach: discovering stuff and improving research culture

November 2025