Mental maps

We all use mental maps to navigate our world. We have a geographic mental map, our memory of places and routes, that helps us get where we need to go. And we have a societal, or behavioural, mental map: our personal blueprint of who is who and how to get things done. Geography is objective and mostly constant: things are where they are. Occasionally new routes or buildings appear, but these are easily assimilated into our existing map with a few small changes. We don’t lose sleep over them. Our view of people, procedures and society, on the other hand, is subjective and constantly being challenged. Everyone’s experiences are different, and every day brings new ones that we must incorporate into our maps to help us predict how best to handle tomorrow.

Assimilate or accommodate?

As with geography, we assimilate minor changes easily. When we learn who to ask to order a reagent, or some new detail about a protocol, these are not sources of anguish. The map is recognisably the same, just with a minor tweak. But some new experiences are so far-reaching that they challenge our fundamental beliefs and upset us. When someone we trusted exploits us, presenting our idea as their own or gossiping about something we told them in confidence, it shatters our trust in them and may even force us to review our trust in others too. Our first experience of a grant or paper rejection can throw us into turmoil, bedevilled by the question of whether it was us or the decision process that failed. And each time we experience overwhelm we are forced to stop and question how it happened and what we could change to avoid it.

Confusion and doubt
These are double hits: not only do we have to deal with the setback itself but it throws us into mental confusion about why we got it so wrong and what else we might be misinterpreting. The mismatch between our beliefs and experience is known as cognitive dissonance. It feels bad, even if we don’t acknowledge it consciously, because it suggests something about our mental map is fundamentally wrong. A minor tweak won’t resolve it this time: it needs completely redrawing to accommodate this new experience. Only then can we decide how to move forward. But does it have to feel bad? What if we can understand and accept the process and see it as a path to growth?
“You look like the world is against you”, remarked a more mature and remarkably perceptive colleague one day when I was a (very naive!) postdoc. “It is!”, I replied grumpily, and I truly meant it. But that evening I looked back and rethought. The simple fact that these words have stayed in my mind ever since shows how important they were. We all have occasional days when we feel this way, but in that particular period of my life a number of important things had not gone my way, most of them outside work. Somehow I’d got locked into this unhelpful way of thinking, and it was this comment that shook me out of it.

All of us have more potential than our progress suggests. We sense it in our bones and look for explanations. Some of these lie within us and some outside, but because of self-serving bias the outside ones are much easier to see and more comfortable to think about. So we attribute explanations wrongly, exaggerating the role of external events in our adversity. This makes us feel ‘better’ in the short term, but at serious long-term expense. On days when we feel overwhelmed and everything falls apart, it’s hard to stop ourselves looking around for someone to pin ‘fault’ on: the previous user of the equipment we find broken, the administrator who hasn’t dealt with our request yet, the person who trained us but missed out one crucial detail, the reviewer who misunderstood our paper, the senders of our hundred unanswered emails – we have a long list to choose from.
Our chosen scapegoats may not be perfect, but neither are we! We are, after all, the one who tried to cram an unrealistic amount into our day, despite years of evidence that no day ever goes quite to plan! To expect other people to attend to our needs instantly and perfectly, when they may also be overburdened or under the weather, is simply unrealistic. And if a reviewer misunderstands our paper, doesn’t this mean some readers might also struggle to follow it? The need for us to clarify the wording is important feedback. But in our mind, it’s their fault, pure and simple!

Attribution errors sit alongside misunderstood emotions and heuristics among the flaws in human thinking, and scientists are no different. What role do these play in making our research lives more difficult than they need to be? How can we get a better grip on reality to avoid this? And how do we avoid going too far the other way, blaming ourselves for everything?

Here’s a particularly perceptive comment that has stuck in my mind, from a fellow cyclist on a group ride one Sunday morning: “Have you noticed the wind is either against you or you’re feeling really good today?”. If a ride feels easy, it’s never because there is a howling tailwind blowing you along; no, you’ve just miraculously become super-fit. A deep comment indeed for a bleary-eyed Sunday morning!

Of course as scientists we would never do this, would we? We would never see a funded grant as a sign that we wrote a good project but question the judgement of the committee if our next grant application is rejected. And if we arrive late for a meeting, we would never blame the traffic that held us up, rather than the fact that we left it too late starting our journey there. Of course not! Self-serving bias is everywhere, and this includes scientists – all of us! We don’t usually see it, at least not in ourselves, because like everyone we have a ‘bias blindspot’. Only ‘other people’ are irrational, not us!
Self-serving bias has a close cousin: in-group bias. We may tell ourselves our group does better science than our competitors, that our generation has dealt with more hardship than others, that ‘all’ people in other job types (administrators, journal editors, funders, support staff, etc.) are not as committed as we are, or are less reasonable people.
As supposedly rational scientists, shouldn’t we make sure we have all the information we need before we form these beliefs? What makes us think we have sufficient knowledge of those other people’s lives, or indeed that we have an objective view of our own life? But every day we find ourselves slipping into these ways of thinking. We are human after all, and these are common thinking errors that occur over and over in human thinking. But here’s the big question: how much anguish do we create for ourselves, and for our colleagues, by thinking this way? We know how it goes – a day spent reacting to other people’s emails or “quick chats” while our own to do list remains untouched at the end. Or losing ourselves in the first item on our list so other, often more important tasks don’t get a look in.
Yet another day of not living up to our potential, but we know it’s there inside us. Sports psychologist and coach Tim Gallwey summed this up with the equation: Performance = Potential − Interference.
Interference, in turn, is made up of external and internal factors. We often prefer to blame the external ones, remaining blind to the many ways we indulge in self-sabotage. So, what else is going on inside us that holds us back? The heuristics, or thinking shortcuts, described in Tversky and Kahneman’s landmark 1974 article play important roles here, especially when we are unaware of how much we use them. The previous article describes how one of them, representativeness, plays out in a scientific workplace; this one covers the other two: availability and anchoring. Could greater awareness of these make our research lives a little more efficient and less hard to handle?

Nobel Prize winner Daniel Kahneman and his colleague Amos Tversky changed how psychologists, economists and politicians think about how we think. In 1974, they published a landmark article on flaws in human decision making, titled ‘Judgment under uncertainty: heuristics and biases’. Using familiar examples from everyday life, they communicated what psychologists had recognised for decades and quantified it with some simple but ingenious experiments. Their work helped spark the rise of the self-help industry and, along with their later work on loss aversion, earned Kahneman the Nobel Prize in economics (after Tversky’s death). He also disseminated these findings in many easily accessible forms, such as his books ‘Thinking, Fast and Slow’ and ‘Noise’, and in many podcast interviews. Crucially, they warned, “Experienced researchers are also prone to the same biases”. Indeed, Kahneman describes how they developed their hypotheses by looking at, and laughing at, the flaws in their own thinking, an inspirational example of how humility strengthens research.
So, half a century on, what can we learn from Kahneman and Tversky about the flaws in our own thinking? By becoming more aware of them, can we strengthen not only our research but also our wellbeing as we carry out that research? Can we apply more critical thinking to our own thinking?

When did you last compare yourself with someone else? Five minutes ago? Ten? Be honest! If you think you’ve not done it today, did you just wake up, or did you simply not notice? Now think about how you design a good experiment. If you work with patients, you would never compare questionnaire-based data from one group with biomarker measurements from another, would you? Or in cell culture experiments, would you compare micrographs from the test condition with western blots from controls? Of course not! No scientist would ever base conclusions on two unmatched datasets, would we?

Except when we do! When we compare our own presentation with the next speaker’s, this is exactly what we do, comparing our feelings with their outward appearance. We cannot see how we appear to others, and we know next to nothing about their feelings. We also do it when we compare our worth as a scientist with that of the person who just landed the job we applied for. We consider our latest rejection along with the battle scars of all our previous ones, while knowing nothing about their rejected applications, which could even outnumber ours. Negativity bias adds yet more fuel to the fire. So much for matched datasets and logical thinking!

We often do equally illogical things when something goes wrong. When an experiment fails, for example, we can quickly find ourselves asking which reagent is off, whether the equipment is faulty, or even looking for ways to pin the blame on someone else, perhaps even on the weather. Then later we find out we’d misread the protocol. When a reviewer criticises our paper, we imagine we know their personality and intentions, despite having no idea who it is and no facial expressions to guide us.
Why do we delude ourselves like this? This is our everyday life as researchers, getting ourselves into all kinds of misunderstandings and confusion: black-and-white thinking, confusing opinion with fact, misattributing causality, confirmation bias, undue negativity and more. We see it in others all the time but we are blind to it in ourselves. How much easier would our lives be with more clarity in our thinking? And can we at least move a little in that direction?
Author: Professor Michael Coleman (University of Cambridge), Neuroscientist and Academic Coach: discovering stuff and improving research culture
November 2025