
Science Without Anguish
Michael Coleman's Blog


Don’t believe everything you think!

9/22/2025

 
When did you last compare yourself with someone else? Five minutes ago? Ten? Be honest! If you think you’ve not done it today, did you just wake up or did you not notice?
 
Now think about how you design a good experiment. If you work with patients, you would never compare questionnaire-based data from one group with biomarker measurements from another, would you? Or in cell culture experiments, would you compare micrographs from the test condition with western blots from controls? Of course not! None of us would ever base conclusions on two unmatched datasets, would we?
 
Except when we do! When we compare our own presentation with the next speaker’s, this is exactly what we do, comparing our feelings with their outward appearance. We cannot see how we appear to others and we know next to nothing about their feelings.
 
We also do it when we compare our worth as a scientist with the person who just landed the job we applied for. We consider our latest rejection along with the battle scars of all our previous ones, while knowing nothing about their rejected applications, which could even outnumber ours. Negativity bias adds yet more fuel to the fire. So much for matched datasets and logical thinking!

We often do equally illogical things when something goes wrong. When an experiment fails, for example, we can quickly find ourselves asking which reagent is off, whether the equipment is faulty, or even looking for someone else to blame, perhaps even the weather. Then later we find out we’d misread the protocol. When a reviewer criticises our paper, we imagine we know their personality and intentions despite having no idea who they are and no facial expressions to guide us. Why do we delude ourselves like this?
 
This is our everyday life as researchers, getting ourselves into all kinds of misunderstandings and confusion: black-and-white thinking, confusing opinion with fact, misattributing causality, confirmation bias, undue negativity and more. We see it in others all the time but we are blind to it in ourselves.
 
Why do we do this to ourselves? How much easier would our life be with more clarity in our thinking? And can we at least move a little in that direction?

Thinking traps and blindspots

“Man is not a rational animal; he is a rationalising animal”, Robert Heinlein, ‘Assignment in Eternity’ (1953)
 
Our days flash by at an astonishing pace. Our heads contain amazing computing power but, like the AI that tries to mimic us, that power takes a lot of energy. The rational part of our brain needs even more energy and, to make matters worse, it is slow. No matter how clever we are, running through the endless possible thoughts and actions in any given situation would quickly fry our brain. We have to narrow them down first.
 
We do this in two ways. First, we use short cuts, known as heuristics. These are rules of thumb based on generalisations from our past experience, which was never more than a partial reflection of reality and has since been distorted further by the limitations of our memory. They work most of the time, but they lack detail and are not good at handling exceptions.
 
Second, our emotions channel us towards some types of reaction and away from others. If our rational mind is involved at all, some argue, it’s only in making the final decision from the shortlist our emotions offer us.
 
But here’s the twist: we all have a deep, emotional need to believe we are a ‘good person’: self-serving bias. As civilised human beings, and scientists no less, we struggle with the idea that our emotions have such influence over our thoughts and actions, and that we simplify so much. So we invent supposedly rational explanations that we genuinely believe, tying our thinking in all kinds of knots to do so. Our bias blindspot makes us believe that only other people are irrational, but we are too. After all, to them we are ‘other people’!

Confirmation bias and cognitive dissonance

We get ourselves into even more scrapes when we confuse our simplified mental map of the world, our heuristics, with the real thing, known as naïve realism, and something unexpected happens that doesn’t fit our map. For example, many of us start our career believing that ‘big’ journals publish the best science. Some still believe this, but many of us have seen plenty of exceptions. The first time we see a flawed paper in a ‘big’ journal, it creates confusion. Our first instinct is to question our own understanding: “I must be missing something!”. Or we may continue to believe the paper even as evidence to the contrary slowly stacks up. Something deep down doesn’t feel right, though, so we seek confirmatory evidence to reduce the discomfort: “That famous PI must be right. Look at all those other ‘big’ papers they published!”. Eventually, the dissonance becomes too much and we change our mind about the paper, and perhaps our level of trust in ‘big’ journals too. Through such processes we learn that what we believed to be our ‘knowledge’ is actually just our trust in someone else’s views. Sometimes, to advance our knowledge we have to question that trust, and that can be painful. Perhaps this is why we tell ourselves all kinds of stories to dodge the discomfort of cognitive dissonance.

Whose fault is it anyway?
In our experiments, we learn just how difficult it is to establish causality. But in everyday life, with all its complexities and unknowns, we think we know. We underestimate the role of luck and of our unconscious minds, along with the choices we have in how readily we accept, and how we interpret, events beyond our control. “We control nothing, but influence everything”, says UCL Associate Professor and author Brian Klaas.
 
Consider a grant application. We control every word until we hit ‘submit’. The next steps are all down to luck: who reviews it, who presents it to the committee, their personality and mood on the day, which other proposals we are competing against, and how forcefully other committee members advocate for them. Yet many of us, as I did early in my career, see this process as objective. Once we are told the outcome, we are again in control of how we deal with it, but we don’t sense the tricks our unconscious minds play on us. We look for simple explanations for complex events, telling ourselves, for example: “My grant was rejected because I’m a useless scientist” or: “The reviewers were biased”, when neither of these extremes is likely to be true.
 
When looking at other people, however, we easily slip into thinking they have total control, so anything that goes wrong is their fault: the fundamental attribution error. For example, if a colleague turns up late for a meeting we assume they are disorganised or disrespectful, without considering whether they stopped on the way to help someone in urgent need.

Series 3: the price we pay for heuristics and biases
Series 3 of Science Without Anguish explores all of these thinking flaws and simplifications, along with negativity bias, the curse of knowledge and ‘unconscious bias’ or implicit association, with examples of the problems we cause for ourselves and our colleagues if we don’t occasionally stop to challenge our own thinking. 


Crucially, as illustrated at the beginning, being a scientist doesn’t stop us falling into these thinking traps. As human beings with needs and fears like everyone else, we are no different. But if we use that same questioning mind to think about our own psychology from time to time, it can help us. Physicists, says David McRaney, tell us there’s no chemistry without physics, and chemists say biology is nothing but complex chemistry. The reality, though, is that there is no physics, chemistry or biology without psychology.
 
Enjoy reading and please share with your colleagues!


Author

Professor Michael Coleman (University of Cambridge), Neuroscientist and Academic Coach: discovering stuff and improving research culture

Illustrated by Dr Alice White (freelance science illustrator)
