
Science Without Anguish
Michael Coleman’s Blog


I’m good, we’re good

11/3/2025


 
Here’s a particularly perceptive comment that has stuck in my mind, from a fellow cyclist on a group ride one Sunday morning: “Have you noticed the wind is either against you or you’re feeling really good today?” If a ride feels easy, it’s never because there’s a howling tailwind blowing you along; no, you’ve just miraculously become super-fit. A deep comment indeed for a bleary-eyed Sunday morning!
 
Of course, as scientists, we would never do this, would we? We would never see a funded grant as a sign that we wrote a good proposal, yet question the judgement of the committee when our next application is rejected. And if we arrive late for a meeting, we would never blame the traffic that held us up, rather than the fact that we set off too late. Of course not!
 
Self-serving bias is everywhere, and that includes scientists – all of us! We don’t usually see it, at least not in ourselves, because like everyone else we have a ‘bias blind spot’. Only ‘other people’ are irrational, not us!
Self-serving bias has a close cousin: in-group bias. We may tell ourselves our group does better science than our competitors, that our generation has dealt with more hardship than others, that ‘all’ people in other job types (administrators, journal editors, funders, support staff, etc.) are not as committed as we are, or are less reasonable people.
 
As supposedly rational scientists, shouldn’t we make sure we have all the information we need before we form these beliefs? What makes us think we have sufficient knowledge of those other people’s lives, or indeed that we have an objective view of our own life? But every day we find ourselves slipping into these ways of thinking.
 
We are human after all, and these are common thinking errors that occur over and over in human thinking. But here’s the big question: how much anguish do we create for ourselves, and for our colleagues, by thinking this way?


The short cuts to lost potential

10/20/2025


 
We know how it goes – a day spent reacting to other people’s emails or “quick chats”, while our own to-do list remains untouched at the end of it. Or losing ourselves in the first item on the list, so that other, often more important, tasks don’t get a look-in.
 
Yet another day of not living up to our potential, though we know it’s there inside us. Sports psychologist and coach Tim Gallwey summed this up with the equation:
 
  • Performance = potential − interference
 
Interference, in turn, is made up of external and internal factors. We often prefer to blame the external ones, remaining blind to the many ways we indulge in self-sabotage.  So, what else is going on inside us that holds us back?
 
The heuristics, or thinking short cuts, described in Tversky and Kahneman’s landmark 1974 article play important roles here, especially when we are unaware of how much we use them. The previous article described how one of them, representativeness, plays out in a scientific workplace; this one covers the other two: availability and anchoring. Could greater awareness of these make our research lives a little more efficient and a little less stressful?



Heuristics: making sense of a complex world

10/6/2025


 
Nobel Prize winner Daniel Kahneman and his colleague Amos Tversky changed how psychologists, economists and politicians think about how we think. In 1974, they published a landmark article on flaws in human decision making, titled ‘Judgment under uncertainty: heuristics and biases’. Using familiar examples from everyday life, they communicated what psychologists had recognised for decades and quantified it with some simple but ingenious experiments. Their work helped spark the rise of the self-help industry and, together with their later work on loss aversion, earned Kahneman the Nobel Prize in economics (after Tversky’s death). He also disseminated these findings in many easily accessible forms, such as his books ‘Thinking, Fast and Slow’ and ‘Noise’ and many podcast interviews.
Crucially, they warned, “Experienced researchers are also prone to the same biases”. Indeed, Kahneman describes how they developed their hypotheses by looking at, and laughing at, the flaws in their own thinking, an inspirational example of how humility strengthens research.
 
So, half a century on, what can we learn from Kahneman and Tversky about the flaws in our own thinking? By becoming more aware of them, can we strengthen not only our research but also our wellbeing as we carry out that research? Can we apply more critical thinking to our own thinking?


Don’t believe everything you think!

9/22/2025

 
When did you last compare yourself with someone else? Five minutes ago? Ten? Be honest! If you think you’ve not done it today, did you just wake up or did you not notice?
 
Now think about how you design a good experiment. If you work with patients, you would never compare questionnaire-based data from one group with biomarker measurements from another, would you? Or, in cell culture experiments, would you compare micrographs from the test condition with Western blots from controls? Of course not! No scientist would ever base conclusions on two unmatched datasets, would we?
 
Except when we do! When we compare our own presentation with the next speaker’s, this is exactly what we do, comparing our feelings with their outward appearance. We cannot see how we appear to others and we know next to nothing about their feelings.
 
We also do it when we compare our worth as a scientist with that of the person who just landed the job we applied for. We consider our latest rejection along with the battle scars of all our previous ones, while knowing nothing about their rejected applications, which could even outnumber ours. Negativity bias adds yet more fuel to the fire. So much for matched datasets and logical thinking!

We often do equally illogical things when something goes wrong. When an experiment fails, for example, we can quickly find ourselves asking whether a reagent is off or the equipment is faulty, or even looking for ways to pin the blame on someone else – perhaps even on the weather. Then later we find out we’d misread the protocol. When a reviewer criticises our paper, we imagine we know their personality and intentions, despite having no idea who they are and no facial expressions to guide us. Why do we delude ourselves like this?
 
This is our everyday life as researchers, getting ourselves into all kinds of misunderstandings and confusion: black-and-white thinking, confusing opinion with fact, misattributing causality, confirmation bias, undue negativity and more. We see it in others all the time but we are blind to it in ourselves.
 
Why do we do this to ourselves? How much easier would our life be with more clarity in our thinking? And can we at least move a little in that direction?


    Author

    Professor Michael Coleman (University of Cambridge), Neuroscientist and Academic Coach: discovering stuff and improving research culture

    Illustrated by Dr Alice White (freelance science illustrator)


