
Sabine Hossenfelder on avoiding cognitive biases


Lost in Math

Sabine Hossenfelder, author of Lost in Math: How Beauty Leads Physics Astray, thinks that avoiding cognitive bias is much more important than most of the topics she discusses. She offers a must-see optical illusion:

She then adds,

If a research topic receives a lot of media coverage, or scientists hear a lot about it from their colleagues, those researchers who do not correct for attentional bias are likely to overrate the scientific relevance of the topic.

There are many other biases that affect scientific research. Take for example loss aversion. This is more commonly known as “throwing good money after bad”. It means that if we have invested time or money into something, we are reluctant to let go of it and continue to invest in it even if it no longer makes sense, because getting out would mean admitting to ourselves that we made a mistake. Loss aversion is one of the reasons scientists continue to work on research agendas that have long stopped being promising.

But the most problematic cognitive bias in science is social reinforcement, also known as group think…

Sabine Hossenfelder, “How Scientists Can Avoid Cognitive Bias” at BackRe(Action)

The biggest problem with group think, of course, is that only outsiders identify it easily and, well, they’re outsiders…

See also: Sabine Hossenfelder on why the anthropic principle is controversial. It’s controversial because it is sometimes used to support the idea of a multiverse. Otherwise, it should be common sense to assume that any venue in which we exist must feature conditions that allow for our existence. But the multiverse does not need logic, evidence, or science.


One Reply to “Sabine Hossenfelder on avoiding cognitive biases”

  1. polistra says:

    Sabine underrates the forcible nature of two biases.

    Most people are prone to groupthink or following the status leader, but tenure WEEDS OUT scientists who are not status-followers. The population is skewed toward followers. Peer review then enforces groupthink even more strongly among the selected ones.

    Amortizing or sunk cost bias is enforced (at least in the US) by the budget system of government grants. If you have a million-dollar grant and you find after two months of research that your hypothesis is poor, you should quit…. but you can’t. You need to use up all of the million ON THIS PROJECT before the end of the grant period, or else you won’t get another grant… and the university won’t get its 40% overhead share of the next grant. Other countries have a more sensible system where each lab gets a fixed amount of money per year, regardless of projects or completion.
