(NB: the text below may be interesting, BUT it is old and needs a fair amount of updating – not yet done!)
Confirmation bias (see also http://culturalmedicine.se/svensk-del/besserwissrar-och-vi-ovriga/) means selecting information based on one's own subjective world map / perception of "reality" – usually without being aware of it.
For an example of how perception shapes this, see https://sv.wikipedia.org/wiki/Konfirmeringsbias
Since we do not have precise knowledge, and do not always critically review information and our inner–outer elaborations, confirmation bias forms both conscious and non-conscious parts of the scientific paradigm (Thomas Kuhn, https://www.theguardian.com/science/2012/aug/19/thomas-kuhn-structure-scientific-revolutions, and Karl Popper's Three Worlds – link on ??)
Placebo and nocebo can be seen as examples of (probably usually non-conscious) "confirmation bias" based on "internal tacit knowledge".
BUT – we need to match new information against, among other things, old, already established spatial limbic memory clusters – including confirmation bias. So how can we avoid unconscious confirmation bias?
The answer, as I see it, is: we should not avoid it, but make it clear/conscious when we do so, by:
Being aware that our subjective world map (see ..), however scientific we may be, is based on cultural internalizations / mirror-neuron / … processes (Karl Popper's Worlds 2 and 3) = our knowledge is non-absolute, resting on a paradigmatic foundation that we have faith in but can constantly question calmly and rationally – critical rationalism, in my opinion
Trying to acquire a superior, overall generalization methodology – trying to understand what we have not previously elaborated, through reasoning and the Thales principle, http://stressmedcenter.com/vetenskap
Avoiding basing ourselves on subjective values/opinions/preconceptions, etc.
Weighing the argument rather than the opposing perspective – as, for example, Protagoras suggests. The above is not easy; it requires training to really increase our understanding of complex processes, where we are now also beginning to be able to integrate human and artificial intelligence processing!
Below, supporting links and comments …
Strong convictions can blind us to information that challenges them https://neurosciencenews.com/convictions-information-16467/
Summary: People fail to process information that contradicts their convictions. A new study explains the neural processes that contribute to confirmation bias.
NOTE: automatic translation below – not yet tidied up.
When people are very confident in a decision, they take in information that confirms their decision, but fail to process information that contradicts it, a UCL brain imaging study finds.
The study, published in Nature Communications, helps explain the neural processes that contribute to the confirmation bias rooted in most people’s thought processes.
Lead author, PhD student Max Rollwage (Wellcome Centre for Human Neuroimaging at UCL and Max Planck UCL Centre for Computational Psychiatry & Ageing Research) said: “We were interested in the cognitive and neural mechanisms that cause people to ignore information that contradicts their beliefs, a phenomenon known as confirmation bias. For example, climate skeptics may disregard scientific evidence suggesting that global warming exists.
“While psychologists have long known about this bias, the underlying mechanisms were not yet understood.
“Our study found that our brains become blind to opposing evidence when we’re very confident, which may explain why we don’t change our minds in light of new information.”
For the study, 75 participants completed a simple task: they had to assess whether a cloud of dots was moving to the left or right side of a computer screen. They then had to give a confidence rating (how confident they were in their response), on a sliding scale from 50% sure to 100% sure.
After this initial decision, they were shown moving dots again and asked to make a final decision. The information became even clearer the second time and could help participants change their minds if they had first made a mistake. But when people were sure of their original decision, they rarely used this new information to correct their mistakes.
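The two-stage design above can be caricatured in a small simulation. This is purely an illustrative sketch, not the authors' code: the coherence values, the confidence criterion, and the down-weighting factor for contradicting evidence are all my own assumptions.

```python
import random

random.seed(1)

def sample_motion(direction, coherence):
    """One noisy evidence sample; its sign suggests left (-1) or right (+1)."""
    return direction * coherence + random.gauss(0, 1)

stats = {True: [0, 0], False: [0, 0]}  # confident? -> [changes of mind, trials]

for _ in range(20000):
    direction = random.choice([-1, 1])
    e1 = sample_motion(direction, coherence=0.3)   # weak first stimulus
    choice = 1 if e1 > 0 else -1
    confident = abs(e1) > 1.0                      # assumed confidence criterion
    e2 = sample_motion(direction, coherence=1.0)   # clearer second stimulus
    # Confirmation bias: when confident, contradicting evidence is down-weighted
    contradicts = (e2 > 0) != (choice > 0)
    weight = 0.1 if (confident and contradicts) else 1.0
    final = 1 if (e1 + weight * e2) > 0 else -1
    stats[confident][0] += final != choice
    stats[confident][1] += 1

rate_confident = stats[True][0] / stats[True][1]
rate_unsure = stats[False][0] / stats[False][1]
print(f"changes of mind when confident: {rate_confident:.2%}")
print(f"changes of mind when unsure:    {rate_unsure:.2%}")
```

Under these assumed parameters, the simulated observer changes its mind far less often on high-confidence trials – partly because confident first impressions are stronger, and partly because the contradicting second stimulus is gated down.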
25 of the participants were also asked to complete the experiment in a magnetoencephalography (MEG) brain scanner. The researchers monitored their brain activity as they processed the movements of the dots.
Based on this brain activity, the researchers evaluated the degree to which participants processed the newly presented information. When people weren’t very sure of their original choice, they correctly integrated the new evidence. But when participants were very confident in their original choice, their brains were virtually blind to information that contradicted their decision but remained sensitive to information that confirmed their choice.
The researchers say that in real-world scenarios, where people are more motivated to stand by their beliefs, the effect may be even stronger.
Senior author Dr Steve Fleming (Wellcome Centre for Human Neuroimaging at UCL, Max Planck UCL Centre for Computational Psychiatry & Ageing Research and UCL Experimental Psychology) said: “Confirmation bias is often investigated in scenarios involving complex decisions on issues such as politics. But the complexity of such views makes it difficult to distinguish the various contributing factors to bias, such as wanting to maintain self-consistency with our friends or social group.
“By using simple perceptual data, we were able to minimize such motivating or social influences and pinpoint drivers of altered evidence processing that contribute to confirmation bias.”
In a previous, related study, the research team had found that people who hold radical political views — at either end of the political spectrum — aren't as good as moderates at knowing when they're wrong, even about something that has nothing to do with politics.
Because the neural pathways involved in making a perceptual decision are well understood in such simple tasks, this allows researchers to monitor the relevant brain processes involved. The researchers emphasize that an understanding of the mechanism that causes confirmation bias can help develop interventions that can reduce people’s blindness to contradictory information.
Max Rollwage added: “These findings are particularly exciting to me, as a detailed understanding of the neural mechanisms behind confirmation bias opens up opportunities for developing evidence-based interventions. For example, the role of mistaken trust in promoting confirmation bias shows that educating people to increase their self-awareness can help them make better decisions.”
About this neuroscience research article
By Chris Lane, UCL
The image is in the public domain.
Original research: Open access
"Confidence drives a neural confirmation bias" by Max Rollwage, Alisa Loosen, Tobias U. Hauser, Rani Moran, Raymond J. Dolan & Stephen M. Fleming.
Nature Communications, doi:10.1038/s41467-020-16278-6
Confidence drives a neural "confirmation bias" (abstract):
A prominent source of polarized and entrenched beliefs is confirmation bias, where evidence against one's position is selectively disregarded. This effect is most starkly evident when the opposing parties are highly confident in their decisions. Here we combine human magnetoencephalography (MEG) with behavioral and neural modeling to identify alterations in post-decisional processing that contribute to the phenomenon of confirmation bias. We show that high confidence in a decision leads to a striking modulation of post-decision neural processing, such that integration of confirmatory evidence is amplified while disconfirmatory evidence processing is abolished. We conclude that confidence shapes a selective neural gating for choice-consistent information, reducing the likelihood of changes of mind on the basis of new information. A central role for confidence in shaping the fidelity of evidence accumulation indicates that metacognitive interventions may help ameliorate this pervasive cognitive bias.
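The "selective neural gating" described in the abstract can be sketched as a confidence-dependent gain on post-decision evidence accumulation. This is a caricature under assumed parameters (the linear gain rule and its values are my own), not the paper's actual neural model:

```python
def post_decision_update(belief, evidence, confidence):
    """Accumulate one post-decision evidence sample with confidence-gated gain.

    belief: signed accumulator (its sign encodes the current choice)
    evidence: new signed evidence sample
    confidence: 0.5 .. 1.0, matching the study's rating scale

    Assumed gating rule: confirmatory evidence passes at full gain, while
    disconfirmatory gain shrinks linearly from 1.0 (at chance-level
    confidence, 0.5) to 0.0 (at full confidence, 1.0).
    """
    confirms = (evidence > 0) == (belief > 0)
    gain = 1.0 if confirms else 2.0 * (1.0 - confidence)
    return belief + gain * evidence

# At maximal confidence, contradicting evidence is fully gated out:
b = post_decision_update(belief=1.0, evidence=-3.0, confidence=1.0)
# At chance-level confidence, the same evidence flips the belief's sign:
b2 = post_decision_update(belief=1.0, evidence=-3.0, confidence=0.5)
print(b, b2)
```

The asymmetry is the point: confirming evidence is always integrated, so high confidence does not make the observer inert – it makes the observer selectively deaf to one side, which is exactly why changes of mind become unlikely.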