Not dog related, but it explains the difficulty of changing minds on BSL.
As a rule, misinformed people do not change their minds when presented with facts that challenge their beliefs. Worse, research shows they are likely to become even more attached to their mistaken beliefs: the factual information “backfires.” When people don’t agree with you, presenting facts to support your case may actually make them believe you less.
In other words, fighting the ill-informed with facts is like fighting a grease fire with water. It seems like it should work, but it’s actually going to make things worse.
To study this, Brendan Nyhan and Jason Reifler (2010) conducted a series of experiments. Groups of participants read newspaper articles that included statements from politicians supporting some widespread piece of misinformation. For some participants, the article included corrective information immediately following the inaccurate statement; for others, the article contained no correction at all.
Afterward, participants answered a series of questions about the article and their personal opinions on the issue. Nyhan and Reifler found that responses to the factual corrections varied systematically with how ideologically committed readers already were to the beliefs the corrections challenged. Among those who believed the popular misinformation in the first place, being confronted with facts challenging those beliefs did not cause a change of opinion—in fact, it often strengthened those ideologically grounded beliefs.
It’s a sociological problem we ought to care about a great deal right now: how are we to correct misinformation if the very act of informing some people causes them to redouble their commitment to believing things that are not true?