Illusory truth effect

The illusory truth effect (also known as the truth effect, the illusion-of-truth effect, the reiteration effect, the validity effect, the availability cascade, and the frequency-validity relationship) is the tendency to believe information to be correct after repeated exposure. One science writer has explained it as follows: "Why are so many people convinced that we only use 10% of our brains, or that Eskimos have n words for snow...?" The answer is the truth effect.

This phenomenon was first discovered in 1977 at Villanova University and Temple University. It has in recent years been equated by some researchers with the concept of "truthiness", a term coined by American comedian Stephen Colbert.

Initial study

The effect was first named and defined following the results of a study published in 1977. On three occasions, Lynn Hasher, David Goldstein, and Thomas Toppino presented the same group of college students with lists of sixty plausible statements, some true and some false. The second list was distributed two weeks after the first, and the third two weeks after that. Twenty statements appeared on all three lists; the other forty items on each list were unique to that list. Participants were asked how confident they were of the truth or falsity of the statements, which concerned matters about which they were unlikely to know anything for certain. (For example, "The first air force base was launched in New Mexico." Or "Basketball became an Olympic discipline in 1925.") Specifically, the participants were asked to rate their belief in the truth of each statement on a scale of one to seven. While the participants' confidence in the truth of the non-repeated statements remained steady, their confidence in the truth of the repeated statements increased from the first to the second session and from the second to the third, with the average rating for those items rising from 4.2 to 4.6 to 4.7. The researchers, who were from Villanova and Temple universities, concluded that repeating a statement makes it appear more likely to be factual.

In 1989, Hal R. Arkes, Catherine Hackett, and Larry Boehm essentially replicated the original study, with similar results; their replication was published in Europe's Journal of Psychology.

Processing fluency

At first, the truth effect was believed to occur only when individuals were highly uncertain about a given statement.

This assumption was challenged by the results of a 2015 study by Lisa K. Fazio, Nadia M. Brashier, B. Keith Payne, and Elizabeth J. Marsh. Published in the Journal of Experimental Psychology, the study suggested that the truth effect can influence even participants who knew the correct answer to begin with but who were swayed to believe otherwise by the repetition of a falsehood. For example, when participants encountered on multiple occasions the statement "A sari is the name of the short pleated skirt worn by Scots," some of them came to believe it was true, even though these same people were able to correctly answer the question "What is the name of the short pleated skirt worn by Scots?"

After replicating these results in another experiment, Fazio and her team attributed this curious phenomenon to "processing fluency", a term describing the ease with which people comprehend statements. "Repetition," the researchers explained, "makes statements easier to process (i.e. fluent) relative to new statements, leading people to the (sometimes) false conclusion that they are more truthful."

Hindsight bias

In a 1997 study, Ralph Hertwig, Gerd Gigerenzer, and Ulrich Hoffrage linked the truth effect to the phenomenon known as "hindsight bias", described as a situation in which the "recollection of confidence is systematically distorted after feedback about the actual truth or falsity has been received".

Although the truth effect has been demonstrated scientifically only in recent years, it is a phenomenon with which people have been familiar for millennia. One study notes that the Roman statesman Cato closed each of his speeches with a call to destroy Carthage ("Ceterum censeo Carthaginem esse delendam"), knowing that the repetition would breed agreement, and that Napoleon reportedly "said that there is only one figure in rhetoric of serious importance, namely, repetition", whereby a repeated affirmation fixes itself in the mind "in such a way that it is accepted in the end as a demonstrated truth". Others who have taken advantage of the truth effect have included Quintilian, Ronald Reagan, and Marcus Antonius in Shakespeare's Julius Caesar.

Hertwig, Gigerenzer, and Hoffrage have described the truth effect (which they call "the reiteration effect") as a subset of hindsight bias, and look forward to "a theoretical integration of findings in human confidence", including the truth effect and such other phenomena as overconfidence bias and the hard–easy effect.

Other studies

In a 1979 study, participants were told that repeated statements were no more likely to be true than unrepeated ones. Despite this warning, the participants perceived repeated statements as being truer than unrepeated ones.

Studies in 1981 and 1983 showed that information deriving from recent experience tends to be viewed as "more fluent and familiar" than newly encountered information. A 2011 study by Jason D. Ozubko and Jonathan Fugelsang built on this finding by demonstrating that, generally speaking, information retrieved from memory is "more fluent or familiar than when it was first learned" and thus produces "an illusion of truth". The effect grew more pronounced when statements were repeated twice, and more pronounced still when they were repeated four times. The researchers thus concluded that "memory retrieval is a powerful method for increasing the perceived validity of statements (and subsequent illusion of truth) and that the illusion of truth is a robust effect that can be observed even without directly polling the factual statements in question."

A 1992 study by Ian Maynard Begg, Ann Anas, and Suzanne Farinacci suggested that "a statement will seem true if it expresses information that feels familiar".

A 2012 experiment by Danielle C. Polage showed that some participants exposed to false news stories would go on to have false memories. The conclusion was that "repeating false claims will not only increase their believability but may also result in source monitoring errors".

In a 2014 study, Eryn J. Newman, Mevagh Sanson, Emily K. Miller, Adele Quigley-McBride, Jeffrey L. Foster, Daniel M. Bernstein, and Maryanne Garry asked participants to judge the truth of statements attributed to various people, some of whose names were easier to pronounce than others. Consistently, statements attributed to persons with easily pronounced names were viewed as more truthful than those attributed to persons with names that were harder to pronounce. The researchers concluded that "subjective, tangential properties such as ease of processing can matter when people evaluate information attributed to a source".

Examples

The truth effect plays a significant role in various fields of activity. During election campaigns, false information about a candidate, if repeated in TV commercials, can cause the public to believe it. Similarly, advertising that repeats unfounded claims about a product may boost sales because some viewers may come to think that they heard the claims from an objective source.

Examples of the truth effect can be found everywhere. A kayaking expert has pointed out that it is widely accepted as fact that, when kayaking on the ocean or the Great Lakes, one should use a kayak at least 16 feet long. But this is not true; the best length for a kayak depends on a variety of factors.
