Gilbert's thesis, grounded in social and cognitive psychological research, builds on the work of the 17th-century philosopher Baruch Spinoza, who argued that accepting an idea is part of automatically comprehending it. This ran counter to the thinking of René Descartes, who held that we first understand an idea and only then decide whether to accept or reject it, a process most of us might like to imagine we go through when working out what we believe but which, according to Gilbert, is not borne out by the evidence.
"People are especially prone to accept as true the things they see and hear," Gilbert writes. "People are Spinozan systems that, faced with shortages of time, energy or conclusive evidence, may fail to unaccept the ideas that they involuntarily accept during comprehension."
Gilbert argues that "as perception construes objects, so cognition construes ideas", and thus initially "believes" them. "In both cases, the representation of a stimulus is believed — that is, empowered to guide behaviour as if it were true — prior to a rational analysis of the representation's accuracy."
If Gilbert is right, it is little wonder that, in an economy that depends upon flooding us with information reinforcing our pre-existing beliefs, we are finding that our resources are often too drained to make the effort of "unbelieving" what we are fed.
"The challenge in the digital age . . . is that people are generally disinclined to devote their overtaxed attention to anyone with whom they disagree, or any perspective that makes them in the least uncomfortable," says Romin Tafarodi, who co-authored another paper with Gilbert, published in 1990, that discussed the same ideas.
In the later essay, Gilbert points out that "a system that believes its representations prior to assessing them . . . can only work if those representations are largely accurate".
We live in a world in which algorithms are deliberately set up to feed us content that tells us what we already think, and in which political campaigns intentionally flood our timelines with false and misleading information. It is crucial that we find a way of incentivising the effortful process of exposing ourselves to opposing viewpoints, and of sometimes "unbelieving" our own side.
- Financial Times