But here's the more concerning news. Our access to information, both good and bad, has only increased as our fingertips have gotten into the act. With computer keyboards and smartphones, we now have access to an Internet containing a vast store of information much bigger than any individual brain can carry - and that's not always a good thing.
Better access doesn't mean better information
This access to the Internet's far reaches should permit us to be smarter and better informed. People certainly assume it does. Yet a recent Yale study showed that Internet access leads people to hold inflated, illusory impressions of just how smart and well-informed they are.
But there's a twofold problem with the Internet that compromises its limitless promise.
First, just like our brains, it is receptive to misinformation. In fact, the World Economic Forum lists "massive digital misinformation" as a main threat to society. A survey of 50 "weight loss" websites found that only three provided sound diet advice. Another survey, of roughly 150 YouTube videos about vaccination, found that only half explicitly supported the procedure.
Rumor-mongers, politicians, vested interests, a sensationalizing media and people with intellectual axes to grind all inject false information into the Internet.
So do a lot of well-intentioned but misinformed people. In fact, a study published in the January 2016 Proceedings of the National Academy of Sciences documented just how quickly dubious conspiracy theories spread across the Internet. Specifically, the researchers compared how quickly these rumors spread across Facebook relative to stories on scientific discoveries. Both conspiracy theories and scientific news spread quickly, with the majority of diffusion via Facebook for both types of stories happening within a day.
Making matters worse, misinformation is hard to distinguish from accurate fact. It often has the same look and feel as the truth. In a series of studies Elanor Williams, Justin Kruger and I published in the Journal of Personality and Social Psychology in 2013, we asked students to solve problems in intuitive physics, logic and finance. Those who consistently relied on false facts or principles - and thus gave the exact same wrong answer to every problem - expressed just as much confidence in their conclusions as those who answered every single problem right.
For example, those who always thought a ball would continue to follow a curved path after rolling out of a bent tube (not true) were virtually as certain as people who knew the right answer (the ball follows a straight path).
Defend yourself
So, how do we separate Internet truth from falsehood?
First, don't assume misinformation is obviously distinguishable from true information. Be careful. If the matter is important, perhaps you can start your search with the Internet; just don't end there. Consult and consider other sources of authority. There is a reason why your doctor suffered through medical school, and why your financial advisor studied to earn that license.
Second, don't do what the conspiracy theorists did in the Facebook study. They readily spread stories that already fit their worldview. As such, they practiced confirmation bias, giving credence to evidence that supported what they already believed. As a consequence, the conspiracy theories they endorsed burrowed themselves into like-minded Facebook communities that rarely questioned their authenticity.
Instead, be a skeptic. Psychological research shows that groups that designate one or two of their members to play devil's advocate - questioning whatever conclusion the group is leaning toward - make better-reasoned decisions.
If no one else is around, it pays to be your own devil's advocate. Don't just believe what the Internet has to say; question it. Practice a disconfirmation bias. If you're looking up medical information about a health problem, don't stop at the first diagnosis that looks right. Search for alternative possibilities.
Seeking evidence to the contrary
In addition, look for ways in which that diagnosis might be wrong. Research shows that "considering the opposite" - actively asking how a conclusion might be wrong - is a valuable exercise for reducing unwarranted faith in a conclusion.
After all, you should listen to Mark Twain, who, according to a dozen different websites, warned us, "Be careful about reading health books. You may die of a misprint."
Wise words, except that a little more investigation turns up more detailed and better-researched sources with evidence that it wasn't Mark Twain, but the German physician Markus Herz, who said them. I'm not surprised; in my Internet experience, I've learned to be wary of Twain quotes (Will Rogers, too). He was a brilliant wit, but he gets much too much credit for quotable quips.
Misinformation and true information often look awfully alike. The key to an informed life may not require gathering information as much as it does challenging the ideas you already have or have recently encountered. This may be an unpleasant task, and an unending one, but it is the best way to ensure that your brainy intellectual tapestry sports only true colors.
The Conversation
David Dunning is a professor of psychology at the University of Michigan.