Friday, November 15, 2013

The Critical Thinker's Dictionary

The Critical Thinker's Dictionary: Biases, Fallacies, and Illusions and What You Can Do About Them is now available from Amazon, Kobo, and Barnes & Noble as an e-book and from Lulu as a paperback. More information about ordering is available on the book's page.

The Critical Thinker's Dictionary grew out of a suggestion made by Harriet Hall, M.D., in a review of my book Unnatural Acts: Critical Thinking, Skepticism, and Science Exposed! That book concluded with a chapter that advised the reader to study 59 cognitive biases, fallacies, and illusions that I briefly described. This blog was set up with the goal of posting expansions of those descriptions. So, every Monday for 59 weeks I tackled one of the biases, illusions, or fallacies and posted the result here. Those posts have been rewritten, and a few more topics have been added, to produce The Critical Thinker's Dictionary.
A guiding principle of Unnatural Acts and The Critical Thinker's Dictionary is that critical thinking does not come naturally. Not only must we work at becoming critical thinkers; doing so goes against our nature. Evolution has provided our species with a magnificent brain, capable of extraordinary things like self-consciousness, memory, facial recognition, and thousands of other "miracles." But we evolved to think quickly, a necessity in the environments our species found itself in during most of its 100,000-year history. There are times in our modern world when quick thinking is needed, but there are also many times when we should slow things down. Sometimes we are better off if, instead of relying on our instinctive, natural way of thinking about things, we take some time to do some research, to reflect, and to discuss before making a judgment.
'Know Thyself' advised the ancient Greek sages at a time when philosophers defined us as rational animals. Rationality was thought of as an ideal largely achievable by controlling the emotions and avoiding logical fallacies. Today, we know better. Biology and neuroscience have exposed the brain as a great deceiver. Unconscious biases drive us to believe and do things that the conscious mind explains in self-serving stories, making us appear more rational to ourselves than we really are. Modern science has taught us that rationality involves much more than just controlling the emotions and avoiding fallacies. Today's rational animal, what we call the critical thinker, must understand the unconscious biases that are directing many of our most important judgments and decisions. The Critical Thinker's Dictionary explores the insights of ancient and modern philosophers along with the latest findings in such fields as neuroscience and behavioral economics to lay out the many obstacles and snares that await anyone committed to a rational life. The Critical Thinker's Dictionary isn't a collection of dry definitions, but a colorful, three-dimensional portrait of the major obstacles to critical thinking and what we can do to overcome them.

Monday, February 4, 2013

the wisdom of not thinking too much

This will be the last blog post for Unnatural Acts that can improve your thinking. Instead of introducing another cognitive bias or logical fallacy, this final post will be devoted to considering when wisdom requires that we stop thinking altogether or that we stop gathering data to reflect on.

Monday, January 28, 2013

change blindness

Change blindness is the failure to detect non-trivial changes in the visual field. The failure to see things changing right before your eyes may seem like a design fault, but it is actually a sign of evolutionary efficiency.

Demonstrations of change blindness are easy to find online.

The term 'change blindness' was introduced by Ronald Rensink in 1997, although research in this area had been going on for many years. Experiments have shown that dramatic changes in the visual field often go unnoticed, whether they are brought in gradually, flickered in and out, or abruptly brought in and out at various time intervals. The implication seems to be that the brain requires few details for our visual representations; it doesn't store dozens of details against which it can compare changes (Simons and Levin 1998). The brain is not a video recorder; it does not constantly process all the sense data available to it but is inattentive to much of that data, at least on a conscious level.

Monday, January 21, 2013

bias blind spot

The bias blind spot was described by Princeton University psychologist Emily Pronin and her colleagues (2002) as the tendency to perceive cognitive and motivational biases much more in others than in oneself. The bias blind spot is a metabias since it refers to a pattern of inaccurate judgment in reasoning about cognitive biases.

Monday, January 14, 2013

suppressed evidence

A cogent argument presents all the relevant evidence. An argument that omits relevant evidence appears stronger and more cogent than it is.

The fallacy of suppressed evidence occurs when an arguer intentionally omits relevant data. This is a difficult fallacy to detect because we often have no way of knowing that we haven't been told the whole truth.

Many advertisements commit this fallacy. Ads inform us of a product's dangers only if required to do so by law. Ads never state that a competitor's product is equally good. The coal, asbestos, nuclear, and tobacco industries have knowingly suppressed evidence regarding the health of their employees or the health hazards of their industries and products.

Monday, January 7, 2013

anecdotal evidence (testimonials)

Testimonials and anecdotes are used to support claims in many fields. Advertisers often rely on testimonials to persuade consumers of the effectiveness or value of their products or services. Others use anecdotes to drive home the horror of some alleged activity or the danger of widely used electronic devices like cell phones. In the late 1980s and early 1990s, there were many people, some in law enforcement, claiming that Satanists were abducting and abusing children on a massive scale. The anecdotes involved vivid descriptions of horrible sexual abuse, even the murder of innocent children. The anecdotes were quite convincing, especially when they were repeated on nationally televised programs with popular hosts like Geraldo Rivera. A four-year study in the early 1990s found the allegations of satanic ritual abuse to be without merit. Researchers investigated more than 12,000 accusations and surveyed more than 11,000 psychiatric, social service, and law enforcement personnel. They could find no unequivocal evidence for a single case of satanic cult ritual abuse.

Monday, December 31, 2012

attribution biases

Human behavior can be understood as issuing from "internal" factors or personal characteristics (such as motives, intentions, or personality traits) and from "external" factors (such as the physical or social environment and other circumstances deemed outside one's personal control). Self-serving creatures that we are, we tend to attribute our own successes to our intelligence, knowledge, skill, perseverance, and other positive personal traits. Our failures are blamed on bad luck, sabotage by others, a lost lucky charm, and other such things. These attribution biases are referred to as the dispositional attribution bias and the situational attribution bias. They are applied in reverse when we try to explain the actions of others: others succeed because they're lucky or have connections, and they fail because they're stupid, wicked, or lazy.