“Critical Thinking, Science, & Pseudoscience” is now available!

    [Book cover: Lack & Rousseau, Critical Thinking, Science, & Pseudoscience]
    My latest book – Critical Thinking, Science, & Pseudoscience: Why You Can’t Trust Your Brain – is now officially released! Co-written with Jacques Rousseau (University of Cape Town philosopher extraordinaire), it is based largely on the critical thinking courses that Jacques and I have been teaching at our respective universities. The book is designed to teach the reader how to separate sense from nonsense, how to think critically about claims both large and small, and how to be a better consumer of information in general.
    Although it is being marketed mostly to academic audiences, we have purposefully written it to be highly readable, entertaining, and great for anyone wanting to sharpen (or build from scratch) their critical faculties. From the back cover:

    This unique text for undergraduate courses teaches students to apply critical thinking skills across all academic disciplines by examining popular pseudoscientific claims through a multidisciplinary lens. Rather than merely focusing on critical thinking, the text incorporates the perspectives of psychology, biology, physics, medicine, and other disciplines to reinforce different categories of rational explanation. Accessible and engaging, it describes what critical thinking is, why it is important, and how to learn and apply skills that promote it. The text also examines why critical thinking can be difficult to engage in and explores the psychological and social reasons why people are drawn to and find credence in extraordinary claims.

    From alien abductions and psychic phenomena to strange creatures and unsupported alternative medical treatments, the text uses examples from a wide range of pseudoscientific fields and brings evidence from diverse disciplines to critically examine erroneous claims. Particularly timely is the text’s examination of how, using the narrative of today’s “culture wars,” religion and culture impact science. The authors focus on how the human brain, rife with natural biases, does not process information in a rational fashion, and the social factors that prevent individuals from gaining an unbiased, critical perspective on information. Authored by a psychologist and a philosopher who have extensive experience teaching and writing on critical thinking and skeptical inquiry, this work will help students to strengthen their skills in reasoning and debate, become intelligent consumers of research, and make well-informed choices as citizens.

    We have gotten some reviews in, and people certainly seem to like it. Michael Shermer (Publisher of Skeptic magazine, monthly columnist for Scientific American, and Presidential Fellow at Chapman University) has called it “the best collection of ideas on critical thinking and skepticism between two covers ever published” while Elizabeth Loftus wrote that it was “a valuable contribution to any reader who cares about truth.”
    The book is available in both paperback and ebook versions, and you can go ahead and order your copy now! To help whet your whistle (as it were), below is an excerpt from Chapter 5 of the book (and the source of our subtitle), Why We Can’t Trust Our Brains.

    The human brain is immensely complex, with an estimated 86 billion neurons and some 100 trillion synapses (Azevedo et al., 2009). It is amazingly resilient in the face of injury, able to reroute functions around damaged or purposefully removed areas to restore partial or full functioning. It can store an almost limitless amount of information for decades when healthy. However, even with these wondrous properties, the brain is quite easily fooled. This means that you need to seriously doubt… yourself!

    A few years ago, when several students and I (CWL) started a skeptical/freethought campus group, the Center for Inquiry sent a giant box of goodies. In it were materials to help the group in getting started: flyers, copies of Skeptical Inquirer and Free Inquiry, stickers, and more. One of the stickers was a play on the Tide detergent logo and slogan that said “DOUBT—For even your strongest beliefs!” I loved it (and, in full disclosure, kept that one and put it in my office, where it is to this day).

    Doubting is one of the hallmarks of an enlightened mind and key to being a good scientist and a good skeptic. It is not enough to only doubt others, which we all do. Instead, people must also be willing and able to doubt their own beliefs and convictions across all areas in life. Many people do not do this, though, and instead plow through their lives convinced that their beliefs and perceptions of the world are accurate and unimpaired. The purpose of this chapter is to teach you some of the many, many reasons why you cannot blindly trust your own brain and to prove that doubt (even of yourself) is not just a good thing, but a necessary thing.

    Many researchers in the psychological and other sciences have spent decades looking into specific ways that the brain can be fooled. This chapter will first focus on two broad ways in which our brains predictably prevent us from making logical, rational decisions. First, we will discuss the cognitive biases and mental heuristics that we all have, and how they often cause us to perceive a world that conforms to our preconceived notions about how things should be, rather than how they are. Next, we will look at how humans naturally misperceive and misevaluate the data we are exposed to, especially in the case of ambiguous information. The chapter will conclude by examining why we have these specific types of problems. In particular, we will look at the concept of bounded rationality, examining how our pattern-recognition abilities and our drive to find reasons for why things happen in the world around us have benefited our species, evolutionarily speaking, while also creating particular problems with evaluating information. We will also examine how using mental shortcuts can help us be more cognitively efficient, but can again result in biases when we are presented with new or inconsistent information.

    The Logically Illogical Brain

    Thanks to the hard work of hundreds of researchers over the past half century, we have developed an ever-increasing understanding of the myriad ways that humans do not act rationally or make optimal decisions even when presented with straightforward information and data. Instead, we often act and think in an understandable but irrational manner – what we are calling “logically illogical.” Exploring the various ways in which we can make poor decisions can then, in turn, lead us to understand why doubting yourself is crucial to becoming a critical thinker.

    Two of the largest factors influencing why you should doubt yourself frequently and thoroughly are cognitive biases and mental heuristics. Since Kahneman and Tversky’s (1972) landmark article on how we, as humans, make non-rational decisions, the research on these factors has grown immensely. Cognitive biases are predictable patterns of deviation in judgment that occur in specific situations and can cause inaccurate interpretation or perception of information. In other words, cognitive biases are regularly occurring ways in which our brains misinterpret evidence, and they impair our ability to make accurate, logical, evidence-based decisions on a consistent basis. Recognizing cognitive biases helps us to realize that our intuitive understanding of the world is often (but not always) distorted in some way. Once you understand that you cannot always trust your own judgment, and therefore need to apply doubt to yourself, you are on your way to overcoming some of these biases. Luckily for us, these are “predictably irrational” (Ariely, 2008) problems, and so, by becoming aware of them on a conscious level, we are able to combat their influence.

    Heuristics, in contrast, are mental shortcuts or rules of thumb that significantly decrease the mental effort required to solve problems or make decisions (Kahneman, Slovic, & Tversky, 1982). Unfortunately, heuristics often lead to an oversimplification of reality that can cause us to make systematic errors, which can then become cognitive biases. It is critical to note that just because we make the mistakes outlined in this chapter (and boy do we make them), this does not mean that humans are terrible decision makers all the time. In fact, in many situations it is actually adaptive for us to use heuristics and mental shortcuts (something we will come back to at the end of the chapter). Indeed, ignoring information can actually improve one’s judgment in specific situations. In their review of a massive amount of literature, Gigerenzer and Gaissmaier (2011) note two main impacts of heuristics on decision making.

    First is a positive benefit called the “less-is-more” effect: heuristics sometimes produce results that are as accurate as, or even more accurate than, more involved strategies, despite processing less information. Second is what is called ecological rationality: heuristics are not inherently good or bad; rather, their accuracy depends on the environment in which they are used. Over time, people (generally) learn which heuristics are most useful to them in their particular environment. Importantly, real-world decision making frequently relies on heuristics not because they are necessarily the best way to make decisions, but because we rarely have access to all of the information needed to make a perfect decision. Heuristics allow us to take a mental shortcut and arrive at a “good enough” decision.

    Although there is not a “standard list” of cognitive biases and heuristics (and in fact quite literally hundreds have been identified), the ones described below are some of the most well-researched and common biases/heuristics we encounter as humans.

    • Confirmation bias
    • Belief perseverance
    • Hindsight bias
    • Representativeness heuristic
    • Availability heuristic
    • Anchoring and adjustment heuristics

    For instructors interested in using the book in their classes, we have also constructed full lecture slides and an instructor’s guide with sample assignments, recommended videos, and more. Feel free to let our publisher know if you’d like to be considered for an adoption copy.

    Article by: Caleb Lack

    Caleb Lack is the author of "Great Plains Skeptic" on SIN, as well as a clinical psychologist, professor, and researcher. His website contains many more exciting details; visit it at www.caleblack.com