Originally published March 21, 2013
In addition to being skeptical of the common claims associated with corrective exercise, I also strongly encourage everyone (fitness pros, rehab pros, and fitness enthusiasts) to be highly skeptical of the common claims associated with Complementary and Alternative Medicine practices, because by definition “alternative medicine” means “treatment interventions that have NOT been proven by (i.e. have failed) controlled scientific trials.”
Note: When a given treatment intervention proves itself in scientific testing, it becomes “medicine.” In other words, there really is no “alternative medicine”; there’s just medicine, and there’s everything else.
Now, this brings us to the reason that inspired me to write this Why Smart Trainers (and smart people in general) Believe Stupid Things series, which is…
Anytime I talk to smart Personal Trainers and Rehabilitation professionals about why they should be highly skeptical of the claims commonly associated with Motion Assessment Procedures, Corrective Exercise Interventions and Alternative Medicine Practices, they always come back with statements like:
“I’ve seen it work.” “I don’t care what the science says; it works for me, and it helps my clients/patients.”
“I’m convinced acupuncture works because I know plenty of people who’ve used it to cure all kinds of stuff.”
“I know these corrective exercise techniques work because I see it all the time. That’s all the evidence I need.”
Sometimes statements like the above are offered as justifications for the person’s own beliefs; at other times they are designed to “convince” the listener of some important truth. In either case, these statements represent a strong conviction that a particular belief is warranted in light of the evidence presented. Unfortunately, such evidence is hardly sufficient to warrant such beliefs, because social psychology research has repeatedly demonstrated that we’re all very bad at judging the evidence of our own experience, due to imperfections in our capacity to process information and draw accurate conclusions.
Put simply, just because something works in your experience in no way means that it actually works in reality. My goal with this “Why Smart Trainers Believe Stupid Things” series is to prove that to you by systematically increasing your understanding of how questionable beliefs and self-delusions are formed and maintained, while shedding some light on the study of human judgment and reasoning (i.e. social psychology).
False beliefs plague both experienced professionals and less informed people alike.
“From the greatest scientist to the humble artisan, every brain within every body is infested with preconceived notions and patterns of thought that lead it astray without the brain knowing it. So you’re in good company. No matter who your idols and mentors are, they too are prone to spurious speculation, erroneous beliefs and self-delusions.” – David McRaney
Contrary to popular misconception, people do NOT hold questionable beliefs simply because they are stupid or gullible. As you’ll soon discover from this “Why Smart Trainers Believe Stupid Things” series, false beliefs are NOT the products of irrationality, but of flawed rationality.
To kick off this unique series, we’re exploring the Bias Toward Positive Evidence.
The Bias Toward Positive Evidence
Put simply, the Bias Toward Positive Evidence is our innate tendency to “detect” relationships (between two variables) that are not there because we overvalue evidence that only confirms a given hypothesis.
Here’s the proof – take this quick test: The Wason Selection Task
Imagine a table with four cards on it, marked “A,” “D,” “4,” and “7.” Each card has a letter on one side and a number on the other. Your task is to determine whether all cards with a vowel on one side have an even number on the other.
Which two cards would you turn over? (I encourage you to take a moment to consider which cards should be turned over.)
As a game of logic, this should be a cinch for you to figure out. Yet when psychologist Peter Wason conducted this experiment, fewer than 10% of the people he asked got the correct answer.
So what was your answer? If you chose the “A” card and chose to turn over the “4” card as well, you are among the 90% of people whose minds get boggled by this task. That’s because these (the “A” and the “4”) are the cards that would only produce information consistent with the hypothesis you are supposed to be testing. But in fact, the cards you need to flip are the “A” and the “7,” because finding a vowel on the back of the “4” would tell you nothing about “all cards” – it would just confirm “some cards” – whereas finding a vowel on the back of the “7” would comprehensively disprove your hypothesis.
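As a sanity check, the selection logic can be sketched in a few lines of Python (my own illustration, not part of the original task): a card is worth flipping only if some possible hidden face could falsify the rule “every card with a vowel on one side has an even number on the other.”

```python
VOWELS = set("AEIOU")

def is_vowel(face):
    return face in VOWELS

def is_even_number(face):
    return face.isdigit() and int(face) % 2 == 0

def flip_can_falsify(visible):
    """A flip is informative only if some hidden face would disprove the rule."""
    if visible.isalpha():
        # A letter card can violate the rule only if the letter is a vowel
        # and the hidden number turns out to be odd.
        return is_vowel(visible)            # "A" can falsify; "D" cannot
    else:
        # A number card can violate the rule only if the number is odd
        # and the hidden letter turns out to be a vowel.
        return not is_even_number(visible)  # "7" can falsify; "4" cannot

cards = ["A", "D", "4", "7"]
print([c for c in cards if flip_can_falsify(c)])  # → ['A', '7']
```

Note that the “4” drops out for exactly the reason given above: whatever is on its back, the rule survives, so flipping it can only produce confirmation.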
This modest brainteaser clearly demonstrates that you don’t always appreciate the distinction between necessary and sufficient evidence, and that you tend to be overly impressed by data that, at best, only suggests that a belief may be true.
“Because people often fail to recognize that a particular belief rests on inadequate information, the belief enjoys an illusion of validity and is considered, not a matter of opinion or values, but a logical conclusion from objective evidence.” – Prof. Thomas Gilovich
In other words, you have a willingness to base your conclusions on incomplete information, which makes you highly vulnerable to developing false beliefs.
It should also be noted, as Prof. Thomas Gilovich points out, “this experiment is particularly informative because it makes it abundantly clear that the tendency to seek out information consistent with a hypothesis need not stem from any desire (i.e. emotional attachment to a given training/treatment method) for the hypothesis to be true. In this case, the people (and you) surely did not care whether all cards with vowels on one side had even numbers on the other; they sought information consistent with the hypothesis simply because it seemed to be the most relevant to the task at hand.”
The tendency (of our unchecked intuition) to treat positive instances as somehow more informative than disconfirmations can also be seen in the quotation below from John Holt:
“I was thinking of a number between 1 and 10,000. They still cling stubbornly to the idea that the only good answer is a ‘yes’ answer. If they say, ‘Is the number between 5,000 and 10,000?’ and I say yes, they cheer; if I say no, they groan, even though they get exactly the same amount of information in either case.”
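Holt’s point, that a “no” carries exactly as much information as a “yes,” can be verified with a short Python sketch (my own illustration): a question that splits the candidates in half yields precisely one bit of information either way.

```python
import math

# Number-guessing between 1 and 10,000. The question
# "Is the number between 5,001 and 10,000?" splits the candidates in half,
# so a "yes" and a "no" each carry exactly the same amount of information.
candidates = range(1, 10_001)
yes = [n for n in candidates if n > 5_000]   # numbers consistent with "yes"
no  = [n for n in candidates if n <= 5_000]  # numbers consistent with "no"

# Information gained = log2(before / after), in bits.
bits_yes = math.log2(len(candidates) / len(yes))
bits_no  = math.log2(len(candidates) / len(no))
print(bits_yes, bits_no)  # → 1.0 1.0
```

The students’ cheering at “yes” and groaning at “no” is exactly the bias toward positive evidence: both answers shrink the search space identically.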
The same bias in seeking out confirmatory information has been demonstrated in a number of investigations into the hypothesis-testing strategies people use in everyday social life.
In the most common procedure used in these social psychology experiments, participants are asked to determine whether someone is an “extrovert” by selecting a set of questions to ask the target from a list provided by the experimenter. Much of this research shows that most subjects asked questions for which a positive answer would confirm the hypothesis (e.g. “Do you like going to parties?”) rather than refute it.
According to Thomas Gilovich, a professor of psychology at Cornell University, “When trying to determine if a person is an extrovert, for example, people prefer to ask about the ways in which the target person is outgoing; when trying to determine if a person is an introvert, people are more inclined to ask about the ways in which the target is socially inert.”
Professor Gilovich goes on to state: “Although a tendency to ask such one-sided questions does not guarantee that the hypothesis will be confirmed, it can produce an erroneous sense of confirmation for a couple of reasons. First, the specific questions asked can sometimes be so constraining that only information consistent with the hypothesis is likely to be elicited. For example, in one widely cited study, one of the questions that the participants were fond of asking when trying to determine if a person was an extrovert was: ‘What would you do if you wanted to liven things up at a party?’ A question such as this one is clearly biased against disconfirmation: even the most inner-directed individual has been to a party or two and can at least discuss how to liven one up when explicitly asked to do so. By asking such constraining questions, it is difficult for anyone, including introverts, not to sound extroverted.

“Furthermore, even if such constraining questions are not asked, a tendency to ask confirmatory questions can still produce a spurious sense of confirmation if the likelihood of a positive response to the question is high whether or not the hypothesis is true. Suppose, for example, that you want to determine if an individual is introverted, and so you ask about a characteristic that might confirm your hypothesis: ‘Do you sometimes feel that it is hard for you to really let yourself go at a party?’ The person’s response is unlikely to be truly informative because most people, extroverts as well as introverts, would answer the same way: ‘Yes, sometimes it is hard to really let go.’”
Wait! There’s more…
We show a similar tendency to seek out hypothesis-confirming evidence when we interrogate information from our own memories for relevant evidence.
In this study, subjects first read a story about a woman who exemplified various introverted and extroverted behaviors and then were divided into two groups. One group was asked to consider the woman’s suitability for a job as a librarian (a job thought to demand introversion), while the other group was asked to consider her suitability for a job as a real estate agent (a job thought to demand extroversion). As part of their assessment, the participants were asked to recall examples of the woman’s introversion and extroversion. The particular job the woman was seeking strongly affected the evidence that the participants could recall: those asked to assess the woman’s suitability for an extroverted job recalled more examples of the woman’s extroversion, while the group considering her for the librarian job cited more examples of the woman’s introversion.
The Take Away Lessons for Fitness Professionals and Rehab Professionals:
– We have the tendency to draw firm (complete) conclusions from incomplete information because we seek out and overvalue confirmatory information for any given hypothesis.
– If 90% of people fail to understand the evidence required to truly prove the hypothesis in The Wason Selection Task, which gets the same results every time the experiment is performed, then it’s highly likely that 90% of fitness professionals and rehabilitation professionals are basing their beliefs about how well a given corrective exercise or treatment practice “works” on incomplete and insufficient evidence. This means that there’s a 90% chance that YOU are one of the individuals currently being misled by the evidence of your own experience. And that’s okay! Only by becoming aware of the proven fallibility of our everyday reasoning (like the bias toward positive evidence, along with the other flaws in judgment I’ll cover in future installments of this WSTBST series) can we recognize and overcome these tendencies.
– We do not adequately assess the validity of our hypotheses or beliefs because we do not fully utilize all of the information available to us. If we just seek to confirm that our chosen methods are working and neglect to attempt to disconfirm them (i.e. consider other explanations for why our clients/patients saw improvements, such as rest), any conclusions we draw about the cause and effect of our chosen corrective exercise/treatment methods rest on very shaky ground.
– The relationship one perceives between two variables (like a particular pain and an intervention method) can vary with the precise form of the question that is asked. We tend to ask our clients and patients “leading” questions that elicit information (i.e. an answer) likely to confirm our hypothesis, often giving us an erroneous sense of confirmation for the need to use our chosen corrective exercise/treatment methods.
– We tend to pay more attention to the ways in which the issues our clients and patients present with fit well within our chosen corrective/treatment methodologies than to the ways in which they differ. When testing a hypothesis of similarity, we look for evidence of similarity rather than dissimilarity, and when testing a hypothesis of dissimilarity, we do the opposite.
We’re Just Getting Started!
As I’ve just shown you, the bias toward positive evidence is one of many psychological reasons why the fact that “you’ve seen” a given corrective exercise or treatment intervention “work” “in your experience” in no way guarantees that it actually does.
In each installment of this series, as in this one, I will address one of the many cognitive illusions, failings of intuition, and inherent biases in the data upon which we base our beliefs, so you can recognize these psychological realities and overcome them in order to arrive at sound judgments and valid beliefs about training/treatment practices.
Coach Nick Tumminello is the owner of Performance University International, which provides hybrid strength training and conditioning for athletes and professional educational programs for trainers and coaches all over the world.
As an educator, Coach Nick has become known as the Trainer of Trainers. He has presented at international fitness conferences in Iceland, China, and Canada. He has been a featured presenter at conferences held by such organizations as IDEA, NSCA, DCAC, and ECA, along with teaching staff trainings at fitness clubs throughout the United States. Nick holds workshops and mentorship programs in his hometown of Fort Lauderdale, Florida. He has produced more than 15 instructional DVDs and is a CEC provider for ACE and NASM.
Nick has been a fitness professional since 1998 and co-owned a private training center in Baltimore, Maryland, from 2001 to 2011. He has worked with a variety of exercise enthusiasts of all ages and fitness levels, including physique and performance athletes from the amateur to the professional ranks. From 2002 to 2011, Nick served as the strength and conditioning coach for the Ground Control MMA fight team, and he is a consultant and expert for clothing and equipment companies such as Sorinex, Dynamax, Hylete, and Reebok.
Nick’s articles have appeared in over 30 major health and fitness magazines, including Men’s Health, Men’s Fitness, Oxygen, Muscle Mag, Fitness RX, Sweat RX, Status, Train Hard Fight Easy, Fighters Only, and Fight! Nick is also a featured contributor to several popular fitness training websites. He has been featured in two New York Times best-selling exercise books, on the front page of Yahoo.com and YouTube.com, and in the ACE Personal Trainer Manual, Fourth Edition. Nick’s new book, Strength Training for Fat Loss, will be available March 25th, 2014.