• In defense of bias

    If all you have is a hammer, everything looks like a nail.

    -Attributed to Bernard Baruch 

    Certain English words have acquired a negative valence just in the last few generations. Some familiar and related examples include “disinterested” (which used to mean one had no dog in the hunt, and now means “bored by”), “prejudice”, “bias”, and “discriminate”. Many of these have mutated from broad meanings to narrow ones concerning attitudes toward social groups such as races or sexes. This has had the inadvertent effect of muddying the waters and causing confusion about the broader definitions.

    I shall note the comedy in the fact that to “discriminate” is considered a bad thing, yet being “indiscriminate” is somehow also bad. This is just the muddying I mean. Discriminating against a member of a social group because of their group membership and no other reason is usually ethically wrong (but not always: social agencies designed to aid and support marginalized groups discriminate in the sense of not serving everyone). Discriminating per se isn’t bad. A top-10 university will not accept the C- students, and why should it? “Bias” is much the same. What is bias?

    According to Wikipedia, bias is an inclination of temperament or outlook to present or hold a partial perspective at the expense of (possibly equally valid) alternatives in reference to objects, people, or groups. Anything biased is generally one-sided and therefore lacks a neutral point of view. This all sounds pretty bad, and there are specific types which certainly are always bad. Social discrimination, synonymous with prejudice, is wrong. Statistical bias, where the sample of a scientific study is not representative of the population being measured, is likewise always a bad idea.
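    Statistical bias of this kind is easy to demonstrate. The following is a toy sketch in Python (all numbers invented for illustration): estimating how much a population exercises by surveying only people found at a gym produces a wildly inflated figure, while a uniformly drawn sample lands near the truth.

```python
import random

random.seed(0)

# Hypothetical population of 10,000 adults: weekly exercise hours.
# 20% are frequent exercisers (~6 h/week), 80% are not (~1 h/week).
population = ([random.gauss(6, 1) for _ in range(2000)] +
              [random.gauss(1, 0.5) for _ in range(8000)])

true_mean = sum(population) / len(population)

# Representative sample: draw uniformly from the whole population.
fair_sample = random.sample(population, 500)
fair_mean = sum(fair_sample) / len(fair_sample)

# Biased sample: survey only at the gym, i.e. people who already
# exercise more than 4 hours a week.
gym_goers = [h for h in population if h > 4]
biased_sample = random.sample(gym_goers, 500)
biased_mean = sum(biased_sample) / len(biased_sample)

print(f"population mean: {true_mean:.2f}")
print(f"fair sample:     {fair_mean:.2f}")
print(f"biased sample:   {biased_mean:.2f}")
```

    The fair sample tracks the population mean closely; the gym-only sample roughly triples it, no matter how large the sample is. More data cannot fix a biased sampling procedure.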

    Blinded me with science
    Among academics, bias is a four-letter word. Allegations of bias are serious and, if proven, damning. The problem here is that a lack of neutrality leads to errors of thought and of methodology. One might ignore obvious counter-hypotheses to a cherished one, for example. A biased researcher might be so sure of their conclusion that they fail to be stringent about their experiments or observations, internally justifying their sloppiness with “well, it makes no difference, the outcome would be the same whether the design were more stringent or not.” This is the sort of bias Neil deGrasse Tyson is talking about here.

    Tyson correctly identifies that science has machinery to remedy some of this sort of bias, namely peer review and replication. The problem here is how narrow that definition of bias is. The belief that peer review and replication are important is itself a bias. Read again the definition from above: anything biased is generally one-sided and therefore lacks a neutral point of view. When it comes to the importance of scientific self-corrective machinery, you will find that scientists have quite a one-sided point of view: it’s important and necessary. Bias is good.

    Discriminating against the indiscriminate
    Every idea beyond arcane philosophical first principles requires other ideas to be believed (or at least entertained, but usually they are just believed outright). You might like to go for a walk today and you believe you can, but only if you also believe that your doorknob has not disintegrated overnight (a belief about metal and physics), that there is still gravity (a belief about physics), that it is not 500 degrees outside (a belief about climate/weather), and that it is still lawful to take a walk (a belief about society/law). You can’t even begin to think about a topic without invoking, perhaps nonconsciously, many beliefs that are part of your world-view. This is even more true in science. Scientists hold a broad naturalistic paradigm full of implications about the world which cannot all be independently verified (if any can!). They also tend to have research programs with more specific strictures and beliefs, and on top of that they may have a theoretical model, based on the research program, which services the investigation of one or many specific hypotheses. Let’s say you study lemur foraging behaviors. You might have all of the following as part of your total paradigm:

    Scientific/naturalistic world view: The universe is composed of matter and energy, effects follow causes, and the animal-shaped bits of it obey the identified constants and laws.

    Research program: Evolutionary theory. Animals tend to evolve behaviors which make them better at producing heirs in a given environment.

    Theoretical model: Foraging primates will do much less nocturnal foraging when they share their home range with nocturnal predators.

    All of these are biases, although the third is likely rather weak as biases go. Unstated above are hundreds of beliefs assumed to be so completely correct that they are left unsaid: all of the lemurs have DNA, even ones not yet tested; magical entities are not affecting observations; lemurs need to eat to live, breathe air, and are composed of cells.

    Darwin believed that conditions observed today indicated things about the past, such as the shape of animals with respect to their environment. He believed in the old-earth account of the geologists of his time. He believed, as did virtually all naturalists of his day, that species somehow change over time. In other words, he had many biases: things he took to be true to the exclusion of other possibilities which could not be definitively ruled out. He needed to be a chauvinist, and if he hadn’t been, he would never have come up with natural selection.

    This is not mere word-play. We need to be narrow in our thinking sometimes. We have to assume things we can’t ourselves prove, so that we can prove other things. You can’t put a person on the moon unless you chauvinistically believe F = ma is always true. You can’t cure many diseases without total acceptance of germ theory, even if you’ve personally never done the experiments of Louis Pasteur or investigated alternatives (black magic, anyone?).

    In science, we use research programs and theoretical models as platforms that we can stand on to reach for new bits of understanding, new findings, perhaps new truths. Sometimes we find that bits are out of reach, and that we have to build a new platform which lets us grasp the same bits as the old one but extends our reach farther in other directions as well. That’s great, but the thing is, you can’t reach up and reach down at the same time. To stand on any such platform, you had better believe it will support you, at least for a time. If you don’t, you’ll never stand on it and reach for new ideas. Such beliefs, even if not absolute, are biases. Science could not exist without them.

    Short list of Western biases I am glad for (not necessarily exclusive to the West)

    • Reason is better than superstition
    • Science is the best knowledge enterprise on Earth
    • Medicine is a science (not religion or magic)
    • All humans have moral worth, and many classes need special protections (children, the elderly, minorities, etc.)
    • Animals should not unduly suffer at the hands of people
    • Democracy is better than other known forms of government, and all citizens have the right to participate
    • Liberty is an important virtue

    How shall we sort good from bad?
    A few kinds of bias are always bad for ethical reasons. Any belief that makes you treat a person as an undifferentiated member of some perceived group is just plain wrong and harmful. Any belief that you or your group has moral superiority over any other group based on purely demographic distinctions is wrong and harmful. With respect to other kinds of ideas, here are a few tips:

    • Modulate the strength of your belief based on the weight of the evidence, and not necessarily on what anyone else thinks the weight of the evidence is. In other words, it is safer to assume the world is spherical than to assume one of the many climate models is totally accurate
    • Submit to the “outsider test” (see John Loftus’s book). Would you accept the idea on the weight of the evidence if someone else suggested it?
    • Does it pay dividends? In science, a valid belief is more likely to lead to other valid findings. Conversely, if you constantly have to apologize for it or summon excuses, do not ignore that fact.
    • Never tie your ego’s wagon to a belief that could change tomorrow. And they could pretty much all change.

    Lastly, commit to ideals instead of any specific belief, no matter how strongly supported or obvious it seems. Even Newtonian mechanics, which seemed so complete and unimpeachable, eventually fell. Beliefs come and go, but ideals never tarnish. If you find your strength there, you will not falter.

    Of course, that’s just my bias.

    Category: Critical Thinking, Philosophy, Skepticism

  • Article by: Edward Clint

    Ed Clint is an evolutionary psychologist, co-founder of Skeptic Ink, and USAF veteran.