
Article: The Gospel According to Dixon #14: Onward Through the Fog!



Dixon
12-21-2012, 11:54 AM
by Dixon Wragg
WaccoBB.net


Column #14: Onward Through the Fog!


In my previous column (https://www.waccobb.net/forums/showthread.php?89817-Article-The-Gospel-According-to-Dixon-13-The-Tangled-Web-of-Lies) (1), we saw the pervasiveness of various kinds of lies in our lives, and I suggested that our habitual level of certainty should therefore be ratcheted way down. But, it gets worse! There are a number of other sources of "certainty" in our lives which, seen more clearly, warts and all, will inspire uncertainty.


Tradition, for instance, tremendously informs human culture and individual values in every society. It could hardly be otherwise; it is the main mode whereby we transmit from one generation to the next the understandings and skills which allow survival and even progress. Without traditions, each generation would be starting from square one.

But traditions, as products of human thinking, are subject to fallacy—especially since most of them represent efforts to figure the world out from long before we developed systematic methods for doing so. They give us not only useful truths, but also superstition, ethnocentrism, bigotries, and the closed-mindedness necessary to "keep the faith" when the "faith" in question is irrational. Religious traditions are especially problematic when they posit a supernatural source of absolute certainty, questioning of which is punishable by anything from shame to ostracism to death.

The reasonable response is not to rebel automatically against all tradition like some oppositional adolescent; knee-jerk rebellion just enslaves us in the opposite direction. Instead we must critique the claims of any tradition individually, accepting the wheat and rejecting the chaff, regardless of social pressures and emotional reactions in either direction. This is part of being a responsible adult. Tradition is never an acceptable excuse for brutality or injustice.

A related source of illusion is conformity. As social animals who need acceptance almost as much as we need oxygen, we readily kid ourselves that the way those around us see the world (or seem to) is the truth. Most of this occurs unconsciously, so that false beliefs are foundational in our world-views, having never even passed through the conscious filter of "true or false?" scrutiny. Take, for instance, the presumed basic righteousness of our society and the corresponding presumed evil nature of those opposing us. When such a belief is reinforced by the powerful psychosocial forces associated with what the critical thinking community refers to as "egocentric and sociocentric thinking", correcting it is nearly impossible.

Even when a belief passes through the filter of skeptical consciousness rather than being unconsciously absorbed from the social milieu, our human tendency to conform drives us. Consider internalized self-disgust in oppressed groups, the fervent embracing by many women of a secondary "helpmeet" role, the common acceptance of the "right" of royalty (whether of the literal or the Wall Street variety) to profit obscenely at our expense, or the self-suppression of dissent which can even lead to the accepted belief supplanting the dissenting one in the dissenter's mind. In my experience, questioning acceptable beliefs and proposing socially forbidden ones triggers extreme emotional distress and often unpleasant defense mechanisms. Thus safe illusion, including illusory certainty, triumphs over unpleasant truth.

Another source of appropriate uncertainty is the considerable degree to which our beliefs are based on logical fallacy. I've mentioned a few fallacies, such as the confirmation bias and acceptance of anecdotal evidence, in previous columns, but there are many more, and the bad news is that we are ALL subject to them (yes, this means YOU)! Being wrong is universally part of the human condition. The best we can hope for is to steadily increase our grasp of truth and decrease the degree to which we're living in a fantasy world. And that is why I'm writing this column on the basics of critical thinking.

Right about now, some of you may be starting to feel the creeping dread that accompanies an accurate understanding of the huge degree to which we're living in darkness. Perhaps you're saying to yourself "At least we can depend on experts!" Well—yes and no.

Yes, on average, experts know more about their area of expertise than do you and I. But, especially in some fields, they disagree as much as they agree, and a lot of what passes for expertise is about convincing the customer that the expert knows more than he or she really does, and with more certainty than is appropriate. Experts commonly conduct sloppy research, suppress disconfirming data and leap to unwarranted conclusions (2)—then sell the dubious results to us for top dollar.

George Bernard Shaw said it over a century ago: "...(T)he medical profession is a conspiracy to hide its own shortcomings. No doubt the same may be said of all professions. They are all conspiracies against the laity; and I do not suggest that the medical conspiracy is either better or worse than the military conspiracy, the legal conspiracy, the sacerdotal conspiracy, the pedagogic conspiracy, the royal and aristocratic conspiracy, the literary and artistic conspiracy, and the innumerable industrial, commercial, and financial conspiracies...". (3 (https://www.online-literature.com/george_bernard_shaw/doctors-dilemma/0/))

Well then, what of the class of experts known as scientists? More than anyone else, seeking the truth is their stock in trade. They're well-trained in separating the wheat from the chaff and they spend astonishing amounts of time, energy and money doing just that. Indeed, science is the best system for discerning the nature of objective reality—but in practice it's way more fallible than the ideal.

The main problem with science is scientists. Being human, they're affected more than either they or we would like to think by common sources of distortion such as ego, conflict of interest, preconceptions, conformity, ethnocentrism, closed-mindedness, and the universal logical fallacies. Briefly, some examples:

The well-known Rorschach inkblot test has been one of the most widely used projective tests for the better part of a century, despite the fact that, especially in earlier years, there was essentially no evidence for its validity or reliability. Decade after decade passed with over two thousand research papers failing to support it, yet psychologists continued to make money administering and interpreting it, often stigmatizing patients with diagnostic labels based largely on its results. (4) With the advent of the Exner scoring method in the 1960s there was finally a bit of evidence that the Rorschach might have some limited validity when interpreted by that method, but it continues to be widely used for purposes it has never been validated for, and many practitioners don't even use the Exner method.

Indeed, when I studied psychotherapy in graduate school, I discovered that most practitioners have little regard for scientific validation and even less knowledge about what constitutes good evidence. Instead they choose a school of psychotherapy with all of its attendant assumptions and techniques in much the same way people choose a religion—emotionally and randomly rather than rationally, and then apply it to everyone who walks through their door. Psychological "experts" continue to make money from the Rorschach and other dubious tests, teachings and techniques. The public pays.

The wildly profitable pharmaceutical industry provides some of the most glaring examples of financial conflict of interest poisoning the objectivity of research with life-or-death consequences. One overview of studies revealed that when "Big Pharma" companies test new drugs, the results are more than five times as likely to be positive as when independent researchers do the testing. Former New England Journal of Medicine editor Marcia Angell writes: "Consider three examples I’ve written about before: first, in a survey of 200 expert panels that issued practice guidelines, one third of the panel members acknowledged that they had some financial interest in the drugs they assessed. Second, in 2004, after the NIH National Cholesterol Education Program called for sharply lowering the acceptable levels of “bad” cholesterol, it was revealed that eight of nine members of the panel writing the recommendations had financial ties to the makers of cholesterol-lowering drugs. Third, of the 170 contributors to the most recent edition of the American Psychiatric Association’s Diagnostic and Statistical Manual of Mental Disorders (DSM-IV), 95 had financial ties to drug companies, including all of the contributors to the sections on mood disorders and schizophrenia." (5 (https://bostonreview.net/BR35.3/angell.php)) Indeed, such conflict of interest is so pervasive in the pharmaceutical industry (among others) that it would be reasonable to say that it is typical.

The gold standard of scientific research is publication in a reputable peer reviewed or "refereed" scientific journal. "Peer review" in this context means that research must be scrutinized and endorsed by recognized experts in the field of study before being accepted for publication. This is probably the best way to filter out the constant torrent of pseudoscience and sloppy science, but there's a growing body of data showing that the level of bias among journal referees is disturbingly high. The issue is well summarized by one who should know: Richard Horton, editor of the British medical journal The Lancet: "We portray peer review to the public as a quasi-sacred process that helps to make science our most objective truth teller. But we know that the system of peer review is biased, unjust, unaccountable, incomplete, easily fixed, often insulting, usually ignorant, occasionally foolish, and frequently wrong."(6 (https://www.mja.com.au/journal/2000/172/4/genetically-modified-food-consternation-confusion-and-crack))

On top of all that, even if we miraculously navigate past the shoals of ego, greed, and bias and somehow do science absolutely perfectly, which does happen, our results are often going to be wrong anyway! This is because science is about inferring general understandings from specific observations, and every once in a while it will turn out that the specific examples observed were exceptional, rather than typical, leading to false conclusions when generalized.

For instance, suppose you wanted to empirically investigate the probability of getting heads when tossing a fair coin. So you toss the coin once and get heads. Generalizing from that one sample yields a probability of 100% for heads (and therefore 0% for tails), but you know better than to generalize from such a small sample, so you throw ten times, yielding six heads—a 60% probability for heads. Knowing that even ten trials is too few, you toss the coin 100 times, and now you have 53 tails and 47 heads. Will you conclude that the probability is 53% to 47% in favor of tails? Usually, the more trials, the closer the outcome will approximate the real probability of 50%. But the random nature of reality is such that it's possible to do lots of coin tosses (or observations of anything) and still get misleading results.
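The coin-toss thought experiment above is easy to try for yourself in a few lines of Python (a minimal sketch; the function name is mine, not from the column). Run it with different seeds and sample sizes and you'll see exactly the pattern described: tiny samples give wildly wrong estimates, and even big samples only approximate the true 50%.

```python
import random

def estimate_heads_probability(n_tosses, seed=None):
    """Estimate P(heads) for a fair coin by simulating n_tosses tosses."""
    rng = random.Random(seed)
    # Each toss: rng.random() < 0.5 is True (heads) half the time on average.
    heads = sum(rng.random() < 0.5 for _ in range(n_tosses))
    return heads / n_tosses

# Watch the estimate wander toward 0.5 as the sample grows.
for n in (1, 10, 100, 10_000):
    print(n, estimate_heads_probability(n, seed=42))
```

Note that even at 10,000 tosses the estimate rarely lands on exactly 0.5; the sampling error shrinks roughly with the square root of the number of trials but never vanishes.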

The more complex an experiment, the more factors that can vary randomly and the more likely that variance will yield the wrong answer. And, in a world wherein thousands of studies are being conducted at any given time, the number of false conclusions is dismayingly high. That's why research write-ups typically include a figure, called a "confidence level", representing how confident the scientists are that chance alone didn't produce their result. For instance, a 95% confidence level, which is typical, is basically saying "Even if we did everything exactly right, there's still about a 5% (1 in 20) chance that random variation alone produced a result like ours."

But the confidence level is a very low estimate of the likelihood of being mistaken, because it assumes a perfectly designed, conducted and interpreted study, not taking into account distorting factors such as those I've mentioned. In fact, the conclusions of two out of three published medical studies turn out to be wrong (7) —and that's a better average than the vast majority of claims, which aren't held to the standards of medical journals!

Bottom line: even the best sources of knowledge are a lot more fallible than we like to think, so we all live within substantially illusory world-views in spite of our efforts to see clearly. This doesn't invalidate the effort to separate the wheat from the chaff, but it should inspire appropriate humility about our success at doing so.

Onward through the fog!

NOTES

1. Dixon Wragg, The Gospel According to Dixon #13: The Tangled Web of Lies, https://www.waccobb.net/forums/showthread.php?89817-Article-The-Gospel-According-to-Dixon-13-The-Tangled-Web-of-Lies (Oct. 10, 2010).

2. David H. Voelker, "How Much Does Being Right Matter?" (review of Wrong: Why Experts Keep Failing Us—And How to Know When Not to Trust Them, by David H. Freedman), Skeptic, Vol. 16 No. 4, 2011, 62.

3. George Bernard Shaw, The Doctor's Dilemma: Preface on Doctors, The Literature Network, https://www.online-literature.com/george_bernard_shaw/doctors-dilemma/0/ (Oct. 10, 2010).

4. For this and other examples, see Robyn M. Dawes, House of Cards: Psychology and Psychotherapy Built on Myth (New York: The Free Press, 1994).

5. Marcia Angell, Big Pharma, Bad Medicine, Boston Review, https://bostonreview.net/BR35.3/angell.php (Oct. 10, 2010).

6. Richard Horton, Genetically modified food: consternation, confusion, and crack-up, The Medical Journal of Australia, https://www.mja.com.au/journal/2000/172/4/genetically-modified-food-consternation-confusion-and-crack (Oct. 10, 2010).

7. Voelker, 62. Of course, the accuracy of this number depends on the research that yielded it being correct!

8. Thanks to Oat Willie and Gilbert Shelton for the title.

About Dixon: I'm a hopeful monster, committed to laughter, love, and the Golden Rule. I see reason, applied with empathy, as the most important key to making a better world. I'm a lazy slob and a weirdo. I love cats, kids, quilts, fossils, tornadoes, comic books, unusual music, and too much else to mention. I’m a former conservative Christian, then New Ager, now a rationalist, skeptic and atheist. Lately I’m a Workshopping Editor at the Omnificent English Dictionary In Limerick Form (That’s right!), and my humor is getting published in the Washington Post and Fantasy and Science Fiction. I’m job-hunting too, mostly in the Human Services realm. Passions: Too many -- Reading, writing, critical thinking, public speaking, human rights, sex and sensuality, arts and sciences, nature. Oh, and ladies, I’m single ;^D

Karl Frederick
12-21-2012, 08:12 PM
Dixon wrote:
". . . In fact, the conclusions of two out of three published medical studies turn out to be wrong (7) —and that's a better average than the vast majority of claims, which aren't held to the standards of medical journals!"

Interesting thoughts, Dixon . . . thanks for posting them. If more than 50% of the conclusions of published medical studies are wrong, what's biasing the conclusions? Are the errors mostly in claimed efficacy of pharmaceuticals? If so, I'd suspect profit motives.

Karl

Dixon
12-22-2012, 10:12 PM
Good questions, Karl. I don't know the answers to them.

Interesting thoughts, Dixon . . . thanks for posting them. If more than 50% of the conclusions of published medical studies are wrong, what's biasing the conclusions? Are the errors mostly in claimed efficacy of pharmaceuticals? If so, I'd suspect profit motives.

Karl