As I have written before, the website Science Based Medicine is a daily must-read for anyone interested in developments in medicine, or simply in critical thinking. The five or six bloggers on the site have various specialties (one is even a lawyer), but they all bring rigorous evaluation to their very accessible writing. Most of the posts, though not all, are devoted to debunking so-called alternative treatments, but they are always open-minded and evidence-based.
One recent post was a book review, discussing Jonathan Howard's Cognitive Errors and Diagnostic Mistakes: A Case-Based Guide to Critical Thinking in Medicine, which is well worth reading. As the reviewer, Harriet Hall, explains, cognitive errors are common in medicine, as they are in every other profession. Let me add that many of the thought patterns identified by Dr. Howard, via Dr. Hall's review, will be immediately recognized by lawyers: we regularly see them in judges, jurors, witnesses, and ourselves -- although the latter instances are the most difficult to detect, or acknowledge, or especially to remedy.
The book is 558 pages long, covering six major areas, as described by Hall:
Errors of overattachment to a particular diagnosis;
Errors due to failure to consider alternative diagnoses;
Errors due to inheriting someone else's thinking;
Errors in prevalence perception or estimation;
Errors involving patient characteristics or presentation context;
Errors associated with physician affect, personality, or decision style.
It does not take much imagination to see how each of these errors can be manifested in law practice, especially but not exclusively litigation. Many of them apply to law teaching as well.
Hall also includes what she calls "a smattering of examples" of Howard's observations. Here is a smattering of the smattering:
Everything happens for a reason except when it doesn’t. But even then you can in hindsight fabricate a reason that will satisfy your belief system.
The backfire effect: “encountering contradictory information can have the paradoxical effect of strengthening our initial belief rather than causing us to question it.”
Motivated reasoning: People who “know” they have chronic Lyme disease will fail to believe 10 negative Lyme tests in a row and then believe the 11th test if it is positive.
Apophenia: the tendency to perceive meaningful patterns in random information, like seeing the face on Mars.
The availability heuristic and the frequency illusion: “Clinicians should be aware that their experience is distorted by recent or memorable [cases], the experiences of their colleagues, and the news.” He repeats Mark Crislip’s aphorism that the three most dangerous words in medicine are “in my experience.”
There is much more specifically about medicine, including a lengthy discussion of so-called Complementary and Alternative Medicine (CAM) and the power of anecdotes and stories, but the greatest value of the book, to me at least, is its discussion of cognitive errors. Looking at the effect of cognitive errors in medical treatment, a setting with which everyone has some experience, helps us understand how they also affect our own practices.
I have asked our library to order the book, which costs $74.15 in paperback and $59.99 on Kindle. Medical books apparently cost as much as many law books, but this one definitely looks to be worth reading (especially if it can be borrowed from your library).
Excellent post. Of course, the field today that most needs to become aware of cognitive biases is politics. Cognitive biases are largely responsible for the current shutdown of the government. As the shutdown goes on longer and longer, each side becomes more entrenched in its position. Because of cognitive biases, each side is unable to see the problems with its own position or to acknowledge that its opponents have some valid concerns. Especially troubling is the demonization by both sides. Is the other side really that bad, or are cognitive biases making them seem that way?
Posted by: Scott Fruehwald | January 17, 2019 at 12:48 PM