Wednesday, January 28, 2015

Belief, Dissonance, and Difficulty in Analysis


René Descartes
In a recent conversation on The Constant Strategist, an acquaintance offered an insightful observation: a lot of reading is really important, but a little reading is actually harmful, because people may come to believe that the one book they have read on a subject is the last word rather than what should be the first word on the subject. In this case, the particular subject was cultural and contextual understanding as an important (if not necessary) prerequisite for effective strategic engagement with another society or nation. But this thought led to a broader reflection on the theory of knowledge (or at least one aspect of it), cognitive dissonance (with all its myriad side effects), and what these two things mean for analysts.

Baruch Spinoza
Several years ago, I first read a marvelous paper by Daniel Gilbert titled "How Mental Systems Believe" (that you can find online here). The gist of this article is a contrast between theories of learning described by René Descartes and Baruch Spinoza. In a nutshell, Descartes believed that one must first comprehend an idea before one can assess the truth of that idea. In other words, "comprehension precedes and is separate from assessment." Spinoza, on the other hand, dismissed the Cartesian distinction between comprehension and assessment, arguing "that to comprehend a proposition, a person had to implicitly accept that proposition." Only once an idea has been comprehended and believed is it possible to engage in a secondary and effortful process to either certify the idea as true or actively reject its truth. The evidence presented by Gilbert suggests that human beings are, for any number of reasons, Spinozan systems rather than Cartesian systems (or, at the very least, are not Cartesian and may be some other type of system in which acceptance is psychologically prior to rejection).

This is all very interesting, but why does it matter? Perhaps the most important answer to that question is an oddity of human cognition commonly known as cognitive dissonance, the mental stress that results from holding "two or more contradictory beliefs, ideas, or values at the same time or" confronting "new information that conflicts with existing beliefs, ideas, or values." How do humans respond to cognitive dissonance? Robert Jervis has a good deal to say on the effects of cognitive dissonance in the milieu of international relations, noting that dissonance "will motivate the person to try to reduce dissonance and achieve consonance" and "in addition to trying to reduce it, the person will actively avoid situations and information which would likely increase the dissonance" so that "after making a decision, the person not only will downgrade or misinterpret discrepant information but will also avoid it and seek consonant information." This gives us such favorite phenomena as the Dunning-Kruger Effect (where the uninformed and unskilled rate their knowledge higher than is objectively accurate), the Backfire Effect (where beliefs grow stronger rather than weaker in the face of contradictory evidence), and oh-so-many more. So, if, as Spinoza indicates, we must accept a concept as true in order to understand it, subsequent rejection of the newly learned concept is not just effortful but in some sense superhuman. This means that reading more may not be useful, and for an inveterate reader this is disheartening. But ...

In another recent post, I raised the idea (shamelessly copied from an analyst far more insightful than I) that the essence of the analysis profession is to understand things and explain them to others. If the very act of learning and understanding drives us to error, though, what are we to do? Does this mean we should throw up our hands and abandon the search for truth? Of course not. But when my acquaintance suggested that we read more, he was only half right. We must absolutely read and study more. But we must also:
  • Actively seek out positions different from our own. This includes red-teaming ourselves and exploring the results if any combination of our assumptions, or all of them, proves wrong. Since, by definition, each assumption we make must be necessary for planning or analysis, changes in those assumptions should change our analysis (else we would state them as facts and not assumptions) and generate a better view of the decision space.
  • Train our analysts (and ourselves) as early and as constantly as possible that the mental models we have of the world are themselves assumptions, and then refer to the previous point. This should go a long way toward mitigating the Law of the Instrument (when I have a hammer, problems seem to resemble nails).
  • Take nothing personally in the search for truth on which no one among us has a monopoly. 
There are probably other things to do, but this seems a good start. The natural question, then, is how we go about doing these things. One answer is simple, but (to shamelessly appeal to the authority of Dead Carl), "Everything is very simple in war, but the simplest thing is difficult." 

I submit that the first, second, and third steps in this journey of a thousand steps are, in the words of a favorite maxim from ADM Stavridis, to read, think, and write. Reading widely brings us new ideas, providing new positions, information, and perspectives (if we consciously seek non-confirmatory writings). Thinking is all about taking in the new information and new models, acting on the assumption that our own might be wrong, and looking for new and informative results. Writing facilitates both of these by putting our thoughts out in the world where they are subject to criticism from those not subject to our own biases, and it is these contradictory views we must learn to cherish, since it is easy to find agreement (via confirmation biases if in no other way) but hard to find and use constructive disagreement. Public writing is a way (though not the only way) to find this input. (A disciplined red team can do so as well, as can other well-meaning and trusted colleagues.) The final injunction, to take nothing personally, is important. This is an iterative process. If we read, think, and write we start on the right (write?) path, but if we then allow offense to drive us from the debate we will lose the gains we seek and must have.

Vincit omnia veritas ... but analytic truth is first a foe to conquer.

2 comments:

  1. Indeed, and thanks for these thoughts. Will have to read that Gilbert paper when I have the chance. Previous reading and experience finds me aligned with Spinoza's view. I find the hazard is that once one accepts a proposition, it may lead to greater comprehension, but more difficulty in abandoning that proposition despite all available evidence, for all the reasons you state and then some. All for reading, thinking, and writing, just not sure it will solve these problems. Then again, if it doesn't, nothing will, so let us read, think, and write.

  2. @Merf- I've been studying Big Data problems recently and I can't help thinking about some parallels to this discussion. Sources of information, good and bad, are everywhere. The information is coming to us at high velocity, in high volume, and in many different varieties. The 3 V's of Big Data apply to us now more than ever. Personally I view the variety of the data as the biggest challenge to Big Data...technology has conquered the velocity and volume problem for now. But the unending variety of the data (type, format, structure, content) will plague programmers for years to come...and maybe it will never be solved. Interestingly, it's a very human trait to deal with variety...dissonance, ambiguity, etc., being chief among those elements of variety that will always cause computers to fail. But we don't deal with those problems without error. Is that the same "bias" we are observing in the human mind...induced by knowledge...too much of the wrong stuff, not enough of the right stuff? But as these problems are solved and algorithms improve, will the gap be reduced until what remains is that fuzzy area of "truth" we are now groping for? (Break Break, I would like to propose the Merf/Mooch axiom...If you ever have the opportunity to use the word grope in a sentence, do so)
