Wednesday, January 28, 2015

Belief, Dissonance, and Difficulty in Analysis


[Image: René Descartes]
In a recent conversation on The Constant Strategist, an acquaintance offered the insightful observation that a lot of reading is really important, but a little reading is actually harmful, as people may be taken with the belief that the one book they've read on a subject is the last word rather than what it should be: the first word on the subject. In this case, the particular subject was cultural and contextual understanding as an important (if not necessary) prerequisite for effective strategic engagement with another society or nation. But this thought led to a broader reflection on the theory of knowledge (or at least one aspect of it), cognitive dissonance (with all its myriad side effects), and what these two things mean for analysts.

[Image: Baruch Spinoza]
Several years ago, I first read a marvelous paper by Daniel Gilbert titled "How Mental Systems Believe" (which you can find online here). The gist of this article is a contrast between theories of learning described by René Descartes and Baruch Spinoza. In a nutshell, Descartes believed that one must first comprehend an idea before one can assess the truth of that idea. In other words, "comprehension precedes and is separate from assessment." Spinoza, on the other hand, dismissed the Cartesian distinction between comprehension and assessment, arguing "that to comprehend a proposition, a person had to implicitly accept that proposition." Only once an idea has been comprehended and believed is it possible to engage in a secondary and effortful process to either certify the idea as true or actively reject its truth. The evidence presented by Gilbert suggests that human beings are, for any number of reasons, Spinozan systems rather than Cartesian systems (or, at the very least, are not Cartesian and may be some other type of system in which acceptance is psychologically prior to rejection).

This is all very interesting, but why does it matter? Perhaps the most important answer to that question is an oddity of human cognition commonly known as cognitive dissonance, the mental stress that results from holding "two or more contradictory beliefs, ideas, or values at the same time or" confronting "new information that conflicts with existing beliefs, ideas, or values." How do humans respond to cognitive dissonance? Robert Jervis has a good deal to say on the effects of cognitive dissonance in the milieu of international relations, noting that dissonance "will motivate the person to try to reduce dissonance and achieve consonance" and that "in addition to trying to reduce it, the person will actively avoid situations and information which would likely increase the dissonance," so that "after making a decision, the person not only will downgrade or misinterpret discrepant information but will also avoid it and seek consonant information." This gives us such favorite phenomena as the Dunning-Kruger Effect (where the uninformed and unskilled rate their knowledge higher than is objectively accurate), the Backfire Effect (where, in the face of contradictory evidence, beliefs get stronger rather than weaker), and oh-so-many more. So if, as Spinoza indicates, we must accept a concept as true in order to understand it, then subsequent rejection of the newly learned concept is not just effortful but in some sense superhuman. This means that reading more may not be useful, and for an inveterate reader this is disheartening. But ...

In another recent post, I raised the idea (shamelessly copied from an analyst far more insightful than I) that the essence of the analysis profession is to understand things and explain them to others. If the very act of learning and understanding drives us to error, though, what are we to do? Does this mean we should throw up our hands and abandon the search for truth? Of course not. But when my acquaintance suggested that we read more, he was only half right. We must absolutely read and study more. But we must also:
  • Actively seek out positions different from our own. This includes red-teaming ourselves and exploring the results if any, and ultimately every, combination of our assumptions proves wrong (a minimal sketch of that enumeration follows this list). Since, by definition, each assumption we make must be necessary for planning or analysis, changes in those assumptions should change our analysis (else we would state them as facts and not assumptions) and generate a better view of the decision space.
  • Train our analysts (and ourselves), as early and as constantly as possible, to recognize that the mental models we have of the world are themselves assumptions, and then refer to the previous point. This should go a long way toward mitigating the Law of the Instrument (when I have a hammer, problems seem to resemble nails).
  • Take nothing personally in the search for truth on which no one among us has a monopoly. 
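As a minimal sketch of what that kind of assumption red-teaming might look like, the Python snippet below enumerates every combination of a few hypothetical planning assumptions holding or failing and reruns a toy analysis for each case. The assumption names, the penalties, and the toy_analysis function are illustrative stand-ins, not anything drawn from the post itself.

```python
from itertools import product

# Hypothetical planning assumptions (illustrative names only).
assumptions = ["host_nation_access", "adversary_commits_reserves", "main_supply_route_open"]

def toy_analysis(case):
    """Stand-in for a real model: returns a notional 'days to objective'
    that grows as assumptions fail. Replace with the actual analysis."""
    baseline = 30
    penalty = {"host_nation_access": 20,
               "adversary_commits_reserves": 10,
               "main_supply_route_open": 15}
    return baseline + sum(penalty[a] for a, holds in case.items() if not holds)

# Red-team every combination of assumptions holding or failing.
for values in product([True, False], repeat=len(assumptions)):
    case = dict(zip(assumptions, values))
    failed = [a for a, holds in case.items() if not holds] or ["none"]
    print(f"failed assumptions: {', '.join(failed):60s} days to objective: {toy_analysis(case)}")
```

The output is a small map of the decision space, one row per combination of failed assumptions, which makes it easy to see which assumptions actually drive the answer and which merely decorate it.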
There are probably other things to do, but this seems a good start. The natural question, then, is how we go about doing these things. One answer is simple, but (to shamelessly appeal to the authority of Dead Carl), "Everything is very simple in war, but the simplest thing is difficult." 

I submit that the first, second, and third steps in this journey of a thousand steps are, in the words of a favorite maxim from ADM Stavridis, to read, think, and write. Reading widely brings us new ideas, providing new positions, information, and perspectives (if we consciously seek non-confirmatory writings). Thinking is all about taking in the new information and new models, acting on the assumption that our own might be wrong, and looking for new and informative results. Writing facilitates both of these by putting our thoughts out in the world where they are subject to criticism from those who do not share our biases, and it is these contradictory views we must learn to cherish, since it is easy to find agreement (via confirmation biases if in no other way) but hard to find and use constructive disagreement. Public writing is a way (though not the only way) to find this input. (A disciplined red team can do so as well, as can other well-meaning and trusted colleagues.) The final injunction, to take nothing personally, is important. This is an iterative process. If we read, think, and write, we start on the right (write?) path, but if we then allow offense to drive us from the debate we will lose the gains we seek and must have.

Vincit omnia veritas (truth conquers all) ... but analytic truth is first a foe to conquer.

Monday, January 19, 2015

Data Worship and Duty

If you spend more than a few minutes working as an analyst--operations, program, logistics, personnel, or otherwise--it is almost inevitable that some wise military soul will offer trenchant historical lessons about undue trust in analytics for decision making, derived from the performance of Robert McNamara as Secretary of Defense. Too often, these historical lessons are invoked to deflect and deflate the criticisms and conclusions of analysis without addressing the analysis itself (an ad hominem approach without so much as the hominem). But that doesn't mean there aren't common mistakes made in the conduct of analysis and worthwhile lessons to be learned from McNamara.


This short article from the MIT Technology Review is a bit old, but it makes a number of useful points. The "body count" metric, for example, is a canonical case of making important what we can measure rather than measuring what's important (if what is important is usefully measurable at all). Is the number of enemy dead (even if we can count it accurately) an effective measure of progress in a war that is other than total? So, why collect and report it? And what second-order effects are induced by a metric like this one? What behavior do we incentivize by the metrics we choose, whether it's mendacious reporting of battlefield performance in Vietnam or the tossing of unused car parts in the river?

There's something more fundamental going on in the worship of data, though. We gather more and more detailed information on the performance of our own and our adversaries' systems and think that by adding decimals we add to our "understanding." Do we, though? In his Foundations of Science, Henri Poincaré writes:
If we could know exactly the laws of nature and the situation of the universe at the initial instant, we should be able to predict exactly the situation of this same universe at a subsequent instant. But even when the natural laws should have no further secret for us, we could know the initial situation only approximately. If that permits us to foresee the subsequent situation with the same degree of approximation, this is all we require, we say the phenomenon has been predicted, that it is ruled by laws. But this is not always the case; it may happen that slight differences in the initial conditions produce very great differences in the final phenomenon; a slight error in the former would make an enormous error in the latter. Prediction becomes impossible and we have the fortuitous phenomenon. 
Poincaré is describing here what would later be dubbed the butterfly effect in nonlinear systems (with the comparison to predicting the weather made explicit in a later chapter). In systems such as these, to chase data is to chase a unicorn at the end of the rainbow. Rather, it is structure we should chase. Modeling isn't about populating our tools with newer and better data (though this may be important, if secondary). Rather, modeling is about understanding the underlying relationships among the data.
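Poincaré's point is easy to demonstrate numerically. As a generic illustration (not anything drawn from his text or the article above), the short Python sketch below iterates the logistic map, a textbook nonlinear system, from two initial conditions that differ by one part in a million; within a few dozen steps the two trajectories bear no resemblance to each other.

```python
# Sensitivity to initial conditions in the logistic map x -> r*x*(1-x),
# a standard illustration of the effect Poincaré describes above.
def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map from x0 and return the whole trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.200000)   # the "true" initial condition
b = logistic_trajectory(0.200001)   # the same measurement, off by one part in a million

for t in range(0, 51, 10):
    print(f"step {t:2d}: x_a = {a[t]:.6f}   x_b = {b[t]:.6f}   |diff| = {abs(a[t] - b[t]):.6f}")
```

Adding decimal places to the measured initial condition only postpones the divergence; it never prevents it. That is the sense in which structure, not ever-finer data, is the thing worth chasing.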

We often hear or read that some General or other should have fought harder against the dictates of the McNamara Pentagon, but one wonders if perhaps such a fight is also the duty of a military analyst.

Thursday, January 8, 2015

Know your history ...

A friend who knows my leanings toward math and statistics -- and who understands my professional inclination to read, study, and apply them to military problems -- recently sent me a link to a wonderful article from The Economist, "They also served: How statisticians changed the war, and the war changed statistics."

Aside from the laudatory mention of George Box, whose assertion that "all models are wrong, but some are useful" has done more damage to the profession than any other single statement, this should be essential reading for the members of our smallish profession. Off the top of my head, I can think of at least two more works (beyond the marvelous titles already described on this blog as "essential reading" and "books of interest") that should be part of our essential education as military analysts:

The Science of Bombing: Operational Research in RAF Bomber Command by Randall T. Wakelam. Much of our professional identity as a community comes from the mythology of operations (or Operational) Research and its application to the problem of civilizational survival in the Second World War. It seems a good idea to read the actual history of the people, techniques, politics, decisions, and decision makers involved in that history.
Thinking About America's Defense: An Analytical Memoir by Glenn A. Kent, David Ochmanek, Michael Spirtas, and Bruce R. Pirnie. Whether it's the mathematical techniques, the influence of political/historical context on problems of interest, or something more personal, this is an important work for military (especially Air Force) analysts.

There are more, of course. It's hard to tell what the next problem posed to a military analyst will be, so our education must necessarily be broad. Mathematics, statistics, PPBE, doctrine, military history, international relations, leadership, management, theories of innovation, etc., are all important. In this case, though, the question is about the central and defining history of our communal story, our mythology.

What else should we read?

Monday, January 5, 2015

The Analytic Profession in One Tweet

Last week, a blog post by an Army strategist appeared on The Bridge (a marvelous blog that I highly recommend to military professionals of all stripes) that posed the following question:

"How would you define the art and science of our profession in one tweet?"

In this case "our profession" referred to the profession of arms, and the author put forward a compact solution with an attendant explication of his reasons that answered the challenge nicely. There is, I think, more than a little value in an effort like this one, and cutting away the chaff and getting to the heart of who we are, what we do, and why we do it is more than just an interesting intellectual exercise. If done well, it provides a clear and memorable vision that communicates to those on the outside what we do and to those on the inside why and how we do it (whatever "it" might be), in this way creating a professional community centered on the vision. This clarity of vision then has any number of second-order effects on prioritization, training, recruiting, etc., and the effort to create it can pay incredible dividends.

As a member of more than one professional community, though, this line of thinking led me to wonder, "How would you define the art and science of our military analytic profession in one tweet?" I frequently use the phrasing below when discussing the career field among the analysts with whom I work, though I can't claim credit for its composition. Those who know Mike Payne will recognize it and have likely been part of the ongoing conversation that led to it, but the words are his:

"Analysts learn how things work and explain it to others, usually in relation to other things and often quantitatively."

This definition (with 22 characters to spare) captures several critical characteristics of the analytic profession. 
  1. It is general. In many cases, we don't have the luxury to consider ourselves as ISR analysts, force structure analysts, operational assessment analysts, etc. Rather, our particular skills will be applied to whatever question is relevant to leadership. 
  2. Learning how things work is interesting as a standalone activity, but productive of nothing. Communicating the things we learn to those responsible for making decisions is a critical element of who we are as a community. 
  3. Not all analysis is quantitative. There are some tools available to the community of military operations research professionals (mathematical, simulation, etc.) that are in some sense unique, but these are not a sine qua non for analysis. Consider, for example, the analysis given by Graham Allison and Philip Zelikow in their iconic book Essence of Decision. Nary an equation is to be found, but it's difficult to dispute that they are seeking to understand a system and explain it to others who will make decisions. Analysis is something done by analysts, and it is independent of the tools used (except for the tool between the analyst's ears). 
  4. The systems we study are not isolated, and understanding how they are coupled to other systems is vitally important, both to understand the constraints and restraints imposed by the environment and to illuminate non-proximate effects that may result from changes in the system under investigation. 
  5. It's worth noting what this definition does not do. The word "answer" does not appear, for example. Most questions of interest do not have clean and precise answers, or they have multiple answers, each with merits that make them equally palatable but qualitatively different. So, it is generally not possible for analysis of an interesting problem to produce a single, incontrovertibly true, and perfectly optimal answer. Thus, we explain to senior leaders how the system works to help them better understand the decision space before them, but we rarely provide answers, and to chase these chimerae is ... problematic.
That's my 140-character contribution.
Merf

PS ... There's a fairly robust discussion of this question on Facebook among some of the participants in this thread. You can find it here.