
Saturday, February 14, 2015

Seeking Truth

The last few posts I've penned for this forum (here and here) have danced around the edges of--and occasionally jumped up and down on--the notion that we humans are flawed, cognitively compromised, and subject to some intrinsic constraints on our ability to see, understand, communicate, and act on the truth. Though this is not a new soapbox, I hadn't realized how thoroughly this notion had taken over my writing or how strident it had become. Then a good friend asked a simple question, and I found myself wrestling with the consequences of the human cognitive silliness on which I've recently focused and what it means for truth in general and, perfectly apropos of this forum, truth in our analytic profession.

So, what poser did my wise friend propose? He offered three alternative positions based on the existence of truth and our ability to know it:
  1. There is a truth and we can grow to understand it.
  2. There is a truth and we cannot understand it.
  3. There is no truth for us to understand.
(Technically, I suppose there is a fourth possibility--that there is no truth and we can grow to understand it--but this isn't a particularly useful alternative to consider. As a mathematician and pedant by training and inclination, though, I find it difficult not to at least acknowledge it.)

The question is then where I fall on this list of possibilities. It's an important question, if for no other reason than that where we sit is where we stand, and it becomes difficult, even hypocritical, to conscientiously pursue an analytic profession if we believe either the second or third position holds. Strangely, though, I found this a harder question to answer than perhaps I should have, but here is where I landed:

At least with respect to the human physical and social universes with which we contend, there is an objective truth that is in some sense knowable and we, finite and flawed as we are, can discover these truths via observation, experimentation, and analysis.

In retrospect, my position on this question should have been obvious. I've been making statements that human cognition is biased and flawed, averring that this is a truth, and I believe it to be one. We can observe any number of truths in the way humans and the universe we occupy behave. I find on reflection, though, that there is a limit to this idea. Specifically, we can probably never know with precision the underlying mechanisms that produce the truths we observe. We may know that cognitive biases exist and we may be able to describe their tendencies, but (speaking charitably) we are unlikely ever to have an incontrovertible cause-and-effect model that allows us to interact with and influence these tendencies in a push-button way.

So, the trouble I have with truth is that we apply truth value to the explanatory models we create. Since these models are artificial creations and not the systems themselves, they must, by definition, fail to represent the system perfectly. Newtonian theories of gravity based on mass give way to relativistic theories of gravity based on energy. In some ways one is better than the other, but neither is true in a deep sense. Our models are never true in the larger sense. They may constitute the best available model. They may be "true enough" or "right in all the ways that matter." But both of these conditions are mutable and context-dependent. In a sense, I find myself intellectually drawn to the notion that truth, in the contexts that matter to us professionally, is an inductive question and not a deductive one.

In the end, I'm actually encouraged by this reflection, though the conclusion that models are and must be inherently flawed results in some serious consternation for this mathematician (soothed only by the clarity with which we mathematicians state and evaluate our axiomatic models). I understand better what I'm seeking. I understand better the limitations involved. And, at the risk of beating a dead horse, I am more convinced of the need to put our ideas out in the world. This reflection might never have taken place if not for Admiral Stavridis and his injunction to read, think, and write.

Wednesday, January 28, 2015

Belief, Dissonance, and Difficulty in Analysis


[Image: René Descartes]
In a recent conversation on The Constant Strategist, an acquaintance offered the insightful observation that a lot of reading is really important, but a little reading is actually harmful: people may be taken with the belief that the one book they've read on a subject is the last word rather than what it should be, the first word on the subject. In this case, the particular subject was cultural and contextual understanding as an important (if not necessary) prerequisite for effective strategic engagement with another society or nation. But this thought led to a broader reflection on the theory of knowledge (or at least one aspect of it), cognitive dissonance (with all its myriad side effects), and what these two things mean for analysts.

[Image: Baruch Spinoza]
Several years ago, I first read a marvelous paper by Daniel Gilbert titled "How Mental Systems Believe" (which you can find online here). The gist of this article is a contrast between theories of learning described by René Descartes and Baruch Spinoza. In a nutshell, Descartes believed that one must first comprehend an idea before one can assess the truth of that idea. In other words, "comprehension precedes and is separate from assessment." Spinoza, on the other hand, dismissed the Cartesian distinction between comprehension and assessment, arguing "that to comprehend a proposition, a person had to implicitly accept that proposition." Only once an idea has been comprehended and believed is it possible to engage in a secondary and effortful process to either certify the idea as true or actively reject it. The evidence presented by Gilbert suggests that human beings are, for any number of reasons, Spinozan systems rather than Cartesian systems (or, at the very least, are not Cartesian and may be some other type of system in which acceptance is psychologically prior to rejection).

This is all very interesting, but why does it matter? Perhaps the most important answer to that question is an oddity of human cognition commonly known as cognitive dissonance, the mental stress that results from holding "two or more contradictory beliefs, ideas, or values at the same time or" confronting "new information that conflicts with existing beliefs, ideas, or values." How do humans respond to cognitive dissonance? Robert Jervis has a good deal to say on the effects of cognitive dissonance in the milieu of international relations, noting that dissonance "will motivate the person to try to reduce dissonance and achieve consonance" and that "in addition to trying to reduce it, the person will actively avoid situations and information which would likely increase the dissonance," so that "after making a decision, the person not only will downgrade or misinterpret discrepant information but will also avoid it and seek consonant information." This gives us such favorite phenomena as the Dunning-Kruger Effect (where the uninformed and unskilled rate their knowledge higher than is objectively accurate), the Backfire Effect (where beliefs grow stronger, not weaker, in the face of contradictory evidence), and oh-so-many more. So, if we must accept a concept as true in order to understand it, as Spinoza indicates, subsequent rejection of the newly learned concept is not just effortful but in some sense superhuman. This means that reading more may not, by itself, be useful, and for an inveterate reader this is disheartening. But ...

In another recent post, I raised the idea (shamelessly copied from an analyst far more insightful than I) that the essence of the analysis profession is to understand things and explain them to others. If the very act of learning and understanding drives us to error, though, what are we to do? Does this mean we should throw up our hands and abandon the search for truth? Of course not. But when my acquaintance suggested that we read more, he was only half right. We must absolutely read and study more. But we must also:
  • Actively seek out positions different from our own. This includes red-teaming ourselves and exploring the results if any, some, or every combination of our assumptions proves wrong (a minimal sketch of this enumeration follows this list). Since, by definition, each assumption we make must be necessary for planning or analysis, changes in those assumptions should change our analysis (else we would state them as facts and not assumptions) and generate a better view of the decision space.
  • Teach our analysts (and ourselves), as early and as constantly as possible, that the mental models we hold of the world are themselves assumptions, and then refer to the previous point. This should go a long way toward mitigating the Law of the Instrument (when I have a hammer, problems seem to resemble nails).
  • Take nothing personally in the search for truth on which no one among us has a monopoly. 
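As a concrete (if toy) illustration of that first point, the sketch below enumerates every combination of holding and failing assumptions--2^n scenarios for n assumptions--and flags each scenario in which at least one assumption fails for red-teaming. It is a minimal sketch in Python; the assumption statements are hypothetical placeholders, not drawn from any actual plan.

    from itertools import product

    # Hypothetical planning assumptions (placeholders for illustration only).
    assumptions = [
        "Assumption A holds",
        "Assumption B holds",
        "Assumption C holds",
    ]

    # Enumerate every combination of each assumption holding (True) or
    # failing (False): 2^n scenarios for n assumptions.
    for validity in product([True, False], repeat=len(assumptions)):
        failed = [a for a, holds in zip(assumptions, validity) if not holds]
        if failed:  # skip the baseline case in which every assumption holds
            print("Red-team scenario -- failed:", ", ".join(failed))

The point of the exercise is not the code, of course, but the discipline: with three assumptions there are already seven failure scenarios to walk through, and the count doubles with each assumption added.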
There are probably other things to do, but this seems a good start. The natural question, then, is how we go about doing these things. One answer is simple, but (to shamelessly appeal to the authority of Dead Carl), "Everything is very simple in war, but the simplest thing is difficult." 

I submit that the first, second, and third steps in this journey of a thousand steps are, in the words of a favorite maxim from ADM Stavridis, to read, think, and write. Reading widely brings us new ideas, providing new positions, information, and perspectives (if we consciously seek non-confirmatory writings). Thinking is all about taking in the new information and new models, acting on the assumption that our own might be wrong, and looking for new and informative results. Writing facilitates both of these by putting our thoughts out in the world, where they are subject to criticism from those not subject to our own biases, and it is these contradictory views we must learn to cherish, since it is easy to find agreement (via confirmation bias if in no other way) but hard to find and use constructive disagreement. Public writing is a way (though not the only way) to find this input. (A disciplined red team can do so as well, as can other well-meaning and trusted colleagues.) The final injunction, to take nothing personally, is important. This is an iterative process. If we read, think, and write we start on the right (write?) path, but if we then allow offense to drive us from the debate we will lose the gains we seek and must have.

Vincit omnia veritas (truth conquers all) ... but analytic truth is first a foe to conquer.

Sunday, March 24, 2013

Nail your whispers to the wall...

So, I've been suffering through Air War College recently (suffering less because of the material and more because of the timelines and integrity-stealing loss of soul to which the associated incentive structure for correspondence PME has driven me), and ran across a nice speech/article from Admiral James Stavridis.

Not so very long ago, ADM Stavridis addressed the Class of 2012 at the National Defense University.  (The remarks he offered that day appeared as a JFQ article titled "Read, Think, Write: Keys to 21st Century Leadership"; an alternative version appeared as "Read, Think, Write, and Publish" in the Proceedings of the US Naval Institute.)  The advice he offered that day was simple and encapsulated by three words: read, think, and write.  Stavridis claims,
The quintessential skill of an officer [or, methinks, an analyst] is to bring order out of chaos. You have to be calm, smart, and willing to do the brain work; in the end, 21st-century security is about brain-on-brain warfare. We will succeed not only because we have more resources, or because our values are the best, or because we have the best demographics or geographic advantages—all of those things matter, of course. But in today's turbulent security environment, we will succeed and defeat our enemies by out-thinking them. To do that, and to be successful senior officers, you need to read, think, and write.
He then tells us our "reading should include not only history, politics, diplomacy, economics, and so forth, but also great fiction, books from distant cultures, and perhaps even a little poetry."  (I would add math and science to this list, because that's the kind of guy I am, and I've become a frequenter of several blogs.)  This reading will, one hopes, help us to "think our way to success in incredibly complex scenarios."  This is actually how I use most of my reading in the blogosphere: it puts me on a path to thinking about a particular problem or sets me on the trail of suggested related reading.

It is the last of his injunctions, however, that challenged me very directly and led to my inflicting this post on all of you.  Stavridis argues that after we read and think we must write, since "it is essential for communicating what we have learned, as well as allowing others to challenge our views and make them stronger."  (This recalls a favorite quote of mine from Proverbs: "Iron sharpeneth iron; so a man sharpeneth the countenance of his friend.")  He encourages us to "share our ideas in print--a scholarly journal, a military magazine, or even a blog post."

I have been remiss in this regard.  I read much, think much (at least I hope it is thought), and write far too little.  Mooch has offered us this forum in which friends may nail their whispers to the wall (borrowing a lovely phrase from the Admiral), and where each may sharpen the countenance of the others.  It's almost as if this blog sprang into being from the tears of Admiral Stavridis.  Be not afraid, but I plan to avail myself of this forum far more in the future.

Merf