
Sunday, April 13, 2008

IntelFusion Remixes Peterson's Table Of Analytic Confidence Assessment (IntelFusion)

Jeff Carr over at IntelFusion has added his own twist to Josh Peterson's recent work on analytic confidence. Jeff's remix is a good visualization (see below) of what Josh was getting at in his thesis. See the full post here and the IntelFusion main page here.

Thursday, April 3, 2008

Analytic Confidence Defined...Finally! (Original Research)

One of the great benefits of teaching at Mercyhurst is having the opportunity to work with a bunch of dedicated and very intelligent students. Another advantage is having enough students in the intelligence studies program to be able to conduct meaningful experiments.

Both factors come into play in the newly published thesis, Appropriate Factors To Consider When Assessing Analytic Confidence In Intelligence Analysis, by one of our grad students, Josh Peterson (click here to download the full text; it will also be permanently available in the Mercyhurst Student Projects link list in the right-hand column of this blog). Josh presented some of his research and findings at the recently completed ISA Conference, but the full study is presented here for the first time.

I have written about the problems with the concept of analytic confidence before. Some analysts interpret analytic confidence psychologically (i.e. how the analyst feels about his or her analysis) while some use it estimatively (e.g. "I am confident that X will happen"). I have also argued that neither makes much sense in the context of modern best practices for intelligence analysis.

Analytic confidence really has to do with how well calibrated a particular estimate is. Both an experienced analyst and a beginner might say that "X is likely" but one would hope (vainly, perhaps) that the expert would have a tighter shot group around the target than the newbie.

Josh takes this as his starting point and asks, "What, then, are the relevant, legitimate elements of analytic confidence? What should analysts consider when they are asked to state how confident they are in their analysis?" Through a good bit of research (laid out in his lit review), he identified seven elements that seem to legitimately underpin the concept of analytic confidence. They are: source reliability, source corroboration, use of a structured method to do the analysis, analyst level of expertise, amount of collaboration between analysts, task complexity, and time pressure.

Josh then created scenarios that were similar except that, in one case, all of the elements mentioned above were extremely negative (low source reliability, high time pressure, etc.) and, in another, the elements were extremely positive (high source reliability, use of a structured method, etc.). Josh also established a third, control group to help validate his experiment.

Josh used the students here at Mercyhurst as subjects for the study. The students here get a good bit of real-world experience doing analysis, both in our classes and in the internships and contract work we do, so I think they are good proxies for entry-level analysts in the Intelligence Community.

Josh found that students could accurately distinguish high-confidence from low-confidence scenarios but that they were doing so largely through intuition. He suggested that we probably need to update our curriculum to better teach our students those elements that should legitimately raise or lower an analyst's confidence in his or her work (suggestions we have already adopted).

Finally, Josh took his results and his research and combined them into a rubric (see below) that analysts can use to help score their confidence. Josh did not have time to test this rubric and the weighting in it represents his interpretation of the relative importance of each factor based on his read of the literature. Given how far he had already come, he wisely left these tasks to future researchers.
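To make the idea of a scoring rubric concrete, here is a minimal sketch of how a weighted-factor confidence score could work. To be clear, this is not Josh's actual rubric: the seven factor names come from his thesis, but the equal weights and the 1-to-5 rating scale here are placeholder assumptions for illustration only (his thesis proposes its own, untested weighting).

```python
# Illustrative weighted-rubric sketch -- NOT Peterson's actual rubric.
# Factor names are from the thesis; weights and scale are assumptions.

FACTORS = [
    "source reliability",
    "source corroboration",
    "use of a structured method",
    "analyst expertise",
    "analyst collaboration",
    "task complexity",
    "time pressure",
]

# Hypothetical equal weights; a real rubric would weight factors differently.
WEIGHTS = {factor: 1.0 for factor in FACTORS}

def confidence_score(ratings: dict) -> float:
    """Combine per-factor ratings (1 = worst, 5 = best) into a 0-100 score."""
    total = sum(WEIGHTS[f] * ratings[f] for f in FACTORS)
    max_total = sum(WEIGHTS[f] * 5 for f in FACTORS)
    return 100.0 * total / max_total

# Example: rating every factor at the midpoint (3 of 5) gives 60.0.
print(confidence_score({f: 3 for f in FACTORS}))
```

The point of such a rubric is simply to force the analyst to walk through each legitimate element of confidence rather than reporting a gut feeling; the weighting and validation questions are exactly the ones Josh left to future researchers.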



Analytic confidence is a tough nut to crack. It is hard to explain and even more difficult to research and test. Josh has taken a good first cut at it, though, and I think his work deserves some attention if only as a step upon which others can build.

Wednesday, March 26, 2008

Off To The International Studies Association Meeting!

I am eagerly awaiting my flight to the ISA Conference out in San Francisco (nothing like a 0500 departure followed by a long layover in Detroit before taking a 5-hour flight to the west coast...). My students get time off for good behavior and I get to present my paper on "A Wiki Is Like A Room..." (Saturday, 1545 if you are attending).

Two of my former students (and thesis advisees) will also be presenting. Both have done some excellent research that is well worth hearing about. Josh Peterson did a good bit of research to determine the appropriate elements of analytic confidence for intelligence and then ran an experiment to test his hypotheses. Rachel Kesselman did a multi-decade content analysis of the Key Judgments from dozens of NIEs to determine if there were significant changes in the ways the NIC has been articulating its intelligence judgments over time. Both papers are available in the paper archive at the ISA Conference, but they really don't do the theses (or the research) justice. If you are interested in what you see in the papers or in their presentations (Thursday, 1545 for both), do not hesitate to contact them directly.

I will be blogging again once I get to the conference. Until then I will leave you with this Jonathan Coulton song that just barely begins to capture the unspeakable horror that is modern air travel, Skymall: