Seeing the Elephant
Improving Leader Visualization Skills through Simple War Games
Lt. Col. Richard A. McConnell, DM, U.S. Army, Retired
Lt. Col. Mark T. Gerges, PhD, U.S. Army, Retired
It was six men of Indostan
To learning much inclined,
Who went to see the Elephant
(Though all of them were blind),
That each by observation
Might satisfy his mind.
The First approached the Elephant,
And happening to fall
Against his broad and sturdy side,
At once began to bawl:
“God bless me!—but the Elephant
Is very like a wall!”
The Second, feeling of the tusk,
Cried: “Ho!—what have we here
So very round and smooth and sharp?
To me ‘t is mighty clear
This wonder of an Elephant
Is very like a spear!”
—John Godfrey Saxe
The parable of the six blind men attempting to identify something unfamiliar is well known. As each touched part of the strange animal, each came away with a partial picture; and as the poem continues, another man sees the tail as a rope, or the leg as a tree, etc. The poem finally ends with, “Though each was partly in the right, / And all were in the wrong!”1 Military planners have a similar problem—each has only an incomplete knowledge of the entire problem, and only by comparing notes across the staff can they attain sufficiently thorough understanding to accurately complete the staff analysis for the commander. The challenge for the U.S. Army is how to train the required visualization skills to process collected information and how to habituate service members to share their results to build a complete operational picture.
While Command and General Staff College (CGSC) faculty members have wrestled with how best to educate students to improve their visualization and description skills, they have hit upon a return to simple role-playing board games as a low-cost and highly effective means of improving students’ abilities through repetition. A review of Center for Army Lessons Learned (CALL) publications from the past twenty years revealed that implementing war-gaming as a training technique has been a systemic challenge during combat training center (CTC) rotations.2 This challenge manifested itself in three ways: planners skipped the war-game step altogether; when they skipped it, the combined arms rehearsal turned into a war game; or staffs conducted war games that resembled rehearsals because they lacked an action, reaction, counteraction methodology. As the faculty scanned the CALL publications for insights, an unrelated event in a single staff group caught their attention. In the fall of 2013, CGSC students who played a simple role-playing board game for a history class, in this case Kriegsspiel (War Game), did a much better job at the war-gaming step of the military decision-making process (MDMP) in their tactics class, particularly in their ability to see (describe) the friendly situation.
To support the history class on German Field Marshal Helmuth von Moltke the Elder and the German General Staff, the simulations department ran Kriegsspiel. Within one staff group, five students volunteered to play. Within the next few weeks, their tactics instructor noticed that this group was especially effective in the war-gaming step of MDMP—above the normal year-to-year performance he was accustomed to seeing in similarly constituted classes. After reflecting on this anomaly, the faculty began to pose some questions: Was there a correlation between playing a simple war game such as Kriegsspiel and improvement in the war-gaming step of MDMP? And if there was a correlation with only five of sixteen students playing, what might be the effect if all sixteen students played the game? These questions prompted the faculty to design an experiment to examine the types of thinking that support planning and how that thinking might mesh most effectively with the planning process.
Origin of Kriegsspiel
The original Prussian Kriegsspiel dates back to the early nineteenth century. Two Prussian officers, Lt. Georg Leopold von Reiswitz and later his son Georg Heinrich Rudolf von Reiswitz, developed and improved the game that used a grid system and scale unit markers. The original game system was heavily dependent on rules and tables to calculate the combat results. After having the game demonstrated to him, Prussian Chief of Staff Karl von Müffling was impressed, exclaiming, “This is not a game! This is training for war!”3
Later, in the 1870s, a more flexible alternative known as “free” Kriegsspiel was developed that allowed an umpire to use his own experience, together with a simplified rule system, to calculate results. With not much more than two topographical maps and some unit markers, umpires could rapidly adjudicate combat, allowing for less downtime and freer action on the map.4 The opposing players were placed in separate rooms, and the umpires moved back and forth between them. Players could see only what they would have seen on an actual battlefield: a commander who placed himself on a hilltop could view his own units as well as any enemy in range, while a commander in a defile saw only those units in his immediate vicinity. Consequently, players were forced to deal with fragmentary information, to visualize what it meant, and then to communicate their analysis of that information to fellow players and their commander. The game became so important to the Prussian, and later Imperial German, army that every officer until 1918 played Kriegsspiel as part of their education.
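As a rough illustration of that partial-information mechanic, the minimal Python sketch below (a hypothetical toy example, not the historical Kriegsspiel rules or sight tables) filters each player's picture of the battlefield down to the units within a notional sight radius that grows with the observer's elevation:

```python
# Minimal fog-of-war sketch (a toy illustration, not the historical Kriegsspiel rules).
# Each player is shown only the units inside a sight radius that grows with elevation,
# forcing decisions from fragmentary information.
from dataclasses import dataclass
from math import hypot

@dataclass
class Unit:
    name: str
    x: float
    y: float
    elevation: float  # higher ground extends the view

def visible_units(observer: Unit, all_units: list[Unit], base_range: float = 3.0) -> list[Unit]:
    """Return only the units the observer can currently 'see'."""
    sight = base_range + observer.elevation  # a hilltop widens the view
    return [u for u in all_units
            if u is not observer and hypot(u.x - observer.x, u.y - observer.y) <= sight]

# A commander on a hill sees farther than one on low ground.
units = [Unit("1st Bn", 0, 0, 2.0), Unit("2nd Bn", 4, 0, 0.0), Unit("Enemy Cav", 4, 2, 0.0)]
print([u.name for u in visible_units(units[0], units)])  # hilltop: sees both other units
print([u.name for u in visible_units(units[1], units)])  # low ground: sees only the nearby cavalry
```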
The Experiment
Kriegsspiel itself is not critical; the value of war-gaming is not linked to the peculiar or unique features of any particular game. Instead, it is the overall board game concept that gives the player ways to approach planning and problem solving. We chose Kriegsspiel for our experiment mainly because it was already readily available at CGSC and had low overhead in both setup and time to play, usually three to four hours. But it could have been any similarly well-conceived war game. Our approach to selecting a war game also included another important consideration—Kriegsspiel was not on a computer. Computer games by their nature take much of the requirement for individual mental calculation out of a competitive game, which is deceptively appealing. Overreliance on automation to do the thinking, however, can remove the need to work through even simple calculations, such as estimating how far a unit can move across different types of terrain, and ultimately decrease the benefits derived from playing the game.
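To make that kind of mental arithmetic concrete, the short sketch below (a hypothetical example with invented movement costs, not drawn from Kriegsspiel or any particular ruleset) estimates how many squares a unit can enter in one turn when different terrain slows it down:

```python
# Hypothetical example of the terrain-and-movement arithmetic an analog game
# pushes back onto the player; the cost values are invented for illustration.
TERRAIN_COST = {"road": 0.5, "open": 1.0, "forest": 2.0, "marsh": 3.0}

def squares_reachable(movement_points: float, path_terrain: list[str]) -> int:
    """Count how many squares along a planned path a unit can enter this turn."""
    reached = 0
    for terrain in path_terrain:
        cost = TERRAIN_COST[terrain]
        if movement_points < cost:
            break
        movement_points -= cost
        reached += 1
    return reached

# A unit with 4 movement points crossing open ground into a forest reaches 3 squares.
print(squares_reachable(4.0, ["open", "open", "forest", "forest"]))  # -> 3
```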
The design of the board game, with students studying a standard scale map and developing the ability to think through the effects of time, space, and terrain while trying to maintain an accurate picture of friendly and enemy forces based upon spotty and incomplete information, was key. Other board games can provide a similar stimulus as long as they provide a partial picture of the information. (One outgrowth from our experience is that the Directorate of Simulation Education at CGSC is now working on exportable and low-cost board games for use.)
In the fall of 2016, seeing the apparent connection between playing Kriegsspiel and improved performance in MDMP, members of the faculty decided to conduct a rigorous test to see if there was in fact a correlation between the two. Two sections of students were selected to participate, a total of 111 officers. The test group consisted of thirty-two students who played a simple role-playing war game prior to the war-gaming step of MDMP. The other seventy-nine were the control group, which underwent the tactics instruction without modification.
To observe and conduct the test, faculty members unassociated with teaching these students were designated as research observers. The team had two active duty officers (one military intelligence and the other armor) and six retired Army officers of varied backgrounds to provide a mix of experience as they observed the students. The “bottom line” result was that the test group that played Kriegsspiel outperformed the control group in four ways. First, the test group saw (visualized) themselves more clearly than the control group (this concept is discussed in more detail below in examining the impact of war-gaming on visualization). Second, the test group was able to make choices based on their visualization with a higher level of confidence than the control group. Third, during the war-gaming step of MDMP, the test group identified more threats and opportunities than the control group. And finally, the test group better incorporated their war-gaming discoveries into their plans.
In short, the test group with a single iteration of playing Kriegsspiel was more effective than the control group at “seeing the elephant.” For a further explanation of the testing methodology and results, the research report for this study has been published by the Association for Business Simulation and Experiential Learning (ABSEL).5
Defining Effective Planning
What type of thinking is required for effective planning? Before designing an experiment to measure the effects of war-gaming, the faculty developed a theoretical model, the cognitive planning domains, based upon the reflective process we had observed (see figure 1). Through the cognitive planning domains, the faculty hoped to describe more accurately the types of thinking in which planners engage to produce complete and well-thought-out plans. The three areas were labeled the factory, the laboratory, and the art institute.6
In the factory, planners concern themselves with quickly synchronizing, integrating, and executing. In the laboratory, planners put on their white coats and begin mixing chemicals, concerning themselves with validity, relevance, and feasibility—this process takes more time than work in the factory. In the art institute, planners focus their efforts on foresight, innovation, creativity, and imagination—this takes the longest of the three cognitive domains. The area where these three cognitive domains intersect is called the confluence of the art of command and the science of control. By creating this theoretical model, the faculty members were able to gain understanding of a potential problem with the types of thinking that have dominated planning processes over the last ten to fifteen years.
The problem appears two-fold. First, because of the urgency that dominates many operations, planners have found themselves primarily in the factory—the realm of the directed course of action. Second, because planners have emphasized the factory, the skills required for the laboratory and the art institute seem to have atrophied. This second problem may have an adverse effect on military leaders’ ability to pass on the capability to balance the art of command and the science of control. Such concerns in the past have served as topics of great leader development discussions between generations of leaders within our military institution. One question that should be included in such future discussions is, “What is the purpose of our planning process?”
Faculty at CGSC often use this question as an informal poll among their students. Most junior leaders focus on outputs. For example, the purpose of MDMP is to create an order. However, if we acknowledge that many orders do not survive the first shot of combat, is that really the purpose of MDMP? Perhaps the purpose of our planning processes is to gain understanding so that—should our plans be overcome by events—the understanding gained by the staff becomes the basis for future actions. Such discussions among the faculty conducting the research also spurred a desire to gain a better understanding about how leaders acquire awareness about their operational environments.
What We Learned about Simple Games and Visualization
How do leaders gain understanding of their operational environment? And what tools do they need to accomplish this task? One of the most important tools employed to gain understanding of the operational environment is the course of action sketch. As part of the study, students in the two groups were asked to recall what they had seen on a course of action sketch after studying it for only sixty seconds.
Figures 2 and 3 are the two course-of-action sketches displayed to the study participants for sixty seconds each. After studying the sketches, the students were asked a series of questions about what the friendly and enemy forces had and how they might interact. The test group that played Kriegsspiel outperformed the control group to a statistically significant degree in recalling what they had seen, particularly with regard to the locations of friendly units—in other words, they saw themselves better.7 This finding was interesting when compared to the level of comfort the students felt when making visualization-based decisions.
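For readers who want to see what such a comparison looks like in practice, the following minimal sketch uses invented recall scores (not the study's data) and an independent-samples t-test, chosen here only as a common example of such a test rather than the study's actual method, to compare a hypothetical test group against a control group at the 5 percent significance threshold:

```python
# Minimal sketch of a test-versus-control comparison. The scores are invented for
# illustration, and the choice of an independent-samples t-test is an assumption here,
# not necessarily the statistical test the study itself employed.
from scipy import stats

test_group_scores = [8, 9, 7, 9, 8, 10, 9, 8]    # hypothetical recall scores (played Kriegsspiel)
control_group_scores = [6, 7, 5, 8, 6, 7, 6, 5]  # hypothetical recall scores (control)

t_stat, p_value = stats.ttest_ind(test_group_scores, control_group_scores)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:  # the 5 percent threshold described in the notes
    print("The difference is statistically significant at the 5 percent level.")
```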
Allowing for Individual Leadership Styles
Leaders arrive at decisions in ways unique to their personalities. Some leaders require evidence and analysis and draw conclusions from them before making a decision. Others arrive at a decision intuitively, based on their own education and experience. To measure how leaders make decisions based on visualization, the research included a game theory instrument to gauge the participants’ comfort in making decisions grounded in visualization; some leaders, for example, need a greater amount of certainty before they are comfortable enough to decide. Ranking that comfort level from one (comfortable with more ambiguous information) to six (requiring a high level of certainty and information) allowed the researchers to gain insight into the students’ thinking.
After students had studied the sketches for sixty seconds and completed the visualization quiz, they were asked to rank their comfort level with the choices they had made using the game theory instrument, rating their certainty from one to six (see figure 4). In this second part of the visualization quiz, members of the test group who played Kriegsspiel indicated, at a statistically significant higher level than members of the control group, that they believed what they were being asked fell within the realm of common knowledge.
Although the difference was not statistically significant, the test group also reported notably higher confidence than the control group that they had enough information, certainty, rationality, and common knowledge to answer part one of the quiz. When viewed through the lens of test versus control groups, the results were interesting in that the test group was more certain of their answers (see table).
How to Measure the Effectiveness of the War Game
As measures of performance, the faculty counted the opportunities and threats each group discovered during war-gaming and how readily each group incorporated those discoveries into its plan. To a statistically significant extent, the test group outperformed the non-Kriegsspiel control group, more readily seizing opportunities and addressing threats while integrating those discoveries into their plans.
These findings may not be surprising to many Army leaders, yet they are significant because they appear to validate a traditional war-gaming methodology that had fallen into disuse over time. By playing a simple role-playing board game such as Kriegsspiel for one four-hour iteration, planners improved their ability to see themselves, felt more comfortable making visualization choices, identified more threats and opportunities, and better incorporated those discoveries into their plans. All of these findings were statistically significant and surprised the faculty members conducting the study. We did not expect the results to be this pronounced, and they led us to offer some observations about the planning process as employed by military professionals.
What We Learned about Our Planning Process
As posed earlier, it is important for senior leaders to engage their subordinates about what the planning process is and what it does. If the purpose is to gain understanding that can be employed not only throughout planning but also through execution, then our beliefs about how our planning processes are designed are important. Such discussions might be made clearer by reflecting on the fluid nature of combat and thus the need for adaptive thinking cyclically throughout planning and execution. Such thinking can be illustrated using the cognitive planning domains mentioned earlier (see figure 1).
For example, in the art institute, planners might develop their problem statement; in the laboratory, they might develop courses of action and stress-test them through war games; and in the factory, they might publish orders. If this process is cyclical, when might planners and decision-makers reexamine their problem statement to determine whether their experiences during execution should modify their understanding of the problem they are solving?8 How often do planners reach the orders production step of MDMP and never circle back to reexamine their initial problem statement to determine whether it is still relevant and, if so, whether the problem has been solved? Or worse, how many planners never develop an initial problem statement at all, one incorporated into their assessment process and iteratively reexamined in the light provided by execution?
These questions highlight that problem identification is only one of many areas where our planning processes can be degraded. Often, leaders encourage planners to pursue a directed course of action because of time constraints. When they do, they may be missing a leader development opportunity. The directed course of action removes not only the depth and breadth of understanding that MDMP can provide but also the leader development opportunity of teaching the next generation of planners how to balance the art of command with the science of control (see figure 1).
By examining our planning process through the lens of the cognitive planning domains, military professionals may find ways to improve not only the process but also its outcomes. For example, many planners intentionally skip steps in MDMP and justify doing so by a lack of time, which may be counterproductive; if understanding is the main goal of the process, perhaps skipping steps cannot be justified. If leaders become totally dependent on the directed course of action, how will emerging leaders learn to balance the science of control with the art of command?
During the U.S. Civil War, combat was referred to as “seeing the elephant.”9 Many in today’s Army have seen the elephant, and senior leaders need to prepare their subordinates for an uncertain future facing yet unknown opponents. Passing on to the next generation the ability to anticipate what is coming next through a balance of art and science might equip them to seize and maintain the initiative. Building visualization skills is the key to preparing emerging leaders for their turn at seeing the elephant.
Visualization is both an individual and a collective process. Our ability to visualize correlates directly with the quality of our plans and helps us anticipate otherwise unexpected events and take steps to minimize their effects. Playing a simple role-playing war game like Kriegsspiel gives leaders a low-cost and simple method of developing one set of skills necessary for successful planning as they develop their subordinate leaders. Of course, playing simple board games is not an answer in and of itself, but participating in them selectively allows soldiers to try a course of action, see the outcome, and then vary their next attempt, learning from each repetition what works. Low cost, simple to run, and able to support visualization, analog games could be part of the answer to preparing future leaders for uncertainty and ambiguity.
The question that commanders should answer is, if you could improve your unit leaders’ visualization skills through simple analog games, why wouldn’t you? The functional area strategists attending CGSC have been introduced to Kriegsspiel and a variety of other simple board games that can be adapted for use in a variety of settings. They have a wealth of knowledge to help commanders improve the skills of their staff officers and noncommissioned officers.
This article was previously published as a Military Review online exclusive in October 2018.
Notes
Epigraph. John Godfrey Saxe, “The Blind Men and the Elephant,” in The Poems of John Godfrey Saxe (Boston: J. Osgood, 1872), 260.
1. Ibid.
2. Center for Army Lessons Learned (CALL), “(FOUO) CTC Trends,” Combat Training Center Trends 8-3 (Fort Leavenworth, KS: Combined Arms Center [CAC], 2007); CALL, “(FOUO) CTC Trends,” Combat Training Center Trends 9-18 (Fort Leavenworth, KS: CAC, 2007); CALL, “(FOUO) NTC Rotation Report,” Unit Trends during Exercises, No Rotation 00-01 (Fort Leavenworth, KS: CAC, 1999); CALL, “(FOUO) NTC Rotation Report,” Unit Trends during Exercises, No Rotation 00-01 (Fort Leavenworth, KS: CAC, 2004); CALL, “(FOUO) NTC Rotation Report,” Unit Trends during Exercises, No Rotation 00-01 (Fort Leavenworth, KS: CAC, 1999); CALL, “(FOUO) NTC Rotation Report,” Unit Trends during Exercises, No Rotation 00-01 (Fort Leavenworth, KS: CAC, 1997).
3. Peter P. Perla, “Once and Future Kriegsspiel” (PowerPoint presentation, Center for Naval Analysis, Arlington, VA, 2013), slide 16.
4. Jonathon Keats, “Let’s Play War: Could War Games Replace the Real Thing?,” Nautilus, 24 September 2015, accessed 28 September 2018, http://nautil.us/issue/28/2050/lets-play-war.
5. Richard McConnell et al., “The Effect of Simple Role-Playing Games on the Wargaming Step of the Military Decision Making Process (MDMP): A Mixed Methods Approach,” Developments in Business Simulation and Experiential Learning: Proceedings of the Annual ABSEL [Association for Business Simulation and Experiential Learning] Conference 45 (2018), accessed 5 October 2018, https://journals.tdl.org/absel/index.php/absel/article/view/3200/3127.
6. Ibid.
7. The findings described as “statistically significant” were based on statistical tests using a 5 percent significance threshold; in other words, there was less than a 5 percent probability that a difference of the observed size was the result of random chance alone.
8. Dale F. Spurlin, “The Problem Statement—What’s the Problem?,” Small Wars Journal, 6 August 2017, accessed 28 September 2018, http://smallwarsjournal.com/jrnl/art/the-problem-statement-%E2%80%93-what%E2%80%99s-the-problem.
9. Ibid.; Jonathan R. Allen, “Seeing the Elephant—How It Feels to Be Under Fire,” The Civil War: Civil War History and Stories, accessed 28 September 2018, http://www.nellaware.com/blog/seeing-the-elephant.html.
Lt. Col. Richard A. McConnell, U.S. Army, retired, is an associate professor in the Department of Army Tactics, U.S. Army Command and General Staff College at Fort Leavenworth, Kansas. He received his BA from the University of Wisconsin–Milwaukee in 1989 and served twenty-five years in the U.S. Army in artillery units in Europe, the Middle East, and the United States. He received his doctor of management in organizational leadership from the University of Phoenix, where his dissertation was an institutional microethnographic examination of the staff group advisor role at the Command and General Staff College at Fort Leavenworth.
Lt. Col. Mark T. Gerges, U.S. Army, retired, is an associate professor in the Department of Military History, U.S. Army Command and General Staff College at Fort Leavenworth, Kansas. He received his BA from Norwich University, Northfield, Vermont, in 1984 and served twenty years in U.S. Army armor units in Europe, the Balkans, the Middle East, and the United States. He received his PhD from Florida State University, where his dissertation was on the command and control of the British cavalry under the Duke of Wellington during the Peninsular War of 1808–1814.