The Ostrich Complex and Leadership in Crisis

Lt. Col. Kevron W. Henry, Jamaica Defence Force


Original graphic elements by Kirsty Pargeter and Miguel Angel, Composite graphic by Arin Burgess, Military Review

The ostrich has been accused of hiding its head in the sand when frightened. Presumably he thus avoids seeing the cause of its fright. Presumably he also avoids seeing what the other ostriches are doing.

—Alvin B. Rubin and Elven E. Ponder


Concepts of effective leadership during crises are generally understood but often difficult to execute. The difficulty lies in the significant effort required of a commander to impose his or her mental acuity and will in order to solve a particular problem and ensure mission success. Along the fault lines of modern-day conflict, commanders and other leaders rely on various knowledge management processes. These processes are fed by information management systems designed to assist commanders and staff by providing a structure for them to process and communicate relevant information and make decisions.1 Despite these processes, unexplained disruptions in their flow have led to leadership and operational setbacks; these disruptions can be categorized as examples of the “ostrich complex.” The ostrich complex is defined as the disruption of a decision-maker’s knowledge management processes that results in a paralysis of active leadership or a state of inertia, with a distinct negative effect on the outcome of a specific operation. The complex therefore requires early identification and mitigation in order to prevent systematic failures.2 The actions of Maj. Gen. Alan Jones and the U.S. Army’s 106th Infantry Division during World War II’s Battle of the Bulge provide historical context for the ostrich complex in large-scale combat operations. An Operation Enduring Freedom drone incident in 2010 provides a variation of the complex as it relates to sensory overload and false data-induced confidence in multi-domain operations.

The use of ostrich-related themes and terminology in both professional and popular culture is tied primarily to the unscientific belief that ostriches, as big and powerful as they are, bury their heads in the sand in order to hide from perceived danger.3 On the contrary, ostriches bury their eggs in the sand and routinely lower their heads to check on them, giving the impression that their small heads have disappeared entirely.4 Scientific realities aside, the description has firmly embedded itself in the collective lexicon as a synonym for the deliberate or unexplained avoidance of one’s fears or perceptions. In the legal profession, it is known as the “ostrich instruction,” which refers to a defendant’s willful “blindness” to incriminating facts.5 In both the financial and health-care sectors, it is the ostrich effect, and in international relations, it is the ostrich doctrine. However, all variations across the disciplines are tied to the concept of avoidance and the individual or collective complexes built upon the foundation of fear.

Carl Gustav Jung posited that a “complex” is a system of interrelated, usually repressed, emotionally charged ideas, feelings, memories, and impulses that, if allowed avenues to vent, can disrupt the normal links in human consciousness; as a result, the intentions of the will are impeded or made impossible.6 Feelings of self-doubt, confusion, fear, ambition, willful ignorance, and concern over accountability are ever present in the human condition, and each can give rise to a complex. A commander and staff who experience the ostrich complex therefore become mentally burdened. Their ability to rapidly and accurately convey the meaning and the necessary level of information that helps the commander maintain situational understanding and update his or her visualization becomes paralyzed. This ostrich-type behavior will continue to the detriment of the unit unless the commander is able to fight through the emotional white noise and make a balanced decision.7

War is a fundamental, unchanging human endeavor that violently pits opposing forces against each other as a result of “fear, honor, and the pursuit of interest.”8 Within this construct, commanders and staffs of opposing forces play a high-stakes cognitive chess game in which each searches for an advantage that will enhance their own probability of success. Carl von Clausewitz posited that “if the mind is to emerge unscathed from this relentless struggle, two qualities are indispensable: coup d’oeil, or an intellect that even in the darkest hour retains some glimmerings of an inner light; and second, determination.”9 However, what happens when our processes are disrupted by the ostrich complex and the light fades?

Maj. Gen. Alan Jones was the commanding general of the U.S. Army’s 106th Infantry Division, nicknamed the Golden Lions, during World War II. Jones was highly regarded, with a distinguished record of long and venerable service stretching back to World War I. Despite his experience and service, however, it was his indecision and inaction during the Battle of Saint Vith that led to the disintegration of the 106th as an effective fighting force.

In December 1944, the 106th, freshly arrived in the European theater of operations, was ostensibly sent to a low-risk area of the front line. At the “ghost front,” as it was known, the main enemies were perceived to be the dreaded trench foot syndrome and the cold weather.10 The 106th’s area was in the Ardennes region of Belgium in the vicinity of the critical transportation hub at Saint Vith and not too far from the German border. However, what the 106th was unaware of as it occupied its foxholes was that Saint Vith would be the focal point for a planned German offensive.

On 16 December 1944, the dynamic and intense launch of the German offensive transformed the ghost front into a frenzied battlespace in a matter of moments. From the very first artillery barrage, it became evident that Jones and his staff were already failing at the high-stakes cognition game. Unclear reports, loss of communications with frontline units, slow decision-making, and a seeming lethargy pervaded the 106th Division’s headquarters.11 Instead of seeking enhanced situational awareness in order to make relevant decisions, Jones sat in his command post and “waited for some word from his corps commander.”12 Later, two of the 106th’s regimental commanders, Col. George L. Descheneaux (422nd Regiment) and Col. Charles C. “Moe” Cavender (423rd Regiment), deliberated with each other after receiving numerous indecisive and inconclusive messages from their division headquarters. They decided that despite their tenuous situation, until division told them definitively to move, they were staying right where they were; the lethargy had spread.13 It was a fateful decision, because when the message to “withdraw from present positions if they became untenable” was finally received, it was far too late, and the regiments were overrun.14

Brig. Gen. Bruce Clarke’s unit was sent to assist the 106th, and when he arrived and observed the situation at the 106th Division headquarters, he provided a dire assessment. In Clarke’s opinion, not only was Jones not functioning in a clear and decisive manner, but his indecision had also affected the staff.15 In a later conversation, Jones told Clarke directly, “I’ve thrown in my last chips; you take over the defense of St. Vith.”16 In hindsight, this was Jones’s most decisive action throughout the battle. Clarke took over the battle and was able to effectively manage the chaos and salvage a perilous situation.

Jones’s actions during the Battle of Saint Vith can be attributed to the ostrich complex. His erratic behavior and overall lack of active leadership ran counter to his previously highly rated performance in command and as a staff officer. Throughout the battle, Jones was seemingly preoccupied by the realization that his first real fight as a division commander had resulted in the loss of two regiments and possibly of his own son, who was serving with one of them. His behavior further indicated that conflicting data and fear of repercussions swirled in his consciousness and forced him to psychologically retreat. The ostrich complex also paralyzed his staff and subordinates and led to a collective loss of mission focus and operational initiative, and ultimately to unnecessary loss of life. As a result of the 106th’s chaotic actions (or inaction) during the battle and its subsequent disintegration as a cohesive fighting force, Jones and numerous members of his command team and staff left the Battle of Saint Vith in ignominy.

In the modern-day multi-domain battlespace, there has been an exponential increase in the information available to commanders compared to the battlefields of World War II.17 A virtual torrent of data, gathered from a plethora of sensors, now feeds nonstop into various information management systems intended to enhance situational awareness.18 This proliferation of mass data has served to paralyze commanders on both sides of the leadership coin.19 Too much data is itself a danger, but paradoxically, its existence also compels commanders and staffs to seek out still more data in order to enhance their visualization and battlespace management.20 This constant search for enhanced situational awareness leads to leadership paralysis as a consequence of simply having too many choices.21 To prevent cognitive overload, effective network management is therefore key to filtering the torrent of raw data into a steady stream of manageable information.
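As an illustrative analogy only, the minimal sketch below (in Python, using hypothetical report fields such as confidence and priority rather than any fielded system) shows the idea of filtering and ranking a torrent of raw sensor reports into a short, manageable list before it reaches a decision-maker.

```python
from dataclasses import dataclass

@dataclass
class SensorReport:
    source: str        # hypothetical sensor or unit providing the report
    confidence: float  # assessed reliability, 0.0 to 1.0
    priority: int      # urgency, 1 (routine) to 5 (critical)
    summary: str

def filter_reports(reports, min_confidence=0.6, max_items=5):
    """Discard low-confidence reports, then keep only the most urgent few."""
    credible = [r for r in reports if r.confidence >= min_confidence]
    credible.sort(key=lambda r: (r.priority, r.confidence), reverse=True)
    return credible[:max_items]

raw = [
    SensorReport("uav-1", 0.9, 5, "Vehicles massing near the crossroads"),
    SensorReport("sigint", 0.4, 3, "Unverified radio chatter"),
    SensorReport("patrol", 0.8, 2, "Route to checkpoint reported clear"),
]
# Only the credible, highest-priority reports reach the commander.
for report in filter_reports(raw):
    print(report.source, "-", report.summary)
```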

On 21 February 2010, during Operation Enduring Freedom, a seemingly routine cordon and search mission, involving multiple sensors and weapons systems and supported by personnel across continents, unfolded in Uruzgan Province, Afghanistan.

An investigation launched in the aftermath of the incident outlined the following in its official report:

On 21 February 2010, up to 23 local Afghan nationals were killed and 12 others injured when the convoy they were travelling in was mistaken for an insurgent force and engaged with air to ground fire … initial observations appeared to indicate a threat force. The ODA commander on the ground displayed tactical patience in letting the situation develop over several hours before the engagement. The time brought by that patience was however wasted because of the Predator crew’s inaccurate reporting and the failure of both command posts to properly analyze the situation and provide control, insights, analysis or options to the ODA commander … The tragic loss of life was further compounded by a failure of the commands involved to timely report the incident.22

The Predator flight crew reportedly ignored or downplayed information indicating that the convoy was anything other than an attacking force.23 However, the information provided was supposed to have been vetted through multiple knowledge management systems at other headquarters, where commanders were supposed to complete a long checklist before authorizing an attack.24 In this instance, the false confidence generated by overreliance on the various sensors and systems, compounded by the commanders’ own complexes and biases, produced false situational awareness. This false positive facilitated a further example of the ostrich complex, in which the commanders’ “misperception and misinterpretation of the data” caused a paralysis of leadership and led to the unfortunate loss of life.25

In the modern-day battlespace, the art of command requires leaders to acknowledge and manage greater expectations in exercising authority and to accept greater responsibility for their organizations.26 With that greater expectation and authority comes an ever-increasing torrent of data, gathered from an ever-growing number of sensors.27 Various knowledge management processes are designed to assist commanders and staff by providing them with an enhanced cognitive and situational advantage. However, the ostrich complex disrupts these processes, forcing commanders to retreat into their own consciousness and take a proverbial knee. Such a pause can be useful under stressful conditions, allowing the commander to check the “eggs” and seek clarity. But the complex must be quickly identified and mitigated in order to prevent commanders from burying their decisions further into the sand to the detriment of the mission.


Notes

 

  1. Army Doctrine Publication (ADP) 6-0, Mission Command: Command and Control of Army Forces (Washington, DC: U.S. Government Publishing Office, 31 July 2019), 3-8.
  2. Dictionary.com, s.v. “complex,” accessed 7 March 2020, https://www.dictionary.com/browse/complex.
  3. Garrett Hardin, The Ostrich Factor: Our Population Myopia (Oxford, UK: Oxford University Press, 1999), 1.
  4. Ada McVean, “Ostriches Do Not Really Stick Their Heads in the Sand,” Office for Science and Society, 20 August 2017, accessed 7 March 2020, https://www.mcgill.ca/oss/article/did-you-know/ostriches-do-not-really-stick-their-heads-sand.
  5. Ira Robbins, “The Ostrich Instruction: Deliberate Ignorance as a Criminal Mens Rea,” The Journal of Criminal Law and Criminology 81, no. 2 (Summer 1990): 191–234, accessed 16 March 2020, https://www.jstor.org/stable/1143906.
  6. Carl Jung, The Essential Jung: Selected Writings Introduced by Anthony Storr (Princeton, NJ: Princeton University Press, 2013), 38.
  7. ADP 6-0, Mission Command, 3-15.
  8. H. R. McMaster, “The Pipe Dream of Easy War,” New York Times (website), 20 July 2013, accessed 11 January 2020, https://www.nytimes.com/2013/07/21/opinion/sunday/the-pipe-dream-of-easy-war.html.
  9. Carl von Clausewitz, On War, ed. and trans. Michael Howard and Peter Paret (Princeton, NJ: Princeton University Press, 1984), 102.
  10. Charles Whiting, Death of a Division (New York: Stein and Day, 1980), 21.
  11. Ibid., 49.
  12. Ibid.
  13. Ibid., 75.
  14. Ibid., 76.
  15. Ibid., 78.
  16. Ibid., 80.
  17. Ajit Maan, Narrative Warfare (self-pub., CreateSpace, 2018), 9.
  18. Thom Shanker and Matt Richtel, “In New Military, Data Overload Can Be Deadly,” New York Times (website), 16 January 2011, accessed 7 March 2020, https://www.nytimes.com/2011/01/17/technology/17brain.html.
  19. Ibid.
  20. Amber Corrin, “Sensory Overload: Military is Dealing with a Data Deluge,” The Business of Federal Technology, 4 February 2010, accessed 7 March 2020, https://fcw.com/articles/2010/02/08/home-page-defense-military-sensors.aspx.
  21. Barry Schwartz, “The Paradox of Choice,” YouTube video, posted by “TED,” 5:38, 16 January 2007, accessed 3 August 2020, https://www.youtube.com/watch?v=VO6XEQIsCoM.
  22. Timothy P. McHale, “Executive Summary for AR 15-6 Investigation, 21 February 2010 CIVCAS Incident in Uruzgan Province” (Kandahar Airfield, Afghanistan: United States Forces-Afghanistan, 2010), accessed 7 March 2020, https://www.aclu.org/files/dronefoia/uruzgan/drone_uruzgan_attachtabA_part_01_FOIA_10-0218.pdf.
  23. Ibid.
  24. David Zucchino, “U.S. Report Faults Air Force Drone Crew, Ground Commanders in Afghan Civilian Deaths,” Los Angeles Times (website), 29 May 2010, accessed 7 March 2020, https://www.latimes.com/archives/la-xpm-2010-may-29-la-fg-afghan-drone-20100531-story.html.
  25. Craig Martin, “A Means-Methods Paradox and the Legality of Drone Strikes in Armed Conflict,” The International Journal of Human Rights 19, no. 2 (2015): 142–75, https://doi.org/10.1080/13642987.2014.998864.
  26. Thomas G. Bradbeer, “Lethal and Non-Lethal Fires: Historical Case Studies of Converging Cross-Domain Fires in Large-Scale Combat Operations,” Military Review 98, no. 5 (September-October 2018): 26–32, accessed 30 July 2020, https://www.armyupress.army.mil/Journals/Military-Review/English-Edition-Archives/September-October-2018/Bradbeer-Lethal-Nonlethal/.
  27. Thom Shanker and Matt Richtel, “In New Military, Data Overload Can Be Deadly,” New York Times (website), 16 January 2011, accessed 7 March 2020, https://www.nytimes.com/2011/01/17/technology/17brain.html.

 

Lt. Col. Kevron W. Henry, Jamaica Defence Force, was a student at the Command and General Staff College (CGSC), Fort Leavenworth, Kansas, academic year 2019/2020. He holds a BSc and an MSc from the University of the West Indies, Kingston, Jamaica; an MSc from the University of Leicester, United Kingdom; and an MMAS from the Command and General Staff College. His operational assignments have been primarily domestic and in the Caribbean region.

 


November-December 2020