Journal of Military Learning

The Reflective Military Practitioner

How Military Professionals Think in Action

Col. Christopher R. Paparone, PhD, U.S. Army, Retired
Col. George Reed, PhD, U.S. Army, Retired

We shall not cease from exploration
And the end of our exploring
Will be to arrive where we started
And know the place for the first time.

—"Little Gidding," T. S. Eliot

Previously published as “The Reflective Military Practitioner: How Military Professionals Think in Action” in Military Review 88, no. 2 (March-April 2008): 66–76.

Volatility, uncertainty, complexity, and ambiguity characterize the contemporary operational environment (COE), requiring military professionals to continuously reflect on the roles, norms, and values of their craft.1 An apparent accelerated rate of change in the security environment makes it increasingly difficult to predict national security opportunities and threats, and the skills and capabilities needed to address both.2 Operations Iraqi Freedom and Enduring Freedom have demonstrated the need for rapid change in tactics, techniques, and procedures and our overall approach to campaigning. They have proven that the more complex the COE, the more the body of professional military knowledge must remain in a state of purposeful instability.

One can define “professional knowledge” as information that members of the profession believe provides meaning and value in promoting understanding of how things work in their field.3 A profession constructs and shares its unique body of abstract knowledge through social processes. Over time, the existing body of knowledge and the ongoing socioprofessional processes that create and maintain it come to constitute paradigmatic thought, a model of effectiveness.4 As theorist Donald Schön has observed, the network of experts and organizational leaders and the clients they serve who accept this model believe the paradigm to be so unique that laymen can neither understand nor apply it.5

Don Snider of the U.S. Military Academy deserves credit for renewing interest in the notion of the Army as a professional institution. Snider rightly raises a number of questions about the state of the profession. In two editions of The Future of the Army Profession, Snider and his coauthors express concern over the degree to which bureaucratic hierarchy is supplanting professionalism.6 Through these edited works we are reacquainted with the essential elements of professions, specifically, that they are “exclusive occupational groups applying somewhat abstract knowledge to particular cases.”7 It is hard to overemphasize the importance of abstract knowledge to professions. Snider argues that healthy professions deliberately control and develop their bodies of knowledge to service their clients and to compete for dominance in a professional jurisdiction.

If the military were to lose society’s trust in its ability to apply its unique form of knowledge, or if it should fail to differentiate itself from other groups that provide similar services, it would also lose some of the autonomy granted to it as a profession. In one of the classic works on professions, Andrew Abbott calls abstract knowledge the “currency of competition between professions.”8 Snider confirms this when he says, “The coins of the professional realm are expertise and the knowledge underlying it.”9 Reflective practitioners and good stewards of professions encourage habits in themselves and subordinates that develop and improve a profession’s underlying body of knowledge. In this article we examine the means by which the Army develops, maintains, and judges its body of abstract professional knowledge. Our conclusion is that practitioners and good stewards of the profession apply what Schön describes as “reflective practice.”10

The military contributes to, and draws upon, several traditional repositories of professional knowledge, including doctrine, journals, magazines, published assessments, and various meetings and conferences. The advent of web-based knowledge forums and electronic mail has opened up both formal and informal collaborative opportunities. Robust interaction with peers, subordinates, and superiors engaged in training and operations, or in research and education, ensures the professional military body of knowledge remains in an ongoing state of flux and transformation.11

Yet, despite these visible signs of flux and transformation, few have written about how the knowledge process works. How is a professional body of knowledge transformed? How should professionals reflect on their knowledge? How should they judge the quality of the professional body of knowledge? What are the implications for the profession’s senior leaders and clients? Answers to these questions are important to military professionals and senior leaders, to research and education institutions, and to Congress in its oversight role.

How Professional Knowledge Is Transformed

Educational theorist David A. Kolb developed one of the most intuitively appealing theories of knowledge to assess students’ learning styles. Today, the U.S. Army Command and General Staff College uses his archetype to promote professional military education.12 Kolb’s “experiential” learning model presents a complex view of knowledge formation. Although Kolb developed his model to provide insights into how normal individuals learn from experience, his theory has clear application as a vehicle for thinking about professional knowledge development. His four-stage framework recapitulates how bodies of knowledge are continuously grasped and transformed.13 At various levels of internalization—from a tacit state of apprehension to a consciously knowing state of comprehension—knowledge transforms through active experimentation, concrete experience, reflective observation, and abstract conceptualization. The last phase constitutes a generalization of technique to be applied to future experience.

Kolb describes four forms of knowledge that appear at various stages in the process of professional knowledge formation and reformation: divergent, accommodative, convergent, and assimilative.14 Let us examine Kolb’s theory and consider how social processes contribute to changes in the professional body of knowledge over time.

Divergent knowledge. Divergent knowledge is gained from reflective observations of experiences by participants who come from an assortment of disciplines, professions, and occupations. They bring diverse roles, norms, and values together for a common interest, usually motivated by a shared realization that they face complex or chaotic situations where old knowledge is no longer sufficient.15 In some cases the situation confronted is so different and challenging and the existing perspective is so inadequate that it necessitates a new frame of reference and model of effectiveness—a paradigm shift.16 In this case, the eclectic participants are linked by their thirst for new knowledge, perceived by them as necessary for setting new conditions, perhaps for an emerging profession. They work to reconstruct reality by developing new, sometimes radical frames of reference.17

At this point, new professional roles, norms, and values are only loosely defined because learning categories and their interrelationships are exploratory. Informal groupings of like-minded leaders from varying backgrounds come together, all attempting to grapple with an indefinable state of knowing. For example, the Army’s Louisiana Maneuvers of 1941 may have been a critical rally point for a group of diverse thinkers who helped transform a cavalry-based Army into a motorized Army.18 The quality of professional relationships at this stage is important. Nondefensive interpersonal communications, shared trust, commitment, and enduring optimism are critical to offset the stress and anxiety associated with exploratory learning and the ever present risk of surprise and failure.19 During this period of formation, alternative professional viewpoints emerge.

Accommodative knowledge. Based on shared concrete experiences and active experimentation, accommodative knowledge emerges when newly forming professional networks begin to extend more intuitive kinds of knowledge into forms that entertain new assumptions and beliefs on a broader scale. Professionals begin the process of examining the otherwise unexaminable when they combine concrete experience with action research (i.e., dynamic experimentation).20 This activity requires flexibility of thought (e.g., temporarily suspending disbelief in other ways to frame or make sense of the COE) while accepting more unstructured and intangible ways of active inquiry (e.g., developing awareness about dealing with an active insurgency in Iraq when known technology does not seem to be effective).21 In this stage, active experimentation is vital to learning. As experience with highly complex and unique situations develops from experimentation and trial and error, a growing sense develops that existing technology is inadequate.

Convergent knowledge. Convergent knowledge is knowledge that coalesces as the emergent network begins to make sense of the world in a collective way and passes this knowledge to other members. Thus, highly abstract concepts transform into realizable knowledge goals and objectives that can be institutionalized as technical comprehension.22 Institutional performance depends on this more understandable and evaluated professional knowledge about cause-and-effect relationships. The institution begins to formulate rules and structure to gain control over the growing body of knowledge so that convergent knowledge can be more efficiently shared. New specialist categories form or old ones renew.23 For example, the Army developed its Special Forces (SF) around divergent knowledge about fighting proxy wars in the 1950s, but it did not consider SF worthy of a separate branch until thirty years later.24 Case studies, readings in theory, and time to reflect on one’s current context and recent activity are helpful to test convergent knowledge in education and research endeavors.

A negative aspect of convergent knowledge is that the uncritical or naïve practitioner may help perpetuate a “cultural myth” as dogma rather than facilitate self-correction of the professional body of knowledge.25 Continuous professional reflection and application of good habits in critical thinking help members sustain the body of knowledge. They also help the profession’s societal clients make sense of a rapidly changing environment.

Professionals understand that convergent knowledge is a temporary state and work to prevent the body of knowledge from becoming stagnant, blinding all concerned from a more insightful future construction of reality that is always around the corner. U.S. Joint Forces Command “pre-doctrinal” pamphlets and Army interim field manuals are examples of convergent knowledge that extends beyond a shared sense of apprehension and emerges as a more interpretable, shared comprehension.26

Assimilative knowledge. We see assimilative knowledge when it is transformed into institutionalized technology; for example, in the form of records, rules, doctrine, textbooks, approved lessons learned, programs of instruction, and other structures that begin to modify roles, norms, and values within the community.27 In the military’s case, tasks, conditions, and standards of work technology become routinized; they are enforced by the profession and, eventually, by the institution’s bureaucratic hierarchy and rule structure.28 The irony here is that an inherent inertia develops. An institution often overvalues the overt qualities of assimilative knowledge and creates bureaucratic or mechanistic structures that stifle innovation, thereby crippling professional progress. Aspects of more intuitive divergent and accommodative knowledge explorations go orphaned.29

Overly structured training, hierarchically supervised professional military educational programs, extensive procedural rules designed to standardize job performance, and other strictures can create an intractable situation, a Procrustean bed that bars divergent and accommodative knowledge from the field and leads to the dismissal of research outcomes. Programmed knowledge appeals to senior managers because of perceived certainty derived from institutionalized metrics frequently associated with technology. Routine and habit are the hallmarks of technocratic bureaucracies. Such comfortable standardization possesses an attraction that devalues divergent alternatives.

There is a way to address this propensity to engineer assimilative knowledge. Professionals should avoid scientizing and reifying assimilative knowledge at inappropriate levels of discourse.30 When reification occurs, “the way things get done around here” becomes “the only way to do things around here,” resulting in a serious obstacle to knowledge production.31 To put it still another way, professionals must be cautious not to take for granted this seemingly settled body of knowledge about technical cause-and-effect relationships. As they practice the profession, they should continuously uncover and question the unseen underlying apprehension that still exists from the divergent stage and take action to confirm or change their apparent technical comprehension. As implied by the title of this article, this continuous professional inquiry is called reflection-in-action.32

Reflecting on Professional Knowledge

Effective professionals realize that assimilative knowledge can be the most difficult to challenge because its meaning and use can appear so rational as to be technically unquestionable. Overcoming what amounts to a myopic belief in assimilative knowledge is even more difficult because intuitive logic (the hallmark of accommodative and divergent knowledge forms) can be nearly impossible to articulate.33 According to Schön, the apparent validity and infallibility of technical rationality constitute a “competency trap” in which unquestioned belief creates less effective professionals who become the “self-serving elite who use science-based technique” as their “masquerade of extraordinary knowledge.”34 Technical rationality is a perspective that assumes complete knowledge of cause-and-effect relationships based in principles originally derived from Cartesian philosophy.35 This sense of “rationality” errs by applying Newtonian scientific method to abstractions; in essence, it shoehorns the discourses of physical science into the understanding of conceptual mental processes. George Bernard Shaw once defined this trap as a dangerous façade created by the use of assimilative jargon, a phenomenon he described as a “conspiracy against the laity.”36 For Schön, the cure for unquestioned belief in technical rationality is professional reflection-in-action that is “central to the ‘art’ by which practitioners sometimes deal well with situations of uncertainty, instability, and value-conflict.”37 In addition,

a practitioner’s reflection can serve as a corrective to overlearning. Through reflection, he can surface and criticize the tacit understandings that have grown up around the repetitive experiences of a specialized practice, and can make new sense of the situations of uncertainty or uniqueness, which he may allow himself to experience.38

Schön makes a strong case that technical rationality can dominate professions to the point that members lose track of the interdependent complex interactions that make each case unique. Professionals become

locked into a view of themselves as technical experts, [and they] find nothing in the world of practice to occasion reflection. They have become too skillful at techniques of selective inattention, junk categories, and situational control techniques, which they use to preserve constancy of their knowledge-in-practice. For them, uncertainty is a threat; its admission a sign of weakness. Others, more inclined toward and adept at reflection-in-action, nevertheless feel profoundly uneasy because they cannot say what they know how to do, cannot justify its quality or rigor.39

Note the ironic turn in Schön’s last sentence, where he suggests a requirement to accept uncertainty while recognizing the call for quality and rigor. Schön speaks to this tendency toward dogmatic simplification as follows:

When [the professional] is confronted with demands that seem incompatible or inconsistent, [he] may respond by reflecting on the appreciations which he and others have brought to the situation. Conscious of a dilemma, he may attribute it to the way in which he has set the problem, or even the way in which he has framed his role. He may then find a way of integrating or choosing among the values at stake in the situation.40

The complexity of the COE makes each situation contextually unique. Hence, true professionals have to reflect on what the profession may otherwise take for granted and understand how to challenge assumptions. This happens naturally when one sees assimilative knowledge as ineffective; then, the more intuitive divergent knowledge process gains value. In these cases, professionals become researchers-in-action, as professional learning becomes a complex process of adaptation in the midst of epistemic paradox.41 To Kolb, real professionalism involves considering the value of all types of knowledge simultaneously, no matter how contradictory they seem.42

The professional who reflects-in-action pays attention to, and acts on, the environment through paradoxical use of divergent, accommodative, and convergent forms of knowledge, especially when assimilative knowledge does not seem to be working. In that regard, stewards of the profession want the profession’s field practitioners and de facto researchers to be able to challenge role assumptions, normative beliefs, and established values in order to determine their relevancy for the reality they are facing. This challenge demands a soft heuristic (rule-of-thumb) process rather than a hard scientific one, since the quality or aptness of a body of knowledge cannot be deduced with the empirical certainty promised by Newtonian science and its Cartesian philosophical roots. Professional judgment requires the challenging of assumptions, even those behind the paradigmatic Westernized scientific view. It necessitates a philosophical perspective that embraces the possibility of divergence rather than an ideological perspective that seems to enshrine assimilative knowledge as objective certainty.43

In that regard, we see the purpose of officer professional development as not only teaching convergent and assimilative knowledge forms, but also creating opportunities for exploring and practicing judgment on divergent and accommodative knowledge.44 Additionally, we propose that military doctrine should reorient the professional community more on collaborative inquiry and collective judgment and lessen dependence on the convenient mythology of accepted technique or “best practices” passed down by authority with the stamp of “science” on them. Relying on the dogma of received wisdom founded on closed epistemic evaluations ultimately could serve to deprofessionalize the military through chauvinism.45

Assessing the Body of Knowledge

In a process that parallels reflection-in-action, professionals ideally judge and make sense of knowledge across a spectrum ranging from an unquestioned belief in the certainty of assimilative wisdom to a radical, divergent form of skepticism (see figure).46 Professionals appreciate and judge expert knowledge by acting all along the spectrum. At its best, in a process that entails paradoxical thinking while acting, a judgment appreciates opposing perspectives simultaneously.47


Figure. The Continuum of Judging Knowledge Involves Paradoxical Thinking

Professionals and stewards of the profession recognize that practicing the art of professional reflection-in-action is less risky in genuinely collaborative situations where learning is more valued than knowing.48 In hierarchical organizations, on the other hand, especially during crises, the pressure to conform to a professionally acceptable body of technical knowledge can be tremendous—we tend to value those who have the temerity to resist such pressures, but only if they are right.49 In that regard, Aaron B. Wildavsky’s concept of “speaking truth to power” can be one of the most heroic things professionals do.50 The profession should consider as courageous those who speak such truth to those in authority who are not receptive. It should judge as virtuous senior officials who allow and encourage the naked truth to be spoken freely to them.

Successful collaboration in a professional network across the stages of knowledge requires participants to appreciate existing opinions and arguments while striving to understand and appreciate new ones. This can be a challenge when those proposing the new approach have not yet developed sufficient language to fully describe what they are intuiting. Effective collaborative professional communities seek educated, well-thought-out judgments. They are skeptical of dogma characterized by unchallenged and unsubstantiated beliefs and equally suspicious of extreme doubting that bears no possibility of closure. Paradoxically, a professional social system supports both common and uncommon inquiry because they are the lifeblood of the profession’s body of knowledge, facilitating its accumulation and maintenance. Professionals should freely admit that they are unable to judge what they have not yet learned. Socratic wisdom rests on the admission that one does not know when and how the opportunity for learning will arise. The task of collaboratively shaping social interrelationships is anchored in the professional’s shared passion for knowledge—revealed in the sociological theory of roles, norms, and values.51 As repositories of knowledge, human beings (including professionals) develop roles, norms, and values as forms of knowledge through a socially constructed process.52

Roles. Roles are the most visible aspect of this social construction. They are standardized patterns describing the behavior required of all persons playing a given part in society. Roles can differentiate one organizational position from another. A role reflects the recurring actions of the individual playing it. It is appropriately interrelated with the repetitive activities of others so as to yield generally predictable outcomes. When individual roles are combined, people create a “social system” or “subsystem.” In the case of the military, role-playing is ubiquitous. Titles like commander, staff member, family support group leader, enlisted soldier, and staff college professor all represent visible, descriptive role categories.

Norms. Less visible social manifestations than roles, norms reflect the general expectations of role incumbents within a social system or subsystem. Norms imply or explicitly prescribe ethics that people interactively create and refer to in order to sanction behavior. As such, norms have a specific “ought” or “must” quality. Norms formally (through organizational procedures) or informally (through interpersonal relationships) shape the way roles are performed. Some examples we are familiar with include “commanders ought to be honest and fair”; “all officers are leaders”; “senior NCOs should speak for the enlisted population after getting to know them personally”; and “the military decision-making process (MDMP) is the best way to approach planning for U.S. Army full-spectrum operations.”

Values. The least visible of social manifestations, values are generalized ideological justifications for roles and norms. They express aspirations that inform what is required for action.53 Values are more culturally rooted than roles and norms, and they serve as the often unseen, frequently tacit backdrop that drives criteria for making judgments about knowledge. Like roles and norms, values may be espoused—stated deliberately and formally by the institution. The U.S. Army’s “Soldier’s Creed,” for example, is a bluff declaration of the values the Army wants its members to internalize (“I will never quit. I will never leave a fallen comrade. I am disciplined, physically and mentally tough …”). On the other hand, values may be in use as cultural phenomena, passed from one generation to another as deeply hidden or tacit forms of assimilated knowledge.54 If the espoused values approximate or are equal to those in use, the profession can approach a state of social equilibrium among itself, the institution, and clients.


The U.S. Army Soldier’s Creed

Single- and double-loop learning. Harvard professor Chris Argyris refers to the process of sustaining assimilative knowledge, in which associated roles, norms, and values go unchallenged, as single-loop learning. In its worst form, the profession, institution, and clients all firmly believe that they will continue to be successful with the knowledge they have. Faith and certainty feed off each other in a continuous loop. Theoretically, in a more stable COE, this may be a successful strategy with which to judge knowledge (i.e., “it works, therefore why look for alternatives?”). However, this strategy is not considered viable in the midst of a perceived unstable COE with inherent fog and friction. As a remedy, Argyris describes double-loop learning, the ability to suspend deeply held beliefs, no matter how successful they have been, in order to value alternative forms of knowledge (what Kolb termed “accommodative and divergent forms of knowledge”).55

Defensive routines. Even when professionals and institutional leaders embrace double-loop learning as the preferred strategy for judging knowledge, defensive routines can inhibit the process.56 Defensive routines are emotional responses to alternative beliefs, values, and assumptions about assimilative knowledge, and they discourage all but single-loop learning.57 A few notable examples of defensive routines include

  • Irony of success, a form of single-loop learning in which a reinforcing cycle of persistence causes leaders to “bask in past successes” and increase their collaboration with those of like mind, rather than recognize the need for change.58 Psychologist Irving Janis called this like-mindedness and excessive desire for cohesion groupthink. According to Chamu Sundaramurthy and Marianne Lewis, groupthink is “a pattern of collective defenses aimed at denying or suppressing tensions”; it is associated with a shared comfortable feeling about known technology.59 Repeated success can help build huge egos and contribute to a situation in which admitting that one can learn is tantamount to admitting weakness. In this case, Argyris concluded through his clinical research that “it can be especially difficult for smart people to learn not because they have little to learn but because they have a lot invested in appearing not to need to.”60
  • Faulty attribution, a process that works two ways: by blaming failure on a mythical belief or a scapegoat, or by taking (wishful) credit for success in a way that inspires overconfidence. Both cases reduce incentives to question the real causes of good or bad performance.61 In U.S. Army culture, for example, there is a tendency to attribute success or failure to the technologies of leadership and/or training when there may, in fact, be alternative explanations.62 The Army has a similar problem with nonattribution of its official doctrine (a written source of technology), which is published without proper citation of the sources of knowledge.63
  • Threat rigidity, also known as “hunkering down” or entrenchment. This mindset occurs when already-formed beliefs are retained in the face of conflicting information or even impending failure. Denying or marginalizing such disconfirming information results in psychological inertia, which is often accompanied by escalating commitment to the failing course of action. Using outsiders to assess new information and being open to their findings can help override this type of defensive routine.64 For example, the Army should seek alternatives to assimilative knowledge beyond the readily available pantheon of retired military officers engaged in defense consulting work and those associated with what President Eisenhower dubbed “the military-industrial complex.”65 Such quasi-insiders bring valuable knowledge about the inner workings and culture of the military, but they may find it difficult to provide the outsider’s view that could be more useful in countering threat rigidity.
  • Excessive use of bureaucratic controls, which occurs when management overuses performance metrics, rules, and regulations that squelch professional knowledge adaptation and increase the probability of transactional-style leadership.66 Professional problems often call for non-routine solutions. Yet routine solutions are observable in many organizations’ excessive use of management-by-objectives-type performance evaluations as well as statistical controls found in popular concepts such as “reengineering,” “balanced scorecard,” “Lean,” and “Six Sigma.” Excessive administrative controls on the use of known technology stifle experimentation and innovation, and they inhibit learning essential to the production of divergent and accommodative knowledge.67
  • Myopic decision-making. When decisions are tied to an inflexible set of criteria or a set technology, the result is myopic decision-making. In this mindset, learning usually entails comparing the results of a single course of action against potentially factitious standards, thus fueling low-risk, single-loop learning while “discouraging more frame-breaking innovations and change.”68 One could argue that the MDMP espoused by U.S. Army doctrine falls into this category.69
  • Impression management. In this defensive routine, the individual or organization fixates on a facade of performance. (In the case of the military, this is often a facade of readiness.) This mode privileges form over function, overlooking substantive performance. Impression management distorts communications and intensifies information asymmetries among hierarchical levels of organization, thereby inhibiting effective decision-making and fueling suspicions.70 Such masquerading amounts to a technology of deception.

Implications for Senior Leaders and Clients

When senior officials of the institution are also active members of the profession, they should function as stewards. According to Webster’s Unabridged Dictionary, a steward is “one called upon to exercise responsible care over possessions (time, talent, and treasure) entrusted to him.” Stewards of a profession are intrinsically motivated to act in the best interests of their clients. In the case of the U.S. military, we might describe the ultimate client as the American people constitutionally represented by elected and appointed officials. Good stewardship entails not only accomplishing assigned missions but also propelling the entrusted profession to new heights by setting conditions for the forms of knowledge outlined above to work eclectically, simultaneously, and without encumbrance.71

By providing opportunities to experiment and fail, effective stewards set the conditions for high-quality collaborative inquiry into divergent knowledge. Accepting thoughtful, open, and honest feedback, they encourage and share a passion for creativity among professionals.72 They appreciate the uncertain nature of divergent knowledge and the need to curtail preemptive, hierarchical-style decision-making where it is not warranted. Stewards learn to defer to and encourage those professional knowledge explorers who have the potential to be the artful framers of a transformed paradigm.73 The steward’s role is to help set conditions for action research with other professionals in the absence of the clarity, accuracy, and precision so appealing to the technically rational mindset.74 Under the right conditions, the professional practice of action research will occur naturally in the field during strategy sessions, operations, training, and educational opportunities.75 Action research, we argue, is essential to all levels for adaptation and survival in the COE.

One way those in senior institutional positions can best steward the accumulation of professional knowledge is by providing sufficient resources for experimentation. We should not underestimate the challenges such a goal presents. In the military, justifying budgets for exploring divergent knowledge could be considered cost-prohibitive. Moreover, the planning, programming, budgeting, and execution process calls for predictions of clearly identified problems, milestones, and technical solutions.76 Good stewards are aware that the emergent knowledge professionals report can prompt institutional bureaucrats to converge or assimilate it, entrenching comforting myths while paying less attention to, or summarily dismissing, more divergent views.

Deciding too early on a course of action, whether in the MDMP, the Joint Capabilities Integration and Development System, or an acquisition system milestone approval process, exemplifies the impulse to converge knowledge too quickly. The cultural propensity to employ analytical decision-making at early stages of knowledge development may lead to premature closure on seemingly attractive solutions rather than allow accommodative knowledge to develop further. The wise steward resists the impulse to rush to cost-benefit analysis or ORSA-style decision-making while knowledge is still being explored.77 Effective stewards of the military profession facilitate multiple perspectives and invite nonmilitary sources to develop theories, based on emergent forms, that enhance double-loop learning. They also convince their political clients to resist the impulse to suppress and under-resource activities in the divergent and accommodative stages of professional knowledge development. The steward’s shaping task, then, is not only to encourage professional action research and consideration of alternatives, but also to reduce or eliminate defensive routines that might interfere with double-loop learning.78

In addition to dealing with systemic or culturally embedded defensive routines, the good steward of the profession ensures that a diversity of knowledge types is working simultaneously and that multiple perspectives are available. In short, the steward shapes conditions for critical evaluation of the profession’s corpus of expert knowledge.79

To recapitulate, the institutional conditions necessary to sustain the professional body of knowledge exist when—

  • Professional reflection is facilitated by valuing the processes that challenge assimilative knowledge (i.e., continuous truth seeking) and by embracing the inevitable conflict associated with truth seeking.
  • Professionals are encouraged to “speak truth to power” despite bureaucratic pressures to conform to a body of assimilative knowledge.
  • Double-loop learning and action research are institutionally valued processes whereby knowledge is created and reformed, and where the conditions are sometimes set for a complete paradigm shift.
  • Stewards of the profession set conditions for an institutional climate that enables patterned, sound judgments about the condition of divergent, accommodative, assimilative, and convergent professional knowledge.
  • Effective stewards help shape professional roles, norms, and values that set the conditions for all of the above.

Professional reflection-in-action requires free and open dialog, so that effective collaborative judgment across Kolb’s forms of knowledge can occur. Professionals who aspire to action-research practices should—

  • Advocate positions as forthrightly as possible, but do so in a way that encourages others to question them.
  • Ask for a better-supported argument whenever someone states a disagreeable position, or help the arguer better assess the position.
  • Use illustrative data and make lucid, cogent arguments when evaluating another person’s argument. Clearly articulated reason, rather than authority, should serve as the standard for assimilated knowledge.
  • Apologize if, in the process of professional discourse, you act in ways that appear to upset others. Assure them that this was not your intention (provided that is genuinely the case), and state your actual intent and the reasoning behind it.
  • Ask for the reasoning behind actions that you find upsetting, in order to understand the other’s intentions.80


The military profession’s health depends in no small part on the accumulation and maintenance of a specialized body of abstract knowledge. In this article we have argued that in a COE characterized by complex and rapid change, good habits of reflective practice are essential for adapting the professional body of knowledge effectively. Developing such habits requires an understanding of how the social processes of professional knowledge work, especially for stewards of the profession. Good stewards set the conditions for collaborative inquiry and appreciate Kolb’s four-part framework of knowledge.

Notes
    Epigraph. T. S. Eliot, “Little Gidding” in Four Quartets (New York: Mariner Books, 1968).
  1. Examples include the anthology projects of Don M. Snider and Gayle L. Watkins, The Future of the Army Profession, 1st and 2nd eds. (Boston: McGraw-Hill, 2002 and 2005); Samuel P. Huntington, The Soldier and the State: The Theory and Politics of Civil-Military Relations (Cambridge, MA: Belknap, 1957); Andrew Abbott, The System of Professions: An Essay on the Division of Expert Labor (Chicago: University of Chicago Press, 1988); and Andrew Brien, “Professional Ethics and the Culture of Trust,” Journal of Business Ethics 17, no. 4 (1998): 391–409.
  2. We say “apparent” because there is little reason to expect today’s COE to be any more complicated than it was, for example, in 1939, when Nazi Germany invaded Poland and the United States was ill-prepared for the coming world war. The same applies to the Korean and Vietnam wars and, for that matter, the Cold War. Nevertheless, with the advent and potential proliferation of nuclear weapons and other weapons of mass destruction, we think the world is at least more dangerous than it has ever been. For a discussion of the propensity of current generations to believe they inhabit the most turbulent environment, see Henry Mintzberg, The Rise and Fall of Strategic Planning: Reconceiving Roles for Planning, Plans, and Planners (New York: The Free Press, 1994), 203–9.
  3. Adapted from Jeffrey Pfeffer, Organizations and Organization Theory (Cambridge, MA: Ballinger, 1992), 227–28. We would add (and argue in this essay) that knowledge is also about the way things could or should work; hence, it can be unsettling.
  4. Thomas S. Kuhn, The Structure of Scientific Revolutions, 3rd ed. (Chicago: University of Chicago Press, 1996), 175. For Kuhn, a paradigm “stands for the entire constellation of beliefs, values, techniques, and so on, shared by members of a given community.”
  5. Donald A. Schön, The Reflective Practitioner: How Professionals Think in Action (New York: Basic Books, 1983).
  6. Snider and Watkins, The Future of the Army Profession.
  7. Andrew Abbott, The System of Professions (Chicago: University of Chicago Press, 1988).
  8. Ibid., 9.
  9. Snider and Watkins, The Future of the Army Profession, 13.
  10. Schön, The Reflective Practitioner.
  11. For example, witness the joint community’s use of predoctrinal publications and the U.S. Army’s use of interim field manuals, both of which signify the near-impossible attempt to keep up with learning as it occurs in the field or in the schoolhouses. Quoting Greek philosopher Heraclitus, Gareth Morgan provides this metaphor of flux and transformation: “You cannot step twice in the same river, for other waters are continuously flowing on,” Gareth Morgan, Images of Organization (Thousand Oaks, CA: Sage, 1997), 251.
  12. David A. Kolb, Experiential Learning: Experience as the Source of Learning and Development (Englewood Cliffs, NJ: Prentice-Hall, 1984). The U.S. Army Command and General Staff College (CGSC), Fort Leavenworth, Kansas, has used Kolb’s model and assessment instruments to inform its educational philosophy, faculty development programs, and curriculum. One of the authors recently attended a weeklong CGSC faculty development program in which Kolb’s theory was applied to the seminar-style classroom.
  13. This continuum is also explained by Kolb as being linked to the left and right hemispheres of the brain—the left associated with comprehension and the right with apprehension (Kolb, Experiential Learning, 46–49). According to Michael Polanyi, such tacit knowledge is “a way to know more than we can tell,” Polanyi, The Tacit Dimension (Garden City, NY: Doubleday, 1966), 18.
  14. Kolb, Experiential Learning, 121–31. Kolb describes the process of social learning as “living systems of inquiry” and even links these forms to various careers, professions, or occupations. For example, he cites research that demonstrates possible linkages between the science of engineering and convergent knowledge preferences (science-based professions). Chemistry is associated with assimilative knowledge preferences; historians and psychologists with divergent knowledge; and business people with accommodative knowledge structures (more contextual in nature).
  15. Karl E. Weick, Sensemaking in Organizations (Thousand Oaks, CA: Sage, 1995).
  16. This essentially recapitulates Kuhn’s thesis about how scientific revolutions come about.
  17. For a detailed account that confirms this divergent process, see Mitchell Waldrop’s Complexity: The Emerging Science at the Edge of Order and Chaos (New York: Touchstone, 1992). Waldrop tells the story of how scientists from diverse fields of study formed the Santa Fe Institute, which established complexity science as a legitimate field of study.
  18. Generals Omar Bradley, Mark Clark, Dwight D. Eisenhower, George Marshall, and George Patton were all present.
  19. Karl E. Weick and Kathleen Sutcliffe, Managing the Unexpected: Assuring High Performance in an Age of Complexity (San Francisco: Jossey-Bass, 2001).
  20. Some have investigated the analogy of how ant colonies learn—where a kind of “swarm intelligence” emerges. See John H. Holland, Emergence: From Chaos to Order (Boston: Addison-Wesley, 1998). The concept of action research was developed in the 1940s by the late MIT social psychology professor Kurt Lewin, who turned away from a best-practices approach to solving complex social problems in favor of a dynamic, real-time method of theorizing while practicing, resulting in continuous personal and organizational development. His ideas have been further developed by a host of students of social psychology and organization theory. We see prosecuting the full range/spectrum of military operations as a corollary to solving complex social problems; hence, we suggest that action research would be an effective professional military methodology. Variations include action science, cooperative inquiry, and interactive social science.
  21. Here, we incorporate an encompassing definition of technology, defined as: “all the knowledge, information, material resources, techniques, and procedures that a work unit uses to convert system inputs into outputs—that is to conduct work.” Rupert F. Chisholm, “Introducing Advanced Information Technology into Public Organizations,” Public Productivity Review 11, no. 4 (1988): 39–56. We would add the term “tactics” as well to round out the definition in military terms. This definition implies that technology is a pre-existing solution to a given problem and that “technical rationality” is the reasoned application of it.
  22. Kolb, Experiential Learning, 97.
  23. For example, the Army Green Berets emerged out of the Kennedy administration’s perceived need for “graduated response” in the midst of proxy and guerrilla warfare in the COE of the 1960s. Special Forces has since grown in stature and numbers, and, combined with other special operations forces, has become part of a new unified command (U.S. Special Operations Command, Tampa, Florida). Today, we see a resurgence of irregular warfare doctrine from the 1960s—a brushing off of old knowledge.
  24. The Army Special Forces branch was established in 1987, more than forty years after the U.S. Office of Strategic Services in World War II recognized the need and established the beginnings of the requisite specialized knowledge. U.S. Special Operations Command was established as a joint combatant command about the same time.
  25. Harrison M. Trice and Janice M. Beyer, “Studying Organizational Cultures Through Rites and Ceremonials,” Academy of Management Review 9, no. 4 (1984). Trice and Beyer define “cultural myth” as “a dramatic narrative of imagined events, usually used to explain origins or transformations of something. It is also an unquestioned belief about practical benefits of certain techniques and behaviors that is not supported by demonstrated facts” (655).
  26. U.S. Joint Forces Command predoctrinal publications are available online (accessed 12 September 2006). Army interim doctrine (Field Manual-Interim, or “FMI”) examples are available online (accessed 13 September 2006).
  27. Kolb calls this process “organizing information” (Experiential Learning, 96).
  28. For example, see Chairman of the Joint Chiefs of Staff Manual 3500.04C, Universal Joint Task List (Washington, DC: Department of Defense, 1 July 2002), where the Joint Staff has codified tasks, conditions, and standards for four levels of war to an amazing level of detail.
  29. On the other hand, one notion of success with assimilative knowledge comes from valuing bricolage, or emphasizing resilience by forming new ways to accomplish things through the creative use of existing knowledge. Paradoxically, the improvised use of assimilated knowledge can be quite creative and result in a new divergent-accommodative-convergent cycle of knowledge creation in itself. See Karl E. Weick, “Improvisation as a Mindset for Organizational Analysis,” Organization Science 9, no. 5 (October 1998): 543–55.
  30. Karl E. Weick, The Social Psychology of Organizations, 2nd ed. (New York: McGraw-Hill, 1979), 34. “Reification” means “to treat an abstract concept as if it referred to a thing.”
  31. First quote is from Terrence E. Deal and Alan A. Kennedy, Corporate Cultures: The Rites and Rituals of Corporate Life (Harmondsworth, UK: Penguin Books, 1982), 4. Second quote in this sentence is our rendition.
  32. Schön, The Reflective Practitioner.
  33. See note 21 for our more complete definition of technical rationality.
  34. Schön, The Reflective Practitioner, 50. The term “competency trap” was described in detail by James G. March in A Primer on Decision Making, How Decisions Happen (New York: The Free Press, 1994), 96–97. For March, the trap “reflects the ways in which improving capabilities with one rule, technology, strategy, or practice interferes with changing that rule, technology, strategy or practice to another that is potentially superior.” Henry Mintzberg has also written extensively on how Defense management “whiz kids” have used “the cover of technique to promote their own influence.” According to Mintzberg: “The age of management has become the age of the ‘quick fix.’ Call in your technocrats, throw a lot of technique at the problem, drown in hard data ... [and make sure you] resolve it quickly so that you can get on with the next problem.” See Henry Mintzberg, Mintzberg on Management: Inside Our Strange World of Organizations (New York: Free Press, 1989), 356–57.
  35. For an excellent history of technical rationality, see Mark R. Rutgers, “Be rational! But what does it mean? A history of the idea of rationality and its relation to management thought,” Journal of Management History 5, no. 1 (1999): 17–36. Rutgers described the emergence of the Cartesian paradigm as follows: “Rationality becomes associated with the question of ‘how can humanity control nature and society.’ From Sir Francis Bacon’s adage ‘knowledge is power,’ it is evident that rationality became associated with method, especially, scientific method. In this regard, rationalism served as the most extreme edict of this development. The ‘rationalist’ school of thought claimed that all knowledge is ultimately based on reason, and reason alone. Thus, Enlightenment philosophers not only gave a new meaning to rationality, but also provided it with a significant social credibility: society can be improved by applying (scientific) reason. They not only define the dominant modern comprehension of rationality, but actually induce an explicit strife for the rational organization of everyday social life.”
  36. George B. Shaw, The Doctor’s Dilemma (New York: Penguin, 1946). An extract of the book containing the quote was accessed from the International Journal of Epidemiology, Oxford University, 23 July 2006.
  37. Schön, The Reflective Practitioner, 50.
  38. Ibid., 61.
  39. Ibid., 69.
  40. Ibid., 68.
  41. Ibid.
  42. Kolb, in Experiential Learning (1984), spends his last chapter discussing lifelong learning and makes it a point to metaphorically stress integrative knowledge in terms of “one foot on the shore of the conventions of social knowledge and one foot in the canoe of an emergent future” (225). He further states that “knowledge is refined by viewing predicaments through the dialectically opposed lenses of the four basic knowledge structures and then acting sensibly” (226).
  43. Alfred Tarski, “The Semantic Conception of Truth and the Foundations of Semantics,” in Logic, Probability, and Epistemology: The Power of Semantics, ed. Sahotra Sarkar, vol. 3 (New York: Garland, 1996), 1–35. Metaphysics, according to Tarski, is inquiry by means other than deduction or empiricism (23). The subjective-based study of the social construction of reality may be a form of such other means of inquiry.
  44. Chris Argyris and Donald A. Schön, Theory in Practice: Increasing Professional Effectiveness (San Francisco, CA: Jossey-Bass, 1980), 149. We see this as the fundamental purpose of professional military education and we advocate small-group seminar dialog to help achieve it.
  45. In the footsteps of Argyris and Schön, we believe every situation is unique to the military professional during complex operations; hence, best practices are a myth fueled by a sense of comfort associated with a belief in technical rationality. This also brings into question the way we process operational and tactical lessons learned with the hope that we find solutions that are assumed to be generalizable for the next time (i.e., a Cartesian mentality). We strongly dispute this method. While reading about best practices might serve to inform the practitioner of what others are doing, there is no substitute for acting and learning (i.e., for action research).
  46. Stephen C. Pepper, World Hypotheses: Prolegomena to Systematic Philosophy and a Complete Survey of Metaphysics (Berkeley, CA: University of California, 1942), 44. Pepper calls professional knowledge “expert knowledge.” We developed this chart (figure) from the ideas in Pepper’s discussion.
  47. For our explanation of paradoxical thinking, see Christopher R. Paparone and James A. Crupi, “The Principles of War as Paradox,” Proceedings 132, no. 10/1,232 (October 2005): 39–44.
  48. This is where the U.S. Army may have a problem in that the “Be, Know, and Do” framework of leadership (Army Field Manual [FM] 6-22, Army Leadership) seems to overvalue assimilative knowledge and technical rationality and stresses associated “competencies.” Changing to a “Be, Learn, and Do” framework may more effectively demonstrate increased institutional valuation of the adaptive learning process. The word “competencies” invokes a sense of known knowledge. To recognize the continuous need to invent knowledge, we suggest that the Army’s competency framework would have to be changed to a professional reflection-in-action framework that would subsume competencies associated with assimilative knowledge and include divergent, accommodative, and convergent forms of knowledge.
  49. Karl E. Weick, “Drop Your Tools: An Allegory for Organizational Studies,” Administrative Science Quarterly 41 (1996): 301–13.
  50. Aaron B. Wildavsky, Speaking Truth to Power (New Brunswick, NJ: Transaction Publishers, 1987).
  51. Our discussion of the social construction of roles, norms, and values is derived primarily from Daniel Katz and Robert L. Kahn, The Social Psychology of Organizations, 2nd ed. (New York: John Wiley & Sons, 1978).
  52. Peter L. Berger and Thomas Luckmann, The Social Construction of Reality (New York: Anchor, 1966).
  53. Katz and Kahn, The Social Psychology of Organizations, 43.
  54. Edgar H. Schein, Organizational Culture and Leadership, 2nd ed. (San Francisco: Jossey-Bass, 1997), 19–21. The U.S. Army Warrior Ethos and the Soldier’s Creed are available online (accessed 4 August 2006).
  55. Chris Argyris, Strategy, Change, and Defensive Routines (Marshfield, MA: Pitman, 1985).
  56. Argyris and Schön, Theory in Practice.
  57. Chamu Sundaramurthy and Marianne Lewis, “Control and Collaboration: Paradoxes of Governance,” Academy of Management Review 28, no. 3 (2003): 397–415. The authors describe defensive routines that “denote cognitive, behavioral, and organizational responses that protect the ego, preventing actors from confronting the limits of current understandings and practices ... Studies depict cycles of self-fulfilling prophesies, downward spirals or strategic persistence due to the dysfunctional dynamics of defensive routines” (399).
  58. Ibid., 399.
  59. Ibid., 400.
  60. Chris Argyris, “Teaching Smart People How to Learn,” Harvard Business Review (May-June 1991): 99–109.
  61. Sundaramurthy and Lewis, “Control and Collaboration,” 403.
  62. See one of the author’s related assertions in earlier works: Christopher R. Paparone, “The Deconstruction of Army Leadership,” Military Review 134, no. 1 (2004): 2–10; “Is Hope the Only Method? The Army’s Organization and Management Identity Crisis,” Military Review, no. 3 (2003): 47–55; “Piercing the Corporate Veil: OE and Army Transformation,” Military Review 131, no. 2 (2001): 78–82.
  63. In addition, U.S. Army doctrine employs no system to cite specific references in its publications; hence the old adage, “There is no plagiarism in the Army.” It is difficult if not impossible to deconstruct Army doctrine and make professional counterarguments by establishing the original sources of the knowledge. On the other hand, the U.S. Marine Corps (USMC) does a commendable job of including detailed citations in its doctrine (see USMC Publication Number 6, Command and Control, from which, e.g., U.S. Army FM 6-0, Mission Command: Command and Control of Army Forces, arguably plagiarizes in large, non-attributed segments). The U.S. Army is not helping professional reflection if its principal body of knowledge remains a source of faulty attribution (in the case of Army doctrine, inadequate attribution).
    If the professional body of knowledge is in a continuous state of flux and transformation, which we have contended in this essay, then the profession has to maintain an “audit trail” of sources of learning; otherwise, we are unanchoring shared meaning, and the profession will begin to unravel.
  64. Sundaramurthy and Lewis, “Control and Collaboration,” 408. The authors also link the cure for this routine to promoting diversity and shared understandings, where professionals value both the trust and the conflict that are necessary for healthy collaboration.
  65. See the full text of Eisenhower’s 1960 speech online (accessed 15 July 2006).
  66. Marta B. Calas, “Deconstructing Charismatic Leadership: Re-reading the Weber from the Darker Side,” Leadership Quarterly 4 (1993): 305–28.
  67. Sundaramurthy and Lewis, “Control and Collaboration,” 404.
  68. Ibid., 405.
  69. Christopher R. Paparone, “U.S. Army Decision-making: Past, Present and Future,” Military Review 131, no. 4 (2001): 45–54.
  70. Ibid., 406.
  71. James H. Davis, David F. Schoorman, and Lex Donaldson, “Toward a Stewardship Theory of Management,” Academy of Management Review 22, no. 1 (1997): 20–47.
  72. Paraphrasing Weick, Sensemaking in Organizations, we include, in the definition of collaborative inquiry, the concept of sensemaking—a form of imagination characterized by using, modifying, rejecting, and creating new paradigms or mental models when dealing with situations of incoherency and disorderliness.
  73. Weick and Sutcliffe, Managing the Unexpected.
  74. Chris Argyris and Donald A. Schön, Organizational Learning: A Theory of Action Perspective (Reading, MA: Addison-Wesley, 1978). Unfortunately, the U.S. Army CGSC has professed the opposite view in adopting Richard Paul and Linda Elder, The Miniature Guide to Critical Thinking Concepts and Tools, 4th ed. (The Foundation for Critical Thinking, 2004). Paul and Elder claim there are “Universal Intellectual Standards” (i.e., clarity, accuracy, precision, relevance, depth, breadth, logic, significance, and fairness) that “must be applied to thinking” (emphasis added, 7). If one accepts Kolb’s typology of knowledge, these standards would be absurd, especially during the divergent and accommodative stages, where the opposites of these standards may reflect a more appropriate sense of reality.
  75. The latter, for example, includes the U.S. Army CGSC’s School of Advanced Military Studies (SAMS), which requires each student to publish original research in the form of a monograph.
  76. See, for example, Christopher R. Paparone, “If Planning is Everything, Maybe It’s Nothing: Why We Need to Deflate the ppb in PPBE,” Army Logistician, in press.
  77. For a treatise on this subject, which we consider a potentially dangerous practice, see Christopher R. Paparone and James A. Crupi, “Rubrics Cubed: Are We Prisoners of ORSA-style Decision-Making?” Defense Acquisition Review Journal (December 2005-March 2006): 420–55. There are always solutions looking for problems, and the impulse may be to grab them without realizing the problem has morphed and was never, or is no longer, connected to that solution. ORSA is the military’s acronym for operations research/systems analysis.
  78. For example, see the many remedies described by Sundaramurthy and Lewis, “Control and Collaboration.”
  79. We paraphrase from personal correspondence with Col. (Ret.) Don M. Snider, PhD, professor, U.S. Military Academy, West Point, New York, 9 April 2003.
  80. Argyris, Strategy, Change, and Defensive Routines, 258–59.


Col. Christopher R. Paparone, PhD, U.S. Army, retired, is an associate professor in the Army Command and General Staff College’s Department of Logistics and Resource Operations at Fort Lee, Virginia. He holds a BA from the University of South Florida; master’s degrees from the Florida Institute of Technology, the U.S. Naval War College, and the Army War College; and a PhD in public administration from Pennsylvania State University. On active duty he served in various command and staff positions in the continental United States, Panama, Saudi Arabia, Germany, and Bosnia.

Col. George E. Reed, PhD, U.S. Army, retired, is an associate professor at the University of San Diego’s School of Leadership and Education Sciences. He holds a BS from Central Missouri State University, an MFS from George Washington University, and a PhD from Saint Louis University. As a military police officer, Reed served in a variety of command and staff positions. In his final assignment on active duty, he was the director of Command and Leadership Studies at the U.S. Army War College. Reed popularized the concept of “toxic leadership” in an article of the same name in the July-August 2004 issue of Military Review.

October 2017