Category Archives: Risk

Risk management posts and discussion

Governance and the myth of the static

Here’s a word every college and high school student should learn: Governance. While it carries that authoritative “govern”, it needs to be disarmed and understood as the unstable farce it actually represents.

Much of my non-debating time is spent in a professional world known as “governance, risk and compliance.” While I try to limit my radiological exposure to the last term, the first and second comprise a lot of my interest and attention. To debaters who find epistemology and, in particular, meta-epistemology (which I define as the practice of creating meaning-and-interpretation production systems) interesting, this is a remarkably engaging place to work, and one most likely not listed on your career survey. To give you an aggressive explanation of governance and risk: governance is what we do, and what we get, when we try to model out a system based on our best estimates of how that system would work well and keep itself well maintained; risk is what you get, and have to deal with, when you ultimately fail at the former exercise. If you’re working for a Sandwich Artistry company, governance is the effort to figure out what procedures and policies ensure you make good sandwiches, don’t give your customers food poisoning, and don’t over-compensate in ways that cause the company to lose money and go out of business. Risk is the practice of dealing with how you might have gotten it wrong, either in what you did or in what you didn’t realize might happen.

Notice I’ve neglected that third term: compliance. That’s the world of busybodies with clipboards and checklists, who verify that a governance approach promoted into the realm of sovereign interpretation (e.g. it becomes a law or some sort of regulatory requirement) is being practiced according to the interpretation of the clipboard police. While these people are vital to the functioning of systems, they tend to be arbitrarians who don’t understand the very nature of their existence. To them, the law is. It always has been. If you’ve seen the movie Pleasantville, these are the black-and-white types who are terrified of the ambiguity and complexity of color. They require certainty. The belief in rules assures them that bad things will never, ever happen, just as long as all those deviant rule-breakers are punished and kept at bay. Compliance with rules is a very special thing, as it defines their sole purpose for collecting a paycheck. We subsequently find compliance professionals in the socially popular fields of speed enforcement, tax auditing and other pursuits that live within the mythology that the law is reality. While they might be dreadfully simple people, verging perhaps on embracing totalizing ideas that gladly eradicate difference and exterminate what they can’t quite understand, we need these simple individuals, appropriately deployed, to ensure that we architects of process haven’t made boneheaded assumptions that could crash the whole damn system.

Yet this presents a problem for us, especially for those of us who either deal with the creative act of governance construction, or work in the abstract “world of the gap” of systemic risk management. Professionally, we often struggle with our compliance peers as they take that which we constructed to be doxological truth (as if God passed the rules off to Moses and we are left to accept it without question). I’ll be the first to admit that many (most?) architects of process are guilty of inattention and distraction. Once something’s built, it’s no longer interesting. We need Ward Churchill’s “little Eichmanns” of compliance to monitor the heat of the engines we made, given the good chance that the whole damn thing will blow up if we didn’t get a little detail right. Or worse, reality changes on us (as it usually does). But who’s going to tell these nasty, anti-intellectual structuralists that the whole system has changed, let alone manage them? This is ugly business, indeed.

I first faced this “puzzle of the compliance structuralists” in 1999, when I was the head of service development for a mid-sized Latin American and Middle Eastern digital telecom startup. My boss, the chief operating officer of the company, would throw assignments at me that consisted of things like defining and constructing a new billing practice or a new network engineering practice out of thin air. Consulting with him on “strategic direction” (sort of a vision thing that you need to connect with to inform your approach), we’d make policies, procedures, standards, and other things that would construct the particular practice. Words became real.

A year after creating a billing practice, I encountered a problem. We’d hired a bunch of people from a former Regional Bell Operating Company (aka a phone monopoly) known then as US West (which became Qwest and, through the powers of other poststructural architects employed by capitalists in the realm of Hardt and Negri’s Capital, transcended to its current state of CenturyLink), and some of the mid-level managers were running in circles, confused and unable to do their jobs. It turned out that something in the policy and procedure documents I had written the year prior was causing them serious grief, something unanticipated and quite normal as a company moves through supernormal growth and pushes even the best models you could create at the time. I recall joining the meeting in our large conference room and encountering “Joe” and “Marcy”, who were both exhausted with stress. They explained to me that they had run into a dreadful problem: they were trying to carry out some activity, but it simply was impossible because the policy prohibited it. They were absolutely stumped.

I responded, “Well, it looks like we have to change the policy.” The reaction I got was akin to Moses saying, “Well crap, it looks like that particular commandment sucks. Let’s toss it out and write a new one.” There was an implied sovereign deity implicated in each of the codified policies, according to Joe and Marcy. To change the law, or even question it, was an act of heresy. (Note: For those playing the home version of Radical Realism, the application of the potential to the real, this is a reason we study the German philosopher Carl Schmitt in spite of all of his problems. Schmitt’s Political Theology, for instance, gives a remarkable accounting of how theological things we’d otherwise expect to be rational, such as governance and compliance processes, can be.)

I won’t go into the theories of why Joe and Marcy believed so faithfully in the “truth” of those policies (that’s an aspect of the theory I’m working on now), but I do want to share the realization from that conversation, as it unfortunately seems consistent across our various systems, practices and governance approaches. When I had the required humility to confess the failure of my best effort in constructing a particular aspect of the policy (specifically, a “policy control” I had engineered to keep some specific bad things we were worried about at the time from happening, and which subsequently prevented a process from evolving through stresses that temporarily pushed that control space), I discovered I had two colleagues who felt as if they had just seen the Wizard behind the screen. They glimpsed the fiction of governance, being told that this Sovereign Law they believed limited their very existence and practice was nothing more than a fiction that had become real. A mere hyperstition I got wrong.

Given my willing admission and confession that I blew it when my boss and I made that policy control, we quickly moved on and made the company better. But curiously, many controls and governance specifications we encounter in society are guarded by lesser creatures: incompetent policemen who know nothing of the originary fiction of the control’s half-assed narration. They’re the bureaucratic frauds who have assumed the mantle of a practice while knowing nothing of its original purpose, subsequently doxologizing the routine they inherited from their predecessors. Accidental movements become ritual: an incidental, accidental action constructed in response to a singular circumstance becomes theological doctrine, imposed with the power of Inquisitional Authority by those who utterly lack comprehension of the actual purpose of the initial need.

The conclusion I’d suggest is this: every governance artifact, every rule, law, code, bylaw or expectation, is a consequence of someone else’s past. It might have been useful to them in their negotiation of reality in their time, but there is absolutely no certainty that it matters to yours. In fact, it may kill you, or make you seriously sick. In this world, you can’t coast. You can’t defer your responsibility for questioning the reality you’re confronted with and doing your best to build a model that seems to help you survive it. Failing to think, and assuming you live in a static universe where prior experiences predict the future, only ensures you will have an exceptionally painful and quite possibly fatal experience in a universe indifferent to the general laziness and incompetence of universalizing humans.

Think, engage, model and adapt. And never, ever assume the map handed down to you by a prior generation will get you through life’s minefield.

listening to the subaltern: concluding the jan-feb LD topic

With only a few weeks remaining on the national high school debate topic, “Resolved: It is morally permissible for victims to use deadly force as a deliberate response to repeated domestic violence.”, I wanted to conclude with a couple of observations based on comments and questions we’ve encountered regarding the cases my team has shared with the debate circuit.

We’ve run several affirmatives, such as our appropriation of Derrida’s “The Beast and the Sovereign”, one of his last published works, engaging allegory, fable and other forms of storytelling to raise serious questions about the integrity of the social contract so many accept without question. We’ve engaged Levinas and Zizek in an examination of the site of ethical and moral construction, challenging the naive assumption that all individuals (or really any, for that matter) have access to universal notions for the pre-determination of the morality of a contemplated and premeditated act.

We’ve also run negative positions that, to confess, have been deconstructive rather than “truth testing” (a term we had to bend to have some chance of engaging the critical thinking of many who were misled in their education toward believing their metaphysics were unquestionable). By unconcealing and illuminating the centerings implicit in the resolution’s words “deliberate” and “permissible,” we brought attention to the potentially undesirable and problematic epistemological and cultural packages that travel along with these signifiers.

What is common to each of the cases is that they tend to challenge the ~framework (let’s refer to this framing/enframing with the tilde to distance it from the commonly used term “framework,” which refers to a type of debate argument often found in policy rounds between critical and policymaker paradigms). Returning to LD after more than twenty years, I was shocked to discover a proto-religion had taken hold of the event I had loved and excelled in during high school. A new metaphysic of “Value & Criterion” had attained imperial authority and commanded the interpretation of many a judge and debater, a doxology that emerged well after I had departed the mid-1980s high school LD scene. Some rather suspicious characters were found within this realm: Hobbes, Locke, Kant, Hume and a few others. While of certain historical philosophical interest, they commanded a presence within this frozen snowglobe of an event, delimiting what was legitimate and acceptable to the sovereign form.

What was most curious was the rigor of its structuralist approach; I often felt as if I were back in undergraduate music theory being told that “all composers use figured bass notation and then a formula of rules they depart from in order to compose their music”… a claim that’s quite difficult to reconcile with the existence of the music of Charles Ives, John Cage, Witold Lutoslawski and numerous other composers I personally identified with. Realizing the significant pedagogical and epistemological harm being inscribed upon our debate community, we worked through elements of resonance and dissonance to introduce arguments that, as Deleuze said of his treatment of his philosophical adversaries, they would recognize immediately as their own yet be horrified by when they realized the extent of the mutation.

I’m pretty sure, from some of the judge reactions we’ve had, that we’ve achieved this effect.

Subsequently, the general “theme” of our argumentation has been one where we’ve attempted to approximate a formalist construction (meaning “trying to make our argument look and feel somewhat similar to something the LD structuralists would recognize as ‘valid’ without utilizing their form”; think of the proto-Ripleys in Alien Resurrection as a not-but-becoming-abject form approach). We’ve also attempted to create substantial instability in the framing itself (think of the black cat that walks past twice in the first Matrix movie, or the warping and bending of fields under stress). Stressing words in the resolution, shifting the intensity of the resolutional world from visual to infrared or ultraviolet, or moving the ontoepistemological center of the ethical-interpretive event (e.g. where is the site of the reading of the act, by which it can be determined to be ethical or moral?) have been aspects of this ~framework stressing and bending, deterritorializing and reterritorializing. To those who have felt the worldview shift in the round, this is very much intended.

Behind this work across strata, there are a few aspects within the approach that merit clarification. I’ll address these around some of the recurring questions we’ve gotten:

Isn’t your case being parametric?
This was a puzzling question for me at first, as Jay can attest from my difficulty in explaining why it both is and isn’t. It is akin to Derrida’s move in Of Grammatology: rather than defending the written word against the claim that the spoken is more true and pure than the written, his analysis illustrates how both written and spoken word are troubled. Parametrics can only exist when one assumes there is a totalizing universal that can then be sliced down into a parametric. This assumption of a universalizable debate resolution goes unchallenged by most within the current structuralist LD paradigm, but it is by no means an assumption to hold unquestioned. To argue that there are no universals and no universal cases, however, is the easy and less-than-interesting answer. The more interesting discussion is the deconstructive analysis, which suggests instead that ALL LD cases are parametric. To run a Kant affirmative on the domestic abuse topic is to reject 2300 years of Western philosophical tradition, save for a very tiny portion of the Enlightenment epoch, and construct a normative “worldview” from this myopic framing. This parametrization is even more violent when one considers that we haven’t even included substantial bases of Indian, Chinese and other non-“Western” philosophical traditions (we enjoy debating Spivak and Sloterdijk for many reasons, one of which is the occasional reference to the Mahabharata and other epics).

When decentering from the epistemologies of certain dead white-male European traditions, it can be understood that a case based on Hobbes, Locke and Kant is exceptionally parametric. Our construction of ethics from the perspective of a viciously abused child is also parametric; in fact, one would suggest that from the perspective of Merleau-Ponty’s phenomenology and Heidegger’s ontology, we’re going to have great difficulty constructing any reading of the topic through a case that is not parametric. Whether we’re intentioning through the construction of a young child or a dead philosopher, both texts attain precise phenomenological and epistemological coordinates, narrowing the interpretation through a parametric lens. Rather than claim the illegitimacy of the pervasive and nonunique parametric, the more intelligent question becomes one of comparative coordinates.

I’d briefly note that we could even have some etymological fun in examining the baggage that comes along with the word “parametric”, para: along-side, beside, beyond-or-past (paraphysics), by extension; metric: pertaining to a meter, measured. Of a certain ghost-like presence that isn’t before us, but a specter stepping along-side us in a haunting, taunting manner, applying measurement and scrutiny. When in the company of such judgmental spirits, I’m certain many a devious LD structuralist would rather avoid having their imperialist baggage inspected, their sins accounted.

You’re breaking rules by making us defend the entire resolution
This was a recent and most unfortunate interpretation. One of the core components of counterplan theory in contemporary argumentation and debate theory is the concept of a PIC, or plan-inclusive counterplan. In this argumentative analysis, a word of the resolution is tested in order to stress it and force its advocacy or defense. This is a vital test in both debate and poststructuralism; in the former, a problematic word may be covered over and its debate avoided, with assumptions generally accepted by the debaters often on both sides. In the latter, it’s often the grounds for exploring problematic centerings, biases and cultural-epistemological skews.

Consider a real-world application: a new policy is proposed by the ruling demographic that makes clear sense to those intending it to do good. But due to a difference in cultural interpretation, a word in the policy may be interpreted differently and have devastating results when it is implemented upon a broader demographic. Would we not want to test certain words, especially when those words have a history of leading to systemic discrimination, exploitation, and even subjugation, violence and genocide? This becomes even more vital when we’re discussing norms; the cultural interpretation of a word used in the construction of a social value can hardly be considered universal, as the historical construction of a word’s meaning is intricately connected to the historico-social production that shaped its rough, resonant form. Words like “permissibility,” for instance, may seem neutral to dominant majorities, but when encountered by a minority they are consistently associated with a sovereign who “insists on being the one who gets to permit.” Permissibility, obtaining a permit, isn’t a conscious concern for those who have their papers in order and have paid the sovereign’s fee. Words like “permissible” and “deliberate” are far from neutral and demand careful questioning, particularly when issues of ethics and morality are our concern.

Once again, the more interesting exploration of the “rule breaking” question lies not in a linear defense against the charge, but in how the charge is impotent while also pertaining to every case debated. As each affirmative constructs an ontoepistemological world (a world conceptualized from a specific ontological coordinate within a framed manifold of epistemological potentiality), each world is an advocacy of a “truth” (or a “meaningfully explanatory and predictive resonance of sufficient signal and limited noise”). Each word in the resolution is traced in a highly specific way, either intentionally or not, enveloping certain signifieds within its border while willfully leaving other signifieds out. Consider the resolution:

Resolved: All human beings should be protected by the state from death.

Should an affirmative be questioned on their advocacy of this resolution if their trace of the “human beings” signifier intentionally leaves out “those deemed not human”? Daniel Goldhagen’s important 1996 work Hitler’s Willing Executioners: Ordinary Germans and the Holocaust addresses this very tracing, suggesting that the definition of Jewish individuals as “rats, vermin, not humans,” as evidenced throughout generations of anti-Semitic German literature, may have profoundly contributed to the societal “moral permissibility” of allowing millions of Jews, Roma, and other peoples to be sent to their deaths. Is requiring the Affirmative to defend the evidence of a problematic trace fair grounds? Or rule breaking? And if it’s the latter, to quote a remarkable debate coach and friend, Dana Christensen, wouldn’t we have an obligation to challenge and break these rules? I’d suggest there may be a more important debate regarding the moral obligations underpinning the framing of debate rules if that were indeed the case.

Aren’t you cheating by not debating the resolution?
This infrequent question has perhaps been the source of greater disappointment than the rest. Hearing a few judges (not the majority, fortunately) repeatedly claim that “I think you’re doing shady things by debating these cases that really aren’t about the resolution” shows the failure of our academic project in debate more than anything. Consider that on the topic of women being domestically abused, raped and murdered, some in our LD world would immediately run to a handful of very dead, old white guys to serve as authorities over what is and isn’t moral. This observation was the very grounds for one of our cases, as we are quite certain that the first thing that pops into the mind of a woman or child about to be violently beaten and even possibly killed, is not: “What would Immanuel Kant say?”

Doesn’t this racing for the “Good Book” of the Enlightenment when faced with a contemporary crisis suggest the slightest bit of concern? Are we really that disconnected and insensitive to an other’s plight? Is this a snow globe of privilege and pedigree? If we have vital projects like Women Under Siege and remarkable artistic, poetic expressions from subaltern women through movements such as RAWA in Afghanistan, how could one feel comfortable silencing these voices in our experience? Should LD become a “Dead White European Males Only” zone?

I certainly appreciate the remarkable contribution this narrow pedigree of thinkers provided for us; Deleuze, Spivak and Derrida (three of my favorite thinkers) repeatedly acknowledge their debts to that work yet, remarkably, continue the evolution of thought forward. The real statement made in that question is not an accusation but rather a defense: Why are you encouraging thinking in our debate experience when I was told that rote skills, memorization, and drills were all that was needed to succeed? Why are you intending to threaten my legitimacy by rendering me obsolete?

To this final charge, I am indeed guilty. As one aligned profoundly with Rancière’s pedagogical project, and working intimately in the world of systemic risk, I am firmly convinced that unless we advance our systems of learning to encourage and engage our young people with critical and creative thought, there is sufficient reason to doubt humanity’s long-term survival. The problems we face today are ones that memorization and rote drills cannot solve, nor can “depth over breadth” approaches to a siloed education. Inter- and multidisciplinary approaches, crossings of borders, infusions of poetry and aesthetics into science and literature, and continued engagement with and challenges to assumed frameworks are necessary for future generations to have a chance at resolving the problems that contemporary thought cannot reconcile.

On Kittler and the Autopoietic Integration of Identity Data into the Post-Foucault Assemblage Archive

Jamie Saker
European Graduate School, June 2011
Creative Commons License
On Kittler and the Autopoietic Integration of Identity Data into the Post-Foucault Assemblage Archive by James R. Saker Jr. is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.

With the emergence and acceleration of second-generation “Assemblage Archives” (heterogeneous, second-order databases of identity, constructed through the linkage and integration of first-order, homogeneous collections of individual behavior), the development and evolution of extrinsic and/or intrinsic normative controls at the second-order level appears to exceed the capacity for private and public control.

In his work Gramophone, Film, Typewriter, media theorist Friedrich Kittler writes of the connection of the emerging digital data sets to the archive, which, theorist and historian Michel Foucault had substantiated, provides the source of power:

History was the homogenous field which, as a subject in school curricula, included only cultures with written language. Mouths and graphisms dropped out into prehistory. Otherwise events and their stories could not have been connected. The commands and judgments, the announcements and prescriptions that gave rise to mountains of corpses – military and juridical, religious and medical – all went through the same channel that held the monopoly on the descriptions of these mountains of corpses. That is why anything that ever happened ended up in libraries. And Foucault, the last historian or the first archeologist, had only to look it up. The suspicion that all power comes from archives to which it returns could be brilliantly illustrated, at least within the legal, medical, and theological fields.
(Friedrich Kittler, “Gramophone, Film, Typewriter,” trans. Dorothea von Mücke and Philippe L. Similon, October, Vol. 41 (Summer 1987), pp. 101–118.)

In the two decades following Kittler’s analysis and its connection to the archive, the realm of digital commerce and social engagement, particularly but not exclusively as constructed on the Internet, has seen first-generation systems arise in correspondence with the nexus of social engagement. Such assemblages of digital history tend to center on the individual’s engagement with specific, and subsequently local, regions of social experience: driving histories recorded with the Department of Motor Vehicles, merchant purchases captured at the point-of-sale terminal, course and grade transcripts archived at the school and university.

Each first-generation digital archive constructed its capabilities, practices, processes and norms through the initial closures that defined systemic control of the archive, and through the subsequent emergence of capabilities, processes, norms and other behaviors that followed from the archive’s intrinsic and extrinsic engagement with social, political and economic actors.

In the second major generation of archive construction, entities including Google, Facebook, Twitter and others have shifted from the development of homogeneous archives centered on a locality of social experience toward the creation of second-order archives, constructed typically through the linkage of social locales through the commonality of the individual. As Heinz von Foerster identifies in his 1993 lecture, this integration of first-order systems raises the question of the rules of integration for the second-order archive:

I have a System A, I have a System B, and now I’d like to integrate both of these into a System C. What do the rules consist of that allow a new System C to arise: rules of integration, of composition?
(Heinz von Foerster, “For Niklas Luhmann: How Recursive is Communication?”, lecture given at the Author’s Colloquium in honor of Niklas Luhmann on February 5, 1993, at the Center for Interdisciplinary Research, Bielefeld. The German version was published in Teoria Soziobiologica, 2/93, Franco Angeli, Milan, pp. 61–88 (1993).)
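Foerster’s question can be made concrete with a minimal sketch (all data, names and fields below are hypothetical, invented purely for illustration): two first-order archives that share nothing but the individual as a common key can be integrated into a second-order “System C” with a trivial rule of composition, a join on the person.

```python
# Hypothetical first-order archives, each local to one field of practice.
dmv_records = {  # "System A": driving histories
    "alice": {"license_class": "C", "violations": 2},
}
merchant_records = {  # "System B": point-of-sale purchases
    "alice": {"purchases": ["fuel", "groceries"]},
    "bob": {"purchases": ["books"]},
}

def integrate(system_a, system_b):
    """Build a second-order "System C" by linking first-order archives
    through the commonality of the individual (a join on the person key)."""
    assemblage = {}
    for person in system_a.keys() | system_b.keys():  # union of all individuals
        assemblage[person] = {
            "driving": system_a.get(person),   # None if absent from System A
            "commerce": system_b.get(person),  # None if absent from System B
        }
    return assemblage

system_c = integrate(dmv_records, merchant_records)
```

The structural point of the sketch: the rule of integration is trivial to write, while the normative rules governing the resulting System C are specified nowhere within either first-order practice, which is precisely the gap this essay identifies.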

According to German systems theorist Niklas Luhmann’s theory of autopoietic closure and control, these first-order archives described by Kittler realized regulations, norms, practices and processes through their engagement with the actors and participants of the homogeneous practice. Actors within the first-order had close proximity to its practice and experientially understood its attributes, requirements, risks, threats and norms. Recurring and frequent interaction by the actors within the first-order provided for the evolution of responsible norms, policies and controls.

Architects, administrators and archivists engaging with the first-order “Archive of the Motor Vehicle Driver”, for example, would have had close proximity to the Department of Motor Vehicles; Federal, State and Local auditors; political and citizen-led feedback; and other agents with substantial subject-matter experience in the locality of the first-order archive. As such, the architecture, definition, development and maintenance of these initial digital archives was conducted in close proximity to its stakeholders, and realized pragmatic normative practices through this proximity.

Given the accelerated emergence of second-order Assemblage Archives (or “System C’s”, to approximate Foerster’s model), in which the individual is no longer defined in relation to a specific field of practice or locality of engagement, but rather through and across a multiplicity of first-order archives, and given the extra-jurisdictional detachment this second-order archive realizes through its disconnection from the nexus of practice and actor experience: what are the anticipated consequences, and what corresponding responsibilities do societies have in ethically managing this second-order assemblage?

Black Hole Theory of Fascism

It might appear that the primary determinant for the bifurcation of a State into a fascist form is not what some theorists might suggest (populations having “natural inclinations” or other compelling externalities driving the people and/or state toward fascism, societal demands for strong order in times of chaos, or other “outside the system” root causes), but rather a function that has a fractal equivalence to solar lifecycles.

Simply put, most stars lack sufficient mass (and, more appropriately, velocity and aggressive consumption of excess) necessary for the conditions to create a black hole. As such, they end up in modest post-modern form: dull brown dwarfs of mediocrity. But give a star sufficient resource mass, propel it through velocity into a pattern of voracious consumption and ejection of surplus, and it reaches a crisis when the irreconcilability of its irrationally unsustainable trajectory is uncovered. Black holes don’t go into 12-step coping programs; they don’t acknowledge the inevitable. They deny, resist, assault, reject and oppose; they go down shooting, and “take all they can with them to hell.” They lash out, beyond Freud’s Melancholia, clinging to fundamentalist metaphysic, nihilism, or worse, a combination of the two in the construction of a death-centered fascist State.

Germany realized this acceleration-toward-singularity in its aggressive radicalization beyond the collapse of meaning inherent in the implosion of the Weimar Republic. Italy, Franco’s Spain, Lenin’s utopian metaphysic perverted by Stalin’s paranoid manufacturing race of excess: all have demonstrated the capacity for the creation of singularities when national/cultural meaning is exhausted, unlike the modest collapse and irrelevance of other nation-states. One wonders what outcome awaits the imminent exhaustion of not one, but two predominant empires when meaning escapes the populace of the U.S. and China.

The Invisibility of the Slow Motion Tsunami

A slow-motion tsunami creeps toward the south. Cloaked in stealth before the hyperactive news media of the telosphere, its cascading 20m-high walls of water double the expression of Japan's quake-initiated terror. Can slow speeds render an existential threat invisible before the dromosphere?

A Pragmatic Justification for Competitive Academic Debate

© 2010. James R. Saker Jr.
email: noise -at- thirdparasite . com
Distributed under a Creative Commons Attribution Non-Commercial (CC BY-NC) license.

With new debate programs under consideration in the Nebraska community, I wanted to share my support by illustrating the value of these programs to the education and practice of corporate governance. The usual justifications in debate program proposals advance the value of debate in expanding the student's capacity for argumentation theory, persuasive advocacy, analytical reasoning and research skills. While I certainly don't mean to diminish these invaluable skills, and concur that they accurately characterize the competitive activity, I wish to highlight a less frequently identified aspect of debate which is more relevant to the pragmatic "business orientation" of today's secondary and post-secondary academic programs.

As a governance, risk and compliance (GRC) professional responsible for enterprise and technology risk management for a global financial processor, I coach, judge and advocate high school debate because I recognize the activity's role in advancing exposure to, and familiarity with, critical thinking skills. Many on our regional circuit know me as one of the more "critical judges" (a label that derives from critical theory), which tends to perplex those who would expect a corporate type who debated during the 1980s to be of the traditional "policymaker" paradigm.

In my role as architect and leader for our risk management initiatives, I'm challenged with the responsibility of transforming the interpretation, integration and application of "risk thinking" across our global enterprise. Applying emerging expertise from critical and post-structural theory, I've increasingly valued the pedagogical capacity of debate's "critical thinking" aspect in developing future young professionals who will possess the conceptual skills necessary to address a whole new set of problems. Globalization, bursting economic bubbles, major global demographic shifts, the end of consumer anonymity and the failure of the regulatory model to prevent systemic market risk are all challenges our future professionals will immediately face in their careers. Yet little of our structured academic program engages students with the emerging systems of thought that offer an opportunity to answer these new challenges.

While the core academic program will continue to produce candidates capable of functioning within the current system, critical debate provides a rare framework for developing the future architects of our businesses, institutions and society. With the proposals for college CEDA/NDT debate at the University of Nebraska at Omaha, and considerations of program additions at Nebraska and Iowa school districts, I wanted to share a sampling of this critical field of thought and illustrate some of the context and relevance our students experience through the practice of competitive debate. Each of the following four issue areas represents a specific aspect of governance, risk and compliance where a student with critical debate experience would gain familiarity with concepts prevalent in the debate realm.


Do systems of corporate governance suffer from dynamics that facilitate the permanent erosion of authorized policy and procedure? Do our corporations engage in the autopoietic generation of quasi-governance systems to a state where corporate policy is permanently extended and controls eroded? What is the impact on enterprise risk when the policy controls of the institution are systemically excepted?

Debaters familiar with the work of Italian post-structural philosopher Giorgio Agamben are introduced to vital concepts that illuminate the tendency of governance systems to expand into an extra-legal state, often through the mechanism of exception. Although originally intended for the evaluation of political States, Agamben's State of Exception provides an exceptional theoretical foundation for this corporate governance analysis and for understanding the existential threat that emerges when the condition of exception expands into authorized corporate policy. Both descriptive and prescriptive, Agamben's model is useful in assessing and counteracting the erosion of policy controls within institutions.


Why do efforts to protect the economy and its consumers from systemic risk fail? Is this failure – illustrated by market meltdowns, exposures of systemic breaches in corporate ethics, financial accounting fraud or product safety nightmares – caused by a shortage of regulations and regulators? Or can it be blamed on the deficiencies in audit and accounting methods and procedures? Or is each failure a unique, unforeseeable occurrence society is unable to predict or prevent? Or is there perhaps a structural flaw that is inherent in our systems we don’t yet understand?

Debaters researching the 2010-2011 high school debate resolution will almost certainly encounter Hardt and Negri's vitally relevant text, Empire, which applies German sociologist Niklas Luhmann's social systems theory. While Hardt and Negri evaluate the applied theoretical landscape of hegemony, imperialism and post-modern constructs of statehood, Luhmann's underpinning model provides exceptional descriptive and predictive capacity for anticipating the failure of a regulator or auditor evaluating an assessed entity. Problems of second-order communication are of particular concern, and should Luhmann's model be correct, neither more regulation nor additional auditors engaging through current second-order practices will effect any meaningful change in the realization of unanticipated systemic risk and impact. Alternative methodologies will need to be developed and employed in order to respond to this phenomenon.


What is the role of the corporation in the safeguarding of consumer information? Should personally identifiable information, including consumer preferences, medical records, credit histories and other attributes that identify the behavior and orientation of a specific consumer, be further utilized for the advancement of product customization and the enhancement of the “consumer experience”? Are there risks in the aggregation of disparate consumer information sources that could incur reputational risk from consumer backlash? And through what framework – legal, ethical, moral, social or other – should the corporation evaluate this capability and commensurate risk?

Continental French philosopher Michel Foucault, whose examination of the application of biopower in social service systems was predominant in the 2009-2010 high school policy debate season, evaluates at length the concerns of a surveillance system. Foucault, and the numerous others who have followed his analysis, provide debaters with an invaluable orientation into the risks of surveillance and the role of authority in social systems. Debaters who become familiar with the works of Foucault and his followers will likely possess greater understanding of the inherent perceptual risk associated with systems of surveillance, particularly as applied to the digital marketplace.


How is corporate transformation facilitated? Why do corporate cultures tend to decay to a state of regimented, stagnant, silo-structural dominance? Why do extensive hierarchies and vertical organizations tend to struggle in the globalizing marketplace, particularly when we’ve been led to believe that stronger hierarchies with increased central power are the solutions for problems in our economy as pronounced through a growing central government? If expansive hierarchies are not the solution to corporate and economic transformation, what are alternatives and how are they employed?

Philosopher Gilles Deleuze and sociologist Félix Guattari, predominant in advanced critical debate circles, provide ground-breaking theories on the relationship between hierarchical and decentralized structures. D&G's analysis of rhizomic (decentralized) vs. arboreal (hierarchical) systems, the behavior of cultural territories, the conceptualization and application of multiplicities, and numerous other concepts are significant in addressing organizational transformation. Other predominant critical debate authors provide ample ground for the discovery of new methodologies for the re-engineering and ethical transformation of institutional process, program and enterprise: Continental philosopher Jacques Derrida, notable for his development of deconstruction as an approach for identifying and moving beyond systems of binary conceptualization, and Slavoj Žižek, notable for his criticism of capital, particularly in a realm of increased globalization and interdependency.


Competitive debate, particularly oriented around the critical analysis of our society, systems, institutions, policies and cultures, provides an unparalleled educational experience for our students in the introduction and advancement of critical thinking skills and concepts.

Acknowledgment: I’d like to recognize a notable University of Nebraska at Omaha leader who inspired me in my practice of high school debate and encouraged me to look deeper in questioning the rules and norms of the institution, recalling deconstructive opportunities inherent in paths left unexplored at earlier forks in the road. To Dr. Otto Bauer, retired UNOmaha vice chancellor, published debate theorist, Air Force Academy debate coach and Northwestern University debater: I thank you for the encouragement you gave to the many generations of debaters you reached.

Post-Structural Judging Paradigms

A work in progress that I’ve been remiss in communicating is my post-structural interpretation of policy debate. I’ve had many discussions of this interpretation through post-round criticism and out-of-round conversation, but as a serious work-in-progress, its written expression has been notably absent.

Having squirreled three times this year (once in policy, twice in LD), I’ve paid careful attention to the panels presented and the nature of each squirrel. I’ve felt that this analysis would be beneficial in representing and communicating my paradigm — something anyone seriously concerned about the pedagogy of debate should be mindful of. This weekend’s squirrel at Westside (Nebraska) in novice policy semifinals was for a Barstow team, against a Millard West team I had just previously voted up in quarterfinals. The squirrel and its post-round discussion by all three judges illuminated the source of difference. As Deleuze would say, it was an intensive difference, not an extensive one.

Beyond Stable Meaning
A clear difference in my approach to the round lies in the stability of meaning. Derrida illuminated serious problems in post-Socratic meaning through his deconstruction of Saussure, Lévi-Strauss, Rousseau and others who embraced logocentrism (the spoken word as true, the written as shady and questionable). Curiously, logocentrism continues to be a dynamic in the round, but an even greater question of hermeneutics (the theory of interpretation) comes into play in many rounds where it is assumed that the Enlightenment’s scientific approach to reason will unveil the absolute and error-free truth of the warrant. It is this Enlightenment hermeneutic with which I’m terribly at odds, and which tends to present the clinamen for the squirrel (clinamen is represented here as the least deviation from the laminar flow that gives rise to the vortex of a different order — a new conclusion and an independent outcome, as approximated from Michel Serres’s interpretation of Lucretius as expressed in Genesis, The Natural Contract, and Angels: A Modern Myth).

In my interpretation of the warrant, I’m fully comfortable with warrants being signified and re-signified through a system of volatile and dynamic meaning. Stable meaning is a myth: walk in the shoes of a risk manager in a global corporation and you’ll experience first-hand the extreme uncertainty of watching “Things that are Stable and Not Risky” go radical on you and give clinametic rise to the “unexpected” vortex of the existential-risk black swan. Meaning is shifting, unstable, uncertain and filled with noise.

Subsequently, I approach the presentation of warrants (evidence) as operating within this system of instability. I find it relevant to apply the claim signified to the warrant, especially given that the claim is usually made at reasonable speed while the warrant is delivered at a velocity far beyond. I find this to be pedagogically consistent: in the executive decision-maker climate, warrants are rarely questioned and claims are accepted unless there is reason to question the provider of the warrant. Senior executives tend to rely on the claim presented (and expect tacit representation of the warrant), given that the ethical construct is one based predominantly on trust; get the warrant wrong and you usually won’t be around very long.

The Hermeneutic Experienced
An instantiation of this difference follows: Imagine a critical affirmative 2AC presents Berube’s wonderful 1997 warrant that claims deontological impacts must be evaluated first (that delicious “Five horsemen of the apocalypse” card by former USC debate coach and CEDA theory guru Dr. David Berube). But imagine further that the affirmative claims the card says that “Beer solves poverty.” Now certainly my judging panel of policymakers and college-attending judges will recognize this card and find that claim to be absurd. As soon as the Negative puts the most minor of offenses on the flow (say, referring to their 1NC utilitarianism card, giving a warrantless claim that it’s a better card, and leaving it at that), my fellow judges will gladly leap into Interventionland, determining the “beer solves poverty” claim to be fictional and failing to access the “real truth” that’s within the Berube card.

I contend that this is blatant, uncreative hermeneutics. First of all, to make this leap to a “fixed truth of the Berube card” destroys the potentiality of Derrida and deconstruction. It goes well beyond that, in fact, denying the capacity for the counter-read. It deprives the 2AC of uncovering a new approach to the meaning of the Berube evidence. Furthermore, it’s exceptionally interventionist and, I dare say, disqualifies a judge who makes this leap from claiming to be tabula rasa (on this note, I’d suggest that such paradigms are mythical in a system of fluid, unstable meaning).

Instead, I see the 2AC resignification of the Berube as the establishment of a temporary point of order — a footprint (Michel Serres) through which the affirmative seeks to instantiate a new local order of meaning. When it is presented, I see the least interventionist hermeneutic model as one that accepts this resignification until there is counter-advocacy sufficient to disrupt and re-re-signify.

The Hermeneutic Applied
So how does a team apply and integrate this hermeneutic? First of all, as I explain to any team approaching a panel, I strongly recommend playing the numbers. I’m a game player, as are many, and advise teams accordingly: if two of the judges are college-age policymakers and I’m the questionable third (as Michel Serres would proudly associate), play to the majority. But should you encounter my paradigm in the solitary, or have the capacity to integrate it with your advocacy to the judging panel, the following method may be beneficial.

Consider our Berube scenario: the 2AC has given us a counter-read (regardless of whether this was innovative, shady or outright confused). The Negative should recognize that I will accept the new system of signification absent a reasonable challenge. But what constitutes that challenge? Certainly not a 5-second analytic saying “The 2AC gives their Berube 97 but our 1NC Smithee ’08 is better.” Instead, draw out the Berube, re-re-signify it, or contrast it with the Smithee ’08. Best of all, contrast the two, provide me a hermeneutic micro-framework for how to interpret the decision between the cards, and call for the cards to be read through that framework at the end of the round should they be material to the ballot. I find this approach to be the least interventionist in a system of unstable meaning, and I find great comfort evaluating the competing hermeneutic framework debate (at which point even I conclude that we have to “fish or cut bait” and determine a temporary moment of stability through the hermeneutic framework in order to derive a ballot). I typically find that framework through the process of locating resonance-of-meaning and taking that resonance as the signification of a temporary order capable of offering and sustaining the hermeneutic model. This resonance model itself merits further elaboration and exploration in a future post.