
Krepinevich has a pertinent point: rocket, artillery, mortar and missile (RAMM) capabilities are proliferating rapidly, and guided RAMMs (G-RAMMs) are increasingly available. The latter ‘do not require a high degree of operator training’ (p24). The range of such weapons extends out to 50 miles/80 km, compared with the 4 mile/7 km radius of Vietnam-era mortars.

How well then does the inkblot approach to counterinsurgency work, when the insurgents can attack bases from such a range?
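A rough sense of scale helps here: the ground a base must dominate to push attackers out of range grows with the square of weapon range. A back-of-the-envelope sketch in Python, using the ranges cited above (the code itself is purely illustrative):

```python
import math

mortar_range_km = 7    # Vietnam-era mortar, ~4 miles
gramm_range_km = 80    # G-RAMM, ~50 miles

# The area from which a fixed base can be attacked grows as pi * r^2.
mortar_area = math.pi * mortar_range_km ** 2   # ~154 km^2
gramm_area = math.pi * gramm_range_km ** 2     # ~20,100 km^2

print(f"threat area, mortar: {mortar_area:,.0f} km^2")
print(f"threat area, G-RAMM: {gramm_area:,.0f} km^2")
print(f"ratio: {gramm_area / mortar_area:.0f}x")   # ~131x
```

An inkblot that has to secure two orders of magnitude more terrain to protect the same base is a qualitatively different proposition.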

References

Krepinevich, A. F. (2009). “The Pentagon’s Wasting Assets.” Foreign Affairs 88(4): 18-33.


Three pieces recently caught my attention:

They’re short pieces, so I recommend reading each. But together they point to deeper systemic change, and here I’m pushing further some of the points raised by Tom Mahnken in particular. I’ll make a start on some of those issues here, and add to them over the next few days.

First, like the printing press, present-day information technologies have weakened traditional state structures and processes. Take the military itself. That sacrosanct element of modern, western conventional armed forces, the command structure, is being challenged by its senior-most echelons. True, senior officers could always reach down to direct their juniors. But information technologies have greatly enhanced the ability of senior commanders to reach down, in real time, around the slower chain of command to the point of attention. Consider comments by the then Chief of the Defence Force, Peter Cosgrove, in 2003:

For me, the first two hours of a relatively long day were spent poring over the website reading the various reports, following up on them by email, by telephone and face-to-face.

[…]

Our Special forces could send us data including images from enemy territory. We could send them, from any level of command, anything from military orders to the rugby scores.

That reach-down can have an erosive effect on confidence within the chain of command. Because they can, every issue, most often a tactical matter, becomes worthy of the attention of the chief of service or defence force.

And it has a further consequence. Modern technologies allow generals to relive their days as lieutenants and captains in the field without the attendant dangers. They risk falling into the trap of addressing the problems they felt they could solve, as they had before, rather than those they should attempt to solve (Dörner 1996). And it reinforces the focus of attention on the tactical over the operational, let alone the strategic.

The issues aren’t confined to the military, but affect governance and accountability. In a certain kind of world, focussing purely on the tactical once the direction is set can suffice to achieve good outcomes. But that’s not the world we live in. Our strategic environment is fluid and changing, and as it shifts, our interests, our goals and the best means to achieve them change with it. We need a constant dialogue between the strategic, the operational and the tactical, and a much more adaptable approach. That’s hard to achieve in a system that inherently assumes stasis and stability, promotes dated benchmarks, and seeks to enforce certainty by tightly coupling capability to a parsimonious strategic vision.


Software development, of course, brings its own challenges. It is an inherently creative process, not conducive to Taylorist approaches or waterfall models of project management (Brooks 1995).

One of the relationships changing as a result of technology is that between civilian oversight and the military, and it is worthy of a point of its own. One of Mahnken’s colleagues, Peter Feaver, along with Damon Coletta, wrote on the effect of information technologies on civil-military relations in 2006 (Coletta and Feaver 2006). They describe how, in Kosovo and Bosnia, General Clark was able to operate under the radar of civilian monitoring, facilitated by information technologies:

  • first, the coordination of multiple assets in different planning domains, not all of which were visible to the civilian establishment; and
  • second, shifting targeting away from fixed assets, on which civilians had focussed, and towards mobile, ground assets, exploiting the advantages of battlefield command and control technologies and the notion of the sphere of professionalism: ‘…Clark was able to import elements of [his] tactical philosophy to the strategic campaign’ (p118). That in turn loosened civilian oversight of some aspects of the campaign while tightening it over others.

So while in principle information technologies should enable improved civilian oversight and monitoring of the military domain, they by no means guarantee such an outcome (Coletta and Feaver 2006, pp120-1). Instead, information technologies generate a dynamism that permits military agents to exploit the very flexibility civilian principals require in pursuing political ends: decision-makers cannot be absolutely rigid in their statements of objectives, but must leave room for manoeuvre, compromise and even opportunism.

Coletta and Feaver acknowledge a concern expressed by Singer: the intrusive nature of information technologies could erode military autonomy and hence professionalism (p110). But it need not take civilians to generate such an effect: arguably we are seeing it already, as generals second-guess the tactical judgments, and overrule the commands, of their more junior officers in the field, as noted above.

The second effect is more insidious: the fluidity and bandwidth generated by information technologies effectively loosen civilian control. It’s harder for civilian decision-makers to maintain the distance between themselves and the military, to provide certainty as to aims and objectives, and to specify and enforce constraints. And it’s hard for civilian advisers to gain sufficient familiarity with military systems in fast-moving environments to assist with that oversight. More than ever it falls to the military to help ensure civilian knowledge and control of the armed forces and their mission.


References

Brooks, F. P. (1995). The Mythical Man-Month: Essays on Software Engineering. Reading, MA, Addison-Wesley Publishing Company.

Coletta, D. and P. D. Feaver (2006). “Civilian Monitoring of US Military Operations in the Information Age.” Armed Forces & Society 33(1): 106-126.

Dörner, D. (1996). The Logic of Failure. New York, Basic Books.

MIT’s Technology Review reports a prediction by Didier Sornette and colleagues (Bastiaensen, Cauwels et al. 2009) that a Chinese stock market collapse is imminent–due before 27 July, in fact.

Crashes in stock markets are examples of self-organised criticality (see, for example, Turcotte 1999): like avalanches, pressure builds in the system until overloading triggers a collapse. We cannot predict exactly when a collapse will occur, where, or how large it will be, but collapses are inevitable, and sometimes small collapses trigger much larger cascades. The size-frequency behaviour of such systems follows a power law: large collapses are few; small collapses are many.
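For intuition, the canonical toy model of self-organised criticality is the Bak-Tang-Wiesenfeld sandpile: drop grains one at a time, and any site that reaches a threshold topples onto its neighbours, sometimes setting off long chains of further topplings. A minimal sketch (grid size, grain count and threshold are illustrative choices, not parameters from the papers cited):

```python
import random
from collections import Counter

def sandpile(size=20, grains=20_000, threshold=4):
    """Bak-Tang-Wiesenfeld sandpile: add grains one at a time; any site
    reaching the threshold topples, sending one grain to each of its four
    neighbours (grains toppled over the edge are lost)."""
    grid = [[0] * size for _ in range(size)]
    avalanche_sizes = []
    for _ in range(grains):
        i, j = random.randrange(size), random.randrange(size)
        grid[i][j] += 1
        stack = [(i, j)]
        topples = 0
        while stack:
            x, y = stack.pop()
            if grid[x][y] < threshold:
                continue
            grid[x][y] -= threshold
            topples += 1
            if grid[x][y] >= threshold:   # may still be unstable
                stack.append((x, y))
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nx, ny = x + dx, y + dy
                if 0 <= nx < size and 0 <= ny < size:
                    grid[nx][ny] += 1
                    if grid[nx][ny] >= threshold:
                        stack.append((nx, ny))
        if topples:
            avalanche_sizes.append(topples)
    return avalanche_sizes

sizes = sandpile()
counts = Counter(sizes)
for s in sorted(counts)[:8]:               # many small avalanches...
    print(f"avalanches of size {s}: {counts[s]}")
print("largest avalanche:", max(sizes))    # ...and the occasional monster
```

Tally avalanche sizes against their frequency on log-log axes and the points fall roughly on a straight line: many small avalanches, few large ones, exactly the power law above.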

Examples of self-organised criticality can be found in a wide range of natural and social systems, including finance and war (Turcotte and Rundle 2002).  Can we apply the same ideas to nuclear proliferation?

For example, we can substitute the idea of nuclear latency–the level of capability that would allow a swift transition to nuclear status, including through indigenous civilian programs–for load. (The analogous component in other systems would be combustible material for forest fires, tectonic stress for earthquakes, and over-investment in financial systems.) The load builds to a point where breakout is inevitable. But the characteristics of criticality apply: we don’t know when or where such a breakout will occur, or how large the ‘avalanche’ will be–one or two nations, for example, or a cascade of proliferation.

What triggers collapse in such a system? It cannot be capability alone: proliferation comprises a combination of material, expertise, infrastructure and intent. As the underlying capability–material, infrastructure and expertise–grows, intent becomes increasingly important in assessing proliferation risks and behaviour.

And intent is necessarily a function of expectation: what are the expected consequences, and what are actors’ expectations of each other? As in the market, we lack perfect information. The differences between intent, expectation and surety generate instabilities which, as the load and system stress increase, raise the likelihood of collapse.

Moreover, the longer stresses in the system build, the more likely the collapse will be large, cascading as nations with high latency succumb to the pressure generated by uncertainty over others’ intent.
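To make that cascade intuition concrete, here is a deliberately crude toy model, my own invention rather than anything from the proliferation literature: each state’s latency creeps upward, each has a political threshold for breakout, and every breakout erodes the thresholds of the rest as uncertainty over others’ intent grows. All the numbers are invented for illustration:

```python
import random

random.seed(1)  # one illustrative run

N_STATES, YEARS = 20, 80

# Latency is the 'load'; the threshold is the political barrier to breakout.
latency = [random.uniform(0.0, 0.3) for _ in range(N_STATES)]
threshold = [random.uniform(0.7, 1.0) for _ in range(N_STATES)]
nuclear = [False] * N_STATES

for year in range(YEARS):
    new = [i for i in range(N_STATES)
           if not nuclear[i] and latency[i] >= threshold[i]]
    # Each breakout erodes the restraint of the rest: uncertainty over
    # others' intent pushes high-latency states over the edge in turn.
    while new:
        for i in new:
            nuclear[i] = True
        print(f"year {year}: breakout by states {new} "
              f"({sum(nuclear)} nuclear in total)")
        for j in range(N_STATES):
            if not nuclear[j]:
                threshold[j] -= 0.05 * len(new)
        new = [i for i in range(N_STATES)
               if not nuclear[i] and latency[i] >= threshold[i]]
    # Capability creep: latency builds slowly everywhere else.
    for i in range(N_STATES):
        if not nuclear[i]:
            latency[i] += random.uniform(0.0, 0.04)
```

Run it and the pattern of self-organised criticality tends to reappear: long quiet periods while latency builds, then a first breakout that drags several near-threshold states over the edge with it.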

Can we adopt Sornette’s ideas for predicting collapse? Sornette looks for bubbles in market data; no similar information is available–as far as I’m aware–on nuclear material, industry or skills. It’s not exactly the most open of industries, all the more so where there is a covert intent to proliferate. And even in market data, finding bubble-like behaviour does not necessarily translate into collapse.

But then, Sornette et al. do not rely on data alone; they seek to identify the drivers of such behaviour. From the Technology Review piece again:

The telltale sign of a bubble, he says, is a faster than exponential growth rate caused by a positive feedback mechanism that generates this nonlinear growth.
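‘Faster than exponential’ has a precise meaning worth spelling out: if growth feeds back on itself, dx/dt = k·x^m with m > 1, the solution does not merely grow quickly, it diverges at a finite critical time t_c. A quick numerical sketch (x0, k and m are arbitrary toy values, and this shows only the feedback mechanism the quote describes, not Sornette’s full log-periodic model):

```python
# Exponential growth, dx/dt = k*x, gives x(t) = x0*exp(k*t): fast, but
# finite for all finite t. With positive feedback, dx/dt = k*x**m, m > 1,
# the solution hits a finite-time singularity at:
#     t_c = x0**(1 - m) / (k * (m - 1))
x0, k, m = 1.0, 0.5, 2.0
t_c = x0 ** (1 - m) / (k * (m - 1))
print(f"critical time t_c = {t_c:.2f}")  # the 'crash by' date in this toy

# Crude Euler integration shows the blow-up as t approaches t_c.
x, t, dt = x0, 0.0, 0.001
while t < t_c - 0.02:
    x += k * x ** m * dt
    t += dt
print(f"x({t:.2f}) = {x:.0f}")  # roughly 100x its starting value, and rising
```

It is a finite t_c of this kind, estimated from the data, that lets Sornette and colleagues put a date on a crash.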

Within nuclear proliferation, such drivers include:

  • protective hedging against Western conventional dominance, and increasingly, against regional competitors; and
  • increased means of gaining the material, expertise and equipment needed for proliferation, including through sub-national means such as the AQ Khan network.

From a systems perspective, there exist drivers trending towards proliferation. Taking the pressure out of the system requires adjusting or defusing those drivers, through increased transparency of programs, for example; redirecting intent, through cooperative security and international regimes; or some as yet unknown technological solution. The international community has tried a number of these, but given the increasing latency, new and different means may be needed: the barriers suitable for small avalanches are unlikely to hold back large ones. And therein lies a further problem for the international community: the longer the system is held back, with pressure and latency allowed to build rather than being defused or bled off, the greater the likelihood of a large, cascading breakout.

References

Bastiaensen, K., P. Cauwels, et al. (2009). “The Chinese Equity Bubble: Ready to Burst.” arXiv: 0907.1827.

Turcotte, D. L. (1999). “Self-organized criticality.” Reports on Progress in Physics 62(10): 1377-1429.

Turcotte, D. L. and J. B. Rundle (2002). “Self-organized complexity in the physical, biological, and social sciences.” Proceedings of the National Academy of Sciences of the United States of America 99: 2463-2465.

The day I’m about to start posting again is the day my MacBook Pro’s video card decides to give out.  So we’re a little stressed at the moment.
