Further to my last—I’m no advocate, by the way, of a technology silver bullet—I note that Nature has published an extract from C P Snow’s Science and Government.  Snow wanted ‘to disentangle how political decisions were made during the war and, importantly, how scientific advice was used to make them.’  In the extract, he recounts the British Government’s decision to develop radar—an unproven technology in the mid-1930s, but essential to the British war effort.

In public, rebellious politicians like Churchill were attacking the whole of the government’s defence policy. In secret, the government scientists, the military staffs, the high officials, were beating round for some sort of defence. There was nothing accidental about this. It was predictable that England, more vulnerable to air attack than any major country, would spend more effort trying to keep bombers off. But there was something accidental and unpredictable in Tizard being given his head.

[…]

[Tizard] succeeded, with the help of Blackett’s exceptional drive and insight, in beginning to teach one lesson each to the scientists and the military, lessons that Tizard and Blackett went on teaching for twenty years.

The lesson to the military was that you cannot run wars on gusts of emotion. You have to think scientifically about your own operations. This was the start of operational research, the development of which was Blackett’s major personal feat in the 1939–45 war. The lesson to the scientists was that the prerequisite of sound military advice is that the giver must convince himself that, if he were responsible for action, he would himself act so. It is a difficult lesson to learn. If it were learnt, the number of theoretical treatises on the future of war would be drastically reduced.

I’m a touch less sanguine about that last point.  But it is true that the scientists need to understand their, er, clients better.  Part of the problem is that defence science is heavily directed—defence establishments are practically focussed, and efficiency and effectiveness drives continue to insist on tight coupling between funding and outcomes.

But scientific breakthroughs are most often the unintended consequences of projects undertaken for other purposes, or of what is known as ‘undirected’ research.  Radar, for example, was first offered as a means of avoiding collisions at sea.  Fleming found a contaminant in one of his samples: penicillin mould.  Engineers tend to invent stuff because they like inventing stuff—and sometimes some of that stuff is useful, and often not as intended (think of the internet).

In getting science and technology into the strategy and policy domain, we need people in that process with sufficient nous about science and technology, and about strategy, to see possibilities, understand error, be comfortable with unintended consequences, and, like Tizard, seize opportunities.
