Monday 15 August 2011

WAR (hyuh, yeah)... what is it good for? Tech stocks


Any time I get into a conversation with somebody who is either interested in philosophy or involved in technology, I somehow manage to steer the conversation around to a basic problem I have: that technologists (Kevin Kelly, Ray Kurzweil, Barabasi, et al.) often seem to have an all-too-unambiguous relationship with the applications of their chosen field to warfare. This is primarily an ethical problem, but we can displace it into the political realm if we wish to remain dispassionate. I would like to push our conceptual apparatus even further (mainly to avoid my own rhetorical excesses), and to attempt an alternative view, namely to consider this problem in even more abstract terms.

Step back for a minute, and consider G.K. Chesterton's words on the detective novel:

“keep in some sense before the mind the fact that civilization itself is the most sensational of departures and the most romantic of rebellions … It is based on the fact that morality is the most dark and daring of conspiracies.” (from “A Defence of Detective Stories”)
When I reflect on this, it leads me to note the quite peculiar status of nature's relationship with all things man-made. There is, however, no simple good-versus-evil opposition. There is no reason why we should consider nature to be more beneficial to us than something constructed or artificial. The Ebola virus comes from nature; hospitals are constructs – a crude pair of examples, to be sure, but consider this an antidote to any opposing crudity. The man-made and the natural simply are, and each has its own functions, derivations, and so on. To valorise nature over a product of civilization is no more than, to paraphrase LCD Soundsystem, borrowed nostalgia for an unremembered time (which is the essence of Heidegger's “The Question Concerning Technology”). At best, it is the philosophy of “I remember when all this was fields”. Now, returning to Chesterton, there is nevertheless a tension between these two, given that the natural is what we humans forever attempt to pacify and combat (Francis Bacon), or to transcend (Ray Kurzweil). The status quo, from this cosmological perspective, is entropy. That is our opponent.

So, to leap back to where we were with this in mind: war/defence (choose your euphemism) is an example of the will-to-break-down that is the reality of life. This is a fairly awkward definition, but bear with me. This destruction is inevitable; there is but a difference of time-scales between the cosmological entropy of the second law of thermodynamics and our own short-term analogue in social and political conflict.
This is not an apologia for violence, but it is the recognition of a (deeply unpleasant) fact of our present social reality (and, contrary to arcadian myths of noble savages, of our past too). What we continually do is find ways to reduce the prominence and frequency of this fact of strife in our lives. It is not geological fatalism to say that erosion happens (allow me this minor contradiction in terms), and so it should not be considered fatalism to say that, similarly, a kind of socio-political erosion will happen. Before I am crushed under the weight of this belaboured analogy, I will just say that we should erect defences to protect ourselves and our institutions from this decay. It is our duty to recognise the existence of conflict, strife, warfare – and to do something about it. Human activity, technology, and ideas are part of a concerted effort (better a musical than a martial metaphor) to diminish, in an on-going manner, destruction at our own hands, as well as obliteration by the forces of nature.

Now that I have lost just about everybody with this line of argument, I can return to philosophy and technology. War and destruction are not addressed in the philosophy and history of technology as the catalytic forces they are (only Schumpeter comes close, with "creative destruction"). I find it fundamentally troubling that human ingenuity and the ideology of technology (Daft Punk's harder, better, faster, stronger) are so readily applied to killing and exploitation. This is the impulse behind this tortured argument. As such, I have attempted to see matters from a different perspective, considering ourselves on the Darwinian time-scale. Survival in the present moment, in order to pass on genes to the future, is what is of prime importance to the organism (there are limit situations where this does not hold, but they are exceptional). Thus, any new technology could follow a distribution whereby it will be applied to these immediate needs of survival, even if those needs are only apparent (this requires a whole subsidiary caveat whereby we note the corporate-media nexus, and how “need” is created as part of a market strategy). In my dangerously telescoped argument, rocketry begins as warfare, becomes a tool of government and science, and expands into the civilian and commercial realm. What was once closed-source becomes marginally less so. Vast sums of money are spent on defence in the United States, partly because it is effectively a standing tradition of budgets in that country to spend the rest of the world (i.e., enemies) into submission, so inevitably a lot of that will filter into research and development.
[Image: chart of U.S. spending; the top right-hand corner is the amount spent by the U.S. on the Iraq war.]
A part of that will then go into military applications, and a smaller amount of that will be considered viable for actual manufacture and deployment. Even if, as the argument goes, the space race was the Cold War by proxy, note the difference whereby the “by proxy” in effect negates the “war”. Yet another example is, of course, the Internet as we know it today, developed by the military as a means of decentralising communication so that, in the case of an attack, the system would continue to function.

Returning to the ethical argument, which I bracketed away at the beginning of all this, there is surely room for us to be involved in war/defence, and yet to be doing our best to reduce the possibility of it ever happening. We are aware of the genesis of the Gatling gun, and how its inventor hoped that it would end warfare by its very efficiency and the enemy's fear of its deployment; this utopian view did not account for how cheaply politicians and generals regarded their soldiers' lives. That said, the war-gamers at the RAND Corporation, DARPA, and the NSA each in their own way undertook the effort to make nuclear war an ever less attractive option via their development of a doomsday logic, namely the doctrine of Mutually Assured Destruction. (Interestingly, a friend of mine informed me that this was communicated to the Soviet side too, in both abstract and concrete terms, for M.A.D. could not have functioned unless both sides were aware of its rules, according to the nature of game theory.) We can but hope that the development of M.A.D. levelled out the curve of inevitability which saw an ever increasing number of fatalities with each global conflagration, WWI surpassed by WWII, which all feared would be surpassed by a WWIII. Large-scale war became an ever less attractive possibility, such that now wars are economic, diplomatic, or both, alongside the actual wars which the superpowers waged either by proxy or by feigned ignorance in the second half of the twentieth century (see Niall Ferguson's The War of the World on this).


We should allow ourselves to be reined in, then, by an ethics of technology. Let there be a stricture set down, following Mario Bunge's observation that technology's morals right now are dubious at best, whereby we will not develop any technology which is dedicated to our own oblivion. Let a philosophy of possible worlds (and not just Candide's “best of all possible worlds”, which seems to govern the most rosy-VR-goggled futurists such as Kurzweil) consider the worst possible outcomes of each new technology.
