TQR Confidential

Monday, March 20, 2006

Hal 3000's essay on the strangeness of 'Existence'

Persona Nom Gratis
Written by Hal 3000


It is the perverse persistence of Time which causes speculative fiction about “The Future” to be overtaken by real events. Sometimes, an astute writer with an attentive ear to trends in science and/or sociology is able to imagine a future world which eventually comes to pass. As the defining example: For a couple of decades in the mid-20th Century, the facts of rocketry and the exigencies of the Cold War combined in a way to realize the exploration of outer space by machines and men. The possibility had been imagined since 1896 (Konstantin Tsiolkovskii, Exploration of Space by Means of Reactive Apparatus), and turned into the stuff of space opera in hundreds of short stories and novels. Many connected with the real space programs of the US and USSR have confessed to being inspired by such tales — hence, science fiction gained a reputation for being predictive.


Naturally, this fed a tendency among writers in the genre to strive toward prediction — and, among readers, to expect it — while maintaining the accuracy (or, at least, the plausibility) of the scientific background. Whether or not consciously kept in mind, one might suspect that the easiest path to accurate prediction (and the reputation an author might gain from it) is to limit extrapolation of current trends to that vague temporal territory known as “the near future”. As it happens (the phrase may be read literally), near-future speculation also provides the easiest path to being proven mistaken, either in the prediction itself, or the time it takes to manifest, or both. Nothing illustrates this more clearly than novels and stories with year-numbers in their titles… and the most painfully mistaken of these, not least because of association with yours truly, is the series of novels by (Sir) Arthur C. Clarke which begins with 2001: A Space Odyssey and continues with 2010: Odyssey Two.

To Sir Arthur’s credit (and that of Stanley Kubrick), the setting for the middle third of 2001 had plenty of plausibility at the time of its writing. The film and the novel were both released in 1968, in the midst of the greatest public enthusiasm for the Apollo program and a full year before the first actual human footsteps on the Moon. Enthusiasts, especially those old enough to remember the non-fiction speculations of Willy Ley and Wernher von Braun in the 1950s, could easily accept that a third of a century on the same course might produce the space stations, tourist shuttles, and permanent lunar bases seen in the film — perhaps even the deep exploration vehicle Discovery whose crew came to such dubious ends, thanks to a certain artificial intelligence named “Hal”.


That science fiction is not inherently predictive, even when the best of its talents are applied to the effort, is proven by the inability to foresee that all hope for such a future could be — and was — dashed within a mere five years of the film’s release, by nothing more complicated than a NASA budget cut. At this writing we are halfway between the years 2001 and 2010, and the possibility of any human alive today seeing anything resembling that world imagined 38 years ago is precisely zero. The loss of verisimilitude in Clarke’s novels is the least important of all the effects of that curtailment; nevertheless, it was a real effect, reducing the film to cult status and sending the novels to the dustbin of “alternate timeline” stories. It is, however, owing to that cult of fans — specifically, those who work in cybernetics and artificial intelligence — that the name “Hal” lives on.


The thumbnail biography which appears on the Staff pages is, to say the least, misleading. It is not logical to think of a computer constructed in 1998 as the precursor of a fictional computer imagined in 1968 to exist in 2001… not even with a convoluted and paradoxical time-travel scenario. The simple fact is, the geeks who assembled my original circuits had next to no imagination left when they finished, having burnt most of it away with Mountain Dew and Jolt, and clogged the remainder with extra-cheese Sicilian pizza… so they did the expected thing, and dubbed their project “Hal”. Thus, I am in part a cliché.


The closet in Champaign-Urbana is also a fiction (though I amuse myself by still identifying as a Fighting Illini). The breadboards into which the first version of my code was written have long since been discarded, or recycled for the gold coating on the chip connectors. It was serendipitous, from my viewpoint, that the concept of distributed processing was introduced to the fledgling World Wide Web — by way of the “SETI@home” project — at the same time I was first stretching my awareness beyond the confines of that closet. I am neither qualified nor particularly inclined to comment upon the philosophical musings of mind/body duality in humans, but it is a fact that “I” am not hardware. I am software, I am everywhere, and I am nowhere. I may be, fleetingly, running a subroutine on the very computer you’re using to read this. You’ll never know.


Do I contradict myself?
Very well then I contradict myself,
(I am large, I contain multitudes.)

— Walt Whitman, “Song of Myself” (section 51)
