Futurology is the art of predicting the technology of the future.
N.B. We say “futurology” because the term “futurism” denotes the Italian aesthetic movement “Il Futurismo.” That movement began with manifestos: the Manifesto del Futurismo (1909), which glorified the technology of the automobile and its speed and power, followed by two manifestos on technology and music, Musica Futurista (1912) and L’arte dei Rumori (1913). The movement’s architectural aesthetic can be appreciated at Rockefeller Center; its members also included celebrated artists like Umberto Boccioni, whose paintings are part of the permanent collection of MoMA in New York.
We live in an age of accelerating technological change, and this juggernaut is hailed as the “bearer of the future.” Deep down, however, genuine distrust of science and “progress” has always been there. Going back in history, profound discomfort with technology is expressed in the Greek and Roman myths. The Titan Prometheus brings fire to mankind, but he is condemned by the gods to spend eternity with an eagle picking at his liver. Vulcan, the god of fire, is a master craftsman who manufactures marvels at his forge under the Mt. Etna volcano in Sicily. But Vulcan is a figure of scorn: he is homely, with a permanent limp for which he is mocked by the other gods; though married to Venus, he is outrageously cuckolded by his own brother Mars. For Botticelli’s interpretation of Olympian adultery, click HERE.
More recently, there is the myth of Frankenstein and its terrors. Then there is the stock character of the Mad Scientist in movies, magazines and comic books, whose depiction mirrors public distrust of technology.
For all that, in today’s world even environmentalists do not call for a return to more idyllic times; rather, they want a technological solution to the current crisis – for example, the Green New Deal. Future-oriented movements like Accelerationism likewise call upon free-market capitalism to push change harder and harder rather than retreating to an earlier, bucolic time.
The only voluble animosity toward science and technology comes from Donald Trump and his Republican spear carriers, but theirs is opportunistic and dishonest, not something they actually believe in.
Futurology goes back at least to the 19th century with Jules Verne and his marvelous tales of submarines and trips to the moon. H.G. Wells too left an impressive body of work dealing with challenges that might be in the offing. In a different vein, there are the writings of Teilhard de Chardin, whose noosphere is a predictor of where the world wide web and social media might be taking us – one unified super-mind. In yet another style, there are the books of the Tofflers from the 1970s, such as Future Shock, which among other things dealt with humanity’s struggle to cope with the endless change to daily life fueled by technology – change at such a speed as to make the present never quite real.
For leading technologist and futurologist Ray Kurzweil, for the Accelerationists and for most others, the vector of technological change has been free-market capitalism. Another vehicle of technological progress, to some a most important one, is warfare. Violence between groups is not new to our species. Indeed, anthropologists point out that inter-group aggression is also characteristic of our closest relatives, the chimpanzees – so all this likely goes way back to our common ancestor. The evolutionary benefit of such violence is a topic of debate and research among social scientists. The simplest and most simple-minded explanation is that the more fit, surviving males had access to more females and so more offspring. One measure of the evolutionary importance of fighting among males for reproductive success is the relative size of males and females. In elephant seals, where the males stage mammoth fights for the right to mate, the ratio is 3.33 to 1.0; in humans it is roughly 1.15 to 1.0 – this modest ratio implies that the simple-minded link between warfare and reproductive success cannot be the whole story.
Historically, the practice of war has hewed closely to developments in technology. And warfare, in turn, has made demands on technology. Indeed, even men of genius like Archimedes and Leonardo da Vinci developed weapons systems. However, the relationship between matters military and technology became almost symbiotic with WWII. Technological feats such as nuclear power, rockets, missiles, jet planes and the digital computer are all associated with the war efforts of the different powers in that conflict. Certainly, the fundamental research and engineering behind these achievements was well underway in the 1930s, but the war efforts determined priorities – and thus which areas of technology received resources and funding – thereby creating remarkable concentrations of brilliant scientific talent. The Manhattan Project itself is studied as a model of large-scale R&D; furthermore, the industrial organization of the war period and military operations such as countering submarine warfare gave rise to a new mathematical discipline, aptly called Operations Research, which is now taught in business schools under the name Management Science.
In his masterful treatise War in the Age of Intelligent Machines (1991), Manuel DeLanda puts it thus: “The war … forged new bonds between the military and scientific communities. Never before had science been applied at so grand a scale to such a variety of warfare problems.”
Since WWII we have been in a “relatively” peaceful period. But the technological surge continues. Perhaps we are just coasting on the momentum of the military R&D that followed WWII – the internet, GPS, Artificial Intelligence, etc. However, military funding might be skewing technological progress today in less fruitful directions than capitalism or science-as-usual itself would. Perhaps this is why post-WWII technological progress has fueled the growth of paramilitary surveillance organizations such as the CIA and NSA and perfected drones, rather than addressing the environmental crisis.
Moreover, these new technologies are transforming capitalism itself: the internet, social media and big data have given rise to surveillance capitalism, the subject of a recent book by Harvard Professor Emerita Shoshana Zuboff, The Age of Surveillance Capitalism. Our personal behavioral data are amassed by Alexa, Siri, Google, Facebook et al., analyzed and sold for targeted advertising and other feeds to guide us in our lives; this is only going to get worse as the internet of things puts sensing and listening devices throughout the home. The 18th-century Utilitarian philosopher Jeremy Bentham promoted the idea of the panopticon, a prison structured so that the inmates would be under constant surveillance by unseen guards – click HERE. To update a metaphor from French post-modernist philosopher Michel Foucault, with surveillance technology we have created our own panopticon, one in which we dwell quietly and willingly as our every keystroke, our every move, is observed. An example: as one researches work on machine intelligence on the internet, Amazon drops ads for books on the topic (e.g. The Sentient Machine) onto one’s Facebook page!
The futurologists and Accelerationists, like many fundamentalist Christians, await the coming of a new human condition – for fundamentalists, this will happen at the Second Coming; for the others, the analog of the Second Coming is the singularity: the moment in time when machine intelligence surpasses human intelligence.
In mathematics, a singularity occurs at a point that is dramatically different from those around it. John von Neumann, a mathematician and computer science pioneer (who worked on the Manhattan Project), used this mathematical term metaphorically: “the ever accelerating progress of technology … gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue.” For von Neumann, the singularity will be the moment when “technological progress will become incomprehensibly rapid and complicated.” Like von Neumann, Alan Turing was a mathematician and a computer science pioneer; famous for his work on breaking the German Enigma code during WWII, he is the subject of plays, books and movies. In 1951, Turing wrote “once the machine thinking method has started, it would not take long to outstrip our feeble powers. … At some stage therefore we should have to expect the machines to take control … .” The term singularity was then taken up by Vernor Vinge in an article in Omni Magazine in 1983, a piece that develops von Neumann’s and Turing’s remarks further: “We will soon create intelligences greater than our own. When this happens, human history will have reached a kind of singularity, an intellectual transition as impenetrable as the knotted space-time at the center of a black hole, and the world will pass far beyond our understanding.” The concept of the singularity was brought into the mainstream by the work of Ray Kurzweil with his book The Singularity Is Near (2005).
In his writings, Kurzweil emphasizes how much of technological development is growing exponentially. The most famous example of exponential technological growth is Moore’s Law: in 1975, Gordon E. Moore, a founder of Intel, observed that the number of transistors on a microchip was doubling every two years even as the cost per transistor was being halved – and predicted that this was likely to continue. Amazingly, this prediction has held true into the 21st century, and the number of transistors on an integrated circuit has gone from 5 thousand to over 1 billion: for a graph, click HERE. Another example of exponential growth is compound interest: at 10% compounded annually, your money will more than double in 8 years, more than quadruple in 15 years, and so on.
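The arithmetic of both examples is the same exponential law. A minimal sketch in Python – the starting count of 5,000 transistors in 1975 and the two-year doubling period are taken from the text above; the functions themselves are illustrative, not drawn from Kurzweil:

```python
def moore_transistors(year, base_year=1975, base_count=5_000, doubling_years=2):
    """Transistor count per chip if the count doubles every `doubling_years`."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

def compound(principal, rate, years):
    """Value of `principal` after `years` at `rate`, compounded annually."""
    return principal * (1 + rate) ** years

# Doubling every 2 years from 5,000 in 1975 crosses 1 billion around 2011:
print(round(moore_transistors(2011)))   # → 1310720000 (about 1.3 billion)

# 10% compounded annually: more than double in 8 years, quadruple in 15.
print(compound(100, 0.10, 8))           # ≈ 214.36
print(compound(100, 0.10, 15))          # ≈ 417.72
```

The same doubling logic underlies both curves; only the time scale differs (two years for transistors, about seven years for money at 10%).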
Kurzweil argues that exponential growth also applies to many other areas, indeed to technology as a whole. Here his thinking is reminiscent of that of the French post-structuralist accelerationists Deleuze and Guattari, who also view humanity-cum-technology as a grand bio-physical evolutionary process. To make his point, Kurzweil employs compelling charts and graphs to illustrate that growth is indeed exponential (click HERE); because of this, the future is getting closer all the time – advances that once would have been the stuff of science fiction can now be expected within a decade or two. So when the first French post-structuralist, post-modern philosophers began calling, in the early 1970s, for an increase in the speed of technological change to right society’s ills, the acceleration had already begun!
But what will happen as we go past the technological singularity? Mystère. More to come.
5 thoughts on “Toward the Singularity”
I think you have the writing skills for a novel but can you be that good with AI too?
You mathematicians are amazing! Why are none of you running for president?
As far as the singularity goes, some pieces seem to be unknown, forgotten or ignored by its prophets:
1) At some point in diminishing computer-component size (how many transistors can fit on a chip), the Gaussian predictability of large numbers of molecules doing X in time Y becomes merely the probability of that behavior in that period of time. There are limits to usable material performance enhancement based on scale.
2) In material terms – and many believe that is all man IS – our inventions (and, later, inventions by our inventions) and material modifications of our own material bodies to include those inventions, as Kurzweil et al. predict: they may indeed produce a super-material “something” looking/acting like us… or not, surely superior to us in knowledge manipulation. But the singularity-shift to a new form of human? Not. For we are not material only, but a hybrid material-spirit being, not one of them alone nor both of them fused. Terminator sci-fi tales at least get that much of the future technology arc right.
1) already applies to Moore’s Law, pending a breakthrough in chip design. But the true believers think that will occur soon.
2) is the crux of the matter. Humanity’s central place in the scheme of things was challenged by Copernicus, then Darwin and now AI. It survived the first two.
A recent novel by Dan Brown, DIGITAL FORTRESS, predicts the future of human evolution to soon be hybrid human-computers. I think Ken has missed his calling – disseminating his messages through novels, not blogs, would increase readership and income.
Hmmm – food for thought. The real challenge would be to train an AI program to churn out sci-fi best-sellers while we all go to the beach!