Book Review: The Singularity is Near
Reviewed by: Bart Leahy
Title: The Singularity Is Near: When Humans Transcend Biology
Author: Ray Kurzweil
List Price: $18.00
By 2045, our computing, genetic, nanotechnology, and robotics capabilities will have advanced exponentially, to the point where machine intelligence exceeds human intelligence and, in effect, grants us nearly anything we want. That is the essential premise of this book of technological optimism by scientist and inventor Ray Kurzweil.
Addressing issues as diverse as brain chemistry, future warfare, virtual reality, and programmable blood, Kurzweil gleefully describes how such a transcendent leap in technology will be possible—mostly by developing machines that “reverse-engineer” the processes of the human mind. Because electronic minds are inherently faster than human brains, Kurzweil feels that eventually machines will become so advanced that we will be able to duplicate and then “upload” human consciousness. This would allow human beings to experience the faster thinking processes through speed-of-light computing while computers gain the programming knowledge to mimic, replicate, and then exceed human intellectual capabilities, leading to what is called the Technological Singularity.
This is all in keeping with "Moore's Law," strictly the observation that the number of transistors on a chip doubles roughly every two years, which Kurzweil generalizes into the claim that computing capability as a whole will keep doubling for the foreseeable future. As our computers expand their capabilities, they will enable us to develop smart blood cells; physically transform our bodies; connect our senses directly into full-sensory virtual realities; instantaneously increase our knowledge; and eventually enable us to become electronically, if not physically, immortal. As machines reach down farther and farther into the quantum levels of computing, all the matter of the universe would eventually be capable of storing knowledge. This, in essence, would make the entire universe conscious and would mean that humans would be creating the functional equivalent of God.
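The compounding at the heart of Kurzweil's argument is simple arithmetic. As a minimal illustration (the starting value and the exact two-year doubling period are illustrative assumptions, not figures from the book), doubling every two years multiplies capacity about a million-fold over forty years:

```python
def capacity_after(years, start=1.0, doubling_period=2.0):
    """Capacity after `years`, assuming it doubles every `doubling_period` years."""
    return start * 2 ** (years / doubling_period)

# Forty years of doubling every two years is 20 doublings: a 2**20-fold increase.
growth = capacity_after(40) / capacity_after(0)
print(f"40-year growth factor: {growth:,.0f}x")  # prints "40-year growth factor: 1,048,576x"
```

It is this relentless compounding, rather than any single breakthrough, that drives Kurzweil's 2045 timetable.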
The book has only nine chapters, some of which take longer to read than others. Kurzweil begins by describing the "six epochs" of civilization, which culminate in his Singularity. Chapter two overwhelms the reader with examples and charts depicting the exponential development and progress of computer hardware. Chapters three and four discuss the computing capacity of the human brain and how scientists are attempting to reverse-engineer it. This is where the book drags, especially if one is not particularly interested in brain chemistry or computing. Chapter five discusses the revolutions occurring in "GNR" (genetics, nanotechnology, and robotics/computer intelligence). Chapters six and seven play with the ideas and implications of the Singularity and discuss why Kurzweil declares, "Ich bin ein Singularitarian." After all the boosterism of the first seven chapters, chapter eight offers a remarkably balanced discussion of the promise and peril of GNR, and even offers some potential solutions, but Kurzweil has little doubt that any perils can be overcome through more, not less, development. And finally, he finishes with a long chapter entitled "Response to Critics," in which he takes on the various arguments for why the Singularity is impossible.
As a fan of science fiction, I have no problem believing that the Singularity is possible, and as an English major, I would not in any case try to argue with Kurzweil about the feasibility of his ideas. I would question instead the desirability of his scheme and, more importantly, its potential impact on space settlement.
Kurzweil addresses space activities in three contexts: space solar power (SSP), interstellar exploration, and the potential for an asteroid strike upon the Earth. In the first case, he speculates that SSP could be made more efficient by manufacturing solar cells at a nano-level, enabling more surface area to capture energy from the sun. In the field of interstellar exploration, Kurzweil rejects the utility of sending “mushy” bodies through space, preferring to rely on nano-machines and robotic hardware.
Kurzweil quickly dismisses the possibility that an asteroid strike could wipe out technology-building civilization. He states that a "planet killer" would be detected early enough to be stopped. This ignores the current state of anti-asteroid technologies and the potential for a much smaller object to smack into the Pacific and wipe out most of the world's high-tech centers with tsunamis (as depicted in the novel Lucifer's Hammer). High technology might continue, but at a much reduced scale, and in the aftermath of such an event, machinery like conscious computers and nanotechnology could become luxuries. The "Singularity" might be near, but it is not inevitable.
There are, of course, advantages to a "post-human" future, where human consciousness would be enhanced by links to super-intelligent computers, and where nanotechnology could create nearly any substance in mass quantities. Single-stage-to-orbit spacecraft, for example, might finally be designed and built. Materials could be extracted more efficiently from the lunar soil or the asteroids. Human beings could be genetically modified to survive on Mars, or nano-machinery developed to rapidly terraform it. Space advocates, should they be so inclined, might pursue partnerships with nanotechnology, genetics, or artificial intelligence companies to advance space-settlement-related efforts. They also could advocate for using these technologies ethically on Earth and other worlds.
However, space advocates also must consider the potential for mischief with such godlike technology. Stories ranging from Greek mythology to "The Sorcerer's Apprentice" to "Frankenstein" to the works of Michael Crichton have warned us about the dangers of hubris. We also must consider the dangers of secondary effects, "the law of unintended consequences," which attend any technological decision we make today. Technological optimism of Kurzweil's sort is slowly giving way to technological pessimism bordering, as he admits, on outright anti-technology sentiment. Again, I am not denying the possibility of Kurzweil's technological Singularity; I simply question his uncritical acceptance of it. Give human beings machines capable of doing anything and they will do anything, for good and evil. Kurzweil seems to dwell too much on the good to give the evil a second's thought.
Those who argue against the space program usually assert that we should make life perfect here before we go on to other worlds—and The Singularity Is Near seems to promise exactly that. However, would perfecting life here remove the very incentive that makes us want to go? If so, this could be supremely dangerous to the human species. The Singularity might indeed be near. The question space advocates must ask is: do we want it to be?
© 2007 Bart Leahy