Saturday, February 7, 2009
Sarah Connor: "Look... I am not stupid, you know. They cannot make things like that yet."
This weekend, I finally finished reading James Trefil's book Are We Unique? It is a very accessible, introductory text for anyone interested in the study of human consciousness. You can even find it in the Frazar library (BF 444 .T74 1997). Trefil is the Robinson Professor of Physics at George Mason University and has appeared on NPR several times over the years. He also has a mustache like Don Mattingly.
As a scientist, Trefil chooses consciousness and intelligence as the defining characteristics that make human beings distinct from both animals and computers. The great majority of the book is devoted to the question of whether computers could ever approach anything like consciousness or artificial intelligence. Needless to say, Searle and the Chinese Room argument are discussed extensively. Trefil ends up contextualizing his own theory of consciousness within a larger general theory of complexity, defining consciousness as an emergent property of neuronal complexity. Like a philosopher, he ultimately leaves the question of artificial intelligence open (though his tone is decidedly skeptical).
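For anyone who hasn't run into the Chinese Room before, the intuition behind it is easy to capture in a few lines of code. The sketch below is my own illustration, not anything from Trefil's book: a program that matches incoming Chinese symbols against a rule book and hands back the prescribed reply, producing what looks like conversation with zero understanding of what any symbol means. The rule book and phrases are invented for the example.

```python
# A toy version of Searle's Chinese Room: the "room" follows a rule book
# that maps input symbols to output symbols. It manipulates shapes, not
# meanings. The entries below are made up for illustration.

RULE_BOOK = {
    "你好吗？": "我很好，谢谢。",        # "How are you?" -> "I'm fine, thanks."
    "今天天气好吗？": "今天天气很好。",  # "Is the weather nice today?" -> "Yes, very nice."
}

def chinese_room(symbols: str) -> str:
    """Return whatever reply the rule book dictates; shapes in, shapes out."""
    return RULE_BOOK.get(symbols, "对不起，我不明白。")  # "Sorry, I don't understand."

if __name__ == "__main__":
    # Looks like conversation from outside the room; no comprehension inside.
    print(chinese_room("你好吗？"))
```

Searle's point, of course, is that even a vastly bigger rule book would still just be symbol shuffling, and that passing for a speaker is not the same as understanding.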
After I finished reading the book, I couldn't help but wonder why human beings are so threatened by the idea of artificial intelligence. What is so sacred about intelligence as opposed to any other ability that human beings have? We have built machines that can "run" faster (cars) and are stronger (forklifts) than human beings, yet we don't feel threatened by these machines. However, when Deep Blue beat Kasparov in a chess match, we started to question our own uniqueness.
Is it evolution anxiety? Do human beings simply fear becoming a placeholder between animals and intelligent machines? Supposing we could create a conscious machine, would human beings cease being unique? After all, Australian orchids are not conscious entities but are still rather unique.
Or is it just a control issue as opposed to an issue of uniqueness? James Cameron seems to think so. We build the machines, and the machines eventually become more powerful and kill us (see The Terminator, Terminator 2: Judgment Day, Terminator 3: Rise of the Machines, and the forthcoming Terminator Salvation).
What do you think?