Error Condition

I've been listening to session recordings from the recent Singularity Summit. One of the speakers, Peter Norvig from Google (just a few blocks from the Mountain View coffee shop where I'm now typing this), addressed the speculative concern that Google's vast array of computer systems might spontaneously combust into some sort of consciousness, à la movies like The Forbin Project or Ghost in the Shell.

To my surprise, they have already spent some billable time on this, thinking about what to monitor: unexplained and unattributed traffic between nodes, across disks, and so forth. In fact, they seem to be ready to shut it down at any time. Which implies that to Google, itself an entity already behaving according to its own logic as a rational economic being: intelligence, in a computer, is an error condition.

Comments on "Error Condition"

Ed Nixon
November 6, 2007 08:24 AM

Just came back to this from your current post (Nov 6). I think it's very strange that there is so much confusion among putative scientists about what models are good for. It seems to me that much of the more... optimistic stuff said about computers and consciousness and/or intelligence arises out of a mistake: confusing explanatory models with operational models. And, of course, I can't resist pointing to my old hobby horse, John R. Searle, and the venerable battle waged with the Pinkers, etc. over these issues -- the letters and discussion pages of the New York Review of Books are a great source of entertainment on this (as well as much else). Maybe this is the error condition you're referring to. :-)



All content is © 1994-2017 by Kevin Bjorke. All Rights Reserved.