Singularity

Aug 16, 2010 09:08

haineux sent me some links from his friend mtravern about mtravern's recent invitation to the Singularity Summit and his subsequent reaction to it. I know a few Kurzweil fans, and I've been sitting on my own half-finished post about the Singularity, so I figure now's a good time to dust it off. I share some of mtravern's views, but I've also got my own take.

Unlike fuzzy concepts like God, acupuncture, dowsing, or ghosts, there is genuinely undeniable objective evidence that technology is getting faster and better, that this growth often happens exponentially, and that technology gets used to make even better technology even faster. Growing populations mean more scientists. Technological advances make future advances easier. The whole process causes accelerating change. The fundamentals of Singularity theory seem quite plausible. But when considering the Singularity I still get hung up on step 1, because the central concept seems poorly defined.
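
The compounding part, at least, is easy to model. Here's a minimal sketch - a deliberately crude toy model of my own, with an arbitrary 5% annual improvement rate - showing that when each advance makes the next one easier, growth compounds exponentially:

    # Toy model: the rate of progress is proportional to the current
    # technology level, so every advance speeds up the next one.
    tech = 1.0
    rate = 0.05  # assume 5% improvement per year (arbitrary)

    for year in range(101):
        if year % 20 == 0:
            print(f"year {year:3d}: tech level {tech:8.1f}")
        tech += rate * tech  # this year's progress scales with current tech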

I've heard that the Singularity will happen when the pace of technological progress becomes too fast for any human to understand. W. J. M. Bottle, aka "Datas: The Memory Man," was the last man on earth who knew everything. He was born on July 20, 1878, and toured the country in a circus sideshow answering any question that anyone could ask him. Since the late 1800s human knowledge has extended far beyond anything that one man could know. Did we reach the Singularity sometime in the late 1800s, when our world became too incomprehensible for one human to understand? If not then, when?

I've heard that the Singularity will happen when AIs start designing machines too complex for a human to have designed. Verilog is a hardware description language introduced in 1985. It revolutionized microprocessor design by allowing existing computers to design new generations of computers. Tomorrow's more complicated CPUs are designed by today's existing, less complicated CPUs. Did we reach the Singularity at or shortly after 1985, when machines started designing things more efficient and complicated than humans could design unaided? If not then, when?

I've heard that the Singularity will happen when "the exponential growth of technology to intersect and surpass the processing of the human brain". But technology already regularly surpasses the processing of the human brain. For decades, Las Vegas casinos have had rules against using computers which can count cards better than human brains. Kenneth Colby's PARRY passed a Turing Test in the early 1970s. I was in college when Deep Blue beat Kasparov. Did any of these events mark the arrival of the Singularity? If not then, when?

I've heard that the Singularity is a "sudden growth of any technology," but the nature of exponential growth is that today's growth always seems sudden compared to earlier, less rapid growth. Humans who took 30 million years to develop stone tools would have been amazed by the relatively swift 3 million year development of bronze tools, bronze agers would have been amazed by the 1000 year leap to iron tools, and ironmongers would have been amazed by the 500 year leap to steel. So what? Does the fact that I can carry a computer in my pocket that's more powerful than the $5000 computers I used in college mean that the Singularity is finally here? If not then, when?
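
Here's a quick way to see why, using a toy doubling sequence (my numbers, purely illustrative): on any exponential curve, the growth in the most recent interval exceeds all of the growth that came before it combined, so progress feels "sudden" no matter where on the curve you stand.

    # On an exponential curve, each interval's growth exceeds all
    # prior growth combined - so every era looks "sudden" from inside it.
    levels = [2 ** n for n in range(11)]  # doubling every interval

    for n in range(1, 11):
        recent = levels[n] - levels[n - 1]  # growth in the latest interval
        prior = levels[n - 1] - levels[0]   # all growth before that
        print(f"interval {n:2d}: recent growth {recent:4d} vs all prior {prior:4d}")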

I would have no problem if the "Singularity" were framed as an ongoing process like evolution. But it's almost always described as a specific event, like the Rapture, which I guess makes me one of those Christians who think it already happened. A gravitational singularity is a very specific, clearly defined point in space surrounded by an equally well-defined event horizon. You know whether you're inside one or not. In contrast, I don't know how to tell when we've actually reached a technological singularity. I know where it's supposed to be - somewhere in the future - but I can't determine how close it is, or how to tell when we've gotten there. As with "God," singularity enthusiasts seem happy to discuss their favorite concept in great detail among like-minded fans, while remaining expansively vague when defending the idea against skeptics. Ideas must be distinct before reason can act upon them, and I have yet to read a description of the Singularity that's distinct enough for me to decide whether to take it seriously.

(Beyond its basic definition, I'm not sure how much precedent there is for unbounded exponential technological growth. Modern steelmaking dates to the mid-1800s, and technology has since produced stronger/lighter/better alloys of steel, but advances in steel have not increased exponentially. There are physical limits to how strong steel can be, and modern metallurgy is asymptotically approaching those limits. The automobile was arguably invented as far back as the 1600s, and over the last 400 years automotive technology has certainly seen extensive progress, but recent progress has not increased exponentially. There are physical limits to how fast and efficient cars can be, and modern automotive engineering is asymptotically approaching those limits. Rockets and space travel are better now than they were in the 70s, and today's rockets are cheaper and more reliable than our parents' rockets, but recent progress has not increased exponentially. There are physical limits to how well reaction-thrust vehicles can work, and modern rocket science is asymptotically approaching those limits. Even within processor design there are physical limits that computers are asymptotically approaching. The free lunch has arguably been over for the last 5 years. Progress has continued, but it's been a very different kind of progress, with advances that are much more difficult to leverage. AI research is very impressive, and I think there's a lot of potential for further progress, but I'm less confident than the Singularity folks that we can extrapolate a smooth exponential curve over the next 100 years.)
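
To illustrate that last point with a toy comparison (the ceiling, rate, and units here are all arbitrary numbers of my own): pure exponential growth versus logistic growth toward a hard physical limit. The two curves are nearly indistinguishable early on, which is exactly why extrapolating a smooth exponential 100 years out is risky.

    import math

    CEILING = 1000.0  # hypothetical physical limit (arbitrary)
    r = 0.1           # growth rate (arbitrary)

    def exponential(t):
        return math.exp(r * t)

    def logistic(t):
        # Starts near 1.0 like the exponential, but saturates at CEILING.
        return CEILING / (1 + (CEILING - 1) * math.exp(-r * t))

    # Early on the two track each other closely; by t=120 the
    # exponential has run off to ~160,000 while the logistic sits near 1000.
    for t in range(0, 121, 20):
        print(f"t={t:3d}  exponential={exponential(t):10.1f}  logistic={logistic(t):7.1f}")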

Update: Steve Novella is a fair and sensible person.

