Using the Label

Sep 11, 2012 21:46

To state it clearly: I am a singularitarian. Of the small-s variety, anyway.

Not one of the near-future types, not a Singularity-by-2045 kook like Kurzweil. Nor am I one of the exponential-tech-growth types, again unlike Kurzweil. And though it has nothing directly to do with the singularity, I also do not believe that cryo, even if it could work, is necessarily a good bet on a societal scale (though it's a good insurance investment for certain kinds of personalities). Nor do I believe there is anything inevitable about maintaining a society stable enough, long enough, to preserve and expand our current level of tech while we try to build an AI. Thermonuclear war, or the swine/bird/crocodile flu, is always a possibility.

However.

Intelligence can be engineered. We know this because nature accomplished intelligent thought out of mindless matter using a process much less efficient than engineering.* We might not ever manage it, but it can be done. And if we can eventually engineer a general intelligence, then we can subsequently learn to engineer an intelligence greater than ours. If we can learn to engineer an intelligence greater than ours, then the infant super-intelligence itself will be able to engineer even greater intelligences, newer versions, faster and better than we could do. This new version will then be able to duplicate the process, again faster than we could do. It could take a long long long time, but it would still be faster than evolution. And these intelligences would likely be scalable in a way that human brains are not by nature. I can't attach a backup brain to my own if I need to solve a particular problem more quickly, but a computer can add a backup processor.

This is all completely obvious, so obvious I didn't even think it was worth stating, except I came across some conversations recently where people were in complete befuddled denial. There is no certainty about any of this coming to pass, nor is there any certainty about how long it might take. We don't even understand the real nature of the problem yet. But it is a definite possibility, and it is an obvious game changer.

Being able to engineer intelligence is the single most important capability we could possibly develop. That's why a big honking word like singularity got hung on the concept.

Obviously, lots of susceptible folks pursue this because of a religious impulse. It is a literal version of the Holy Grail, potentially attainable with real human science. It is truly a Rapture for Nerds, and that's what tends to make the whole discussion so distasteful, ironically from both the zealously religious and zealously non-religious. One side presumably doesn't like the competition, and the other side doesn't like the woo. Faith-based thinking takes over, and we get outlandish claims about how very soon the Golden Age of the Future might arrive. What I have belatedly realized is that this stench of woo often drives otherwise intelligent people away from thinking about the core of the singularity. And the Basic Idea, that it is possible and therefore important, deserves a bit more attention.

I'm not advocating that someone like PZ Myers should stop lambasting Kurzweil when Herr Shortwhile says stupid stuff. What I'd appreciate, though, is some semantic nuance from him and others like him. If he says

"At some point, I expect artificial intelligences to be part of our culture, if we persist; they’ll work in radically different ways than human brains, and they will revolutionize society, but I have no way of guessing how."

then it doesn't make any sense for him to also say

"But I do not believe in the Singularity at all."
because his first quote there is the very definition of the singularity. The small-s version, anyway. This is a guy who's a proud atheist, remember. Given the sadistic semantic contortions that that particular word is constantly subject to, you'd think he would be slightly more sensitive about other similarly vulnerable labels. This is an important concept, and it needs a label. People have to have a flag to rally behind, and this cause is worthy. Yes, those followers can be kooky. So what? It's often (though not always) the good kind of kooky. They try to write code and find proofs for computer science theorems to increase our efficiency in answering questions by making better use of available processing power. Go them. We need a word to know them by, and singularity is what we got.

If the woo has won out, if the well has been poisoned, if the word's connotations are permanently altered... then I guess I'm willing to move on to a different label. There are certain other terms that I deliberately avoid. I could be wrong, but in my estimation, far too many clever people don't appreciate the power of branding. We could call it the AI Event Horizon or whatever...

...hey, I actually like that quite a lot...

Okay maybe I'm an AI Event Horizonist. The point is, if we can manage to create intelligences, which can in turn create intelligences, then our lives will indubitably be weirded beyond our current comprehension. This is worthy cause type stuff, and it should be treated as such, even given the woo that naturally is attracted to it. Pursuit of this particular Grail should be encouraged.

*If you don't know about the recurrent laryngeal nerve, you should. "This is not an intelligent design." "No engineer would ever make a mistake like that."

It took evolution more than three and a half billion years to come up with intelligent life from the first replicators. It's about a billion years if you start from multicellular life. Obviously, if given the opportunity, we can do better than that. Some of the more optimistic AI folks think we can manage the same in less than 200 years. In other words, they think engineering is 7 orders of magnitude more efficient with this problem.

I am not convinced, though I might be persuaded of 6 orders of magnitude if 90% of the human race doesn't succumb to an epidemic in the next thousand years.
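(For the curious, here's the back-of-the-envelope arithmetic behind those two figures, as a throwaway Python sketch; the 3.5-billion-year, 200-year, and thousand-year numbers are the ones above, and the rest is just scaffolding.)

```python
# Quick check of the "orders of magnitude" figures above.
import math

evolution_years = 3.5e9  # first replicators to intelligent life
timelines = {
    "optimists (<200 years)": 200,
    "maybe me (1,000 years)": 1000,
}

for label, years in timelines.items():
    ratio = evolution_years / years
    print(f"{label}: {ratio:.2e}x faster, about {math.log10(ratio):.1f} orders of magnitude")
```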

ETA: I don't recall personally seeing it before (not that that means anything), but "Event Horizon" for the tech singularity is in regular use. Not surprising. I really don't read much about it because there genuinely isn't anything else to say about it. Eventually it might happen, and if so, shit's gonna get real. That's about it.
