Singularity

Sep 26, 2008 10:03

So the next meeting will be next Friday (info for that is linked here, along with all the stuff from last meeting). The topic is the Singularity. I'll be out of town taking my LSATs the following morning. And honestly, I'm kind of happy about that.

I don't like talking about the Singularity.

I'll tell you why, of course.

The Singularity is, in my opinion (to quote Sarah), just another apocalypse meme. Libby calls it the Transhumanist Sky Wank. Any sticking point that comes up in discussion, anything that can't be explained or anticipated, is too often answered with "well, when the Singularity comes..." - and not only do I think that's irresponsible to a drastic degree, I don't think it will be anything like that. There is never one answer.

But I'm getting ahead of myself. What is the Singularity? At least in the studies I've done, it refers to something like the singularity of a black hole - a force so powerful that we can only measure its effects up to a certain point, only predict them based on what we know up to another point, and beyond that, we have no idea.

So, supposedly, we evolve at an exponential rate so far as tools are concerned. It's even argued that we evolve through our abilities (as linked to tools, but also biological changes) exponentially. Which looks something like this:


(It should be noted that the exponents are what's changing - if you plotted it with evenly spaced integers instead of exponents *cough*I'mrustyatmath*/cough*, it would look more like this, in which case the curve has a "Knee." The Singularity is the Knee, because that's when things start to shoot straight upward.) Supposedly, when we hit the knee of the curve, (nearly) everyone is in constant Future Shock (more on that in a later entry, or now if you ask nicely) and shit hits the fan. The chosen few who keep up with the tech magically ride the wave and become tech gods.
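
To make the knee a little more concrete, here's a rough sketch in plain Python - my own quick illustration, not anything from the meeting materials - comparing steady linear growth with doubling growth. The doubling numbers look tame for a while and then blow past the linear ones, and that take-off is the knee.

```python
# Rough illustration of why exponential growth appears to have a "knee":
# compare linear growth (add a fixed amount each step) with doubling growth
# (multiply by a fixed factor each step). The values are arbitrary; only the
# shape matters.

steps = range(0, 11)

linear = [10 * t for t in steps]        # grows by 10 per step
doubling = [2 ** t for t in steps]      # doubles each step

for t, lin, dbl in zip(steps, linear, doubling):
    print(f"step {t:2d}: linear {lin:4d}   doubling {dbl:5d}")

# For the first few steps the doubling column looks tame, then it blows past
# the linear one - that take-off point is the "knee." Plotted on an
# exponent-scaled (log) axis, the same doubling curve flattens back into a
# straight line, which is the distinction the paragraph above is gesturing at.
```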

Because the tech we build allows us to build better tech, and we're developing AI, self-improving software, self-replicating machinery, and nanobots, some people feel that we're at this Knee. Some see this as OMGH4XTEHFUTORZ!!1!, others see it as an apocalypse where nearly everything dies, and/or just most of humanity except a chosen few, and/or (insert Matrix reference here).

And while mostly this just makes me think of The Rapture (and it's just as unlikely, silly, and self-importance-inducing - "we live in the End Times, everything has been building up to us and nothing will ever follow! DOOOOM"), it's also not what we're like as humans. What we do is survive. At all costs. If something is too big and scary and we're not ready for it, it doesn't happen. Hence the debates over whether figureheads in history were actually geniuses (most of the ideas they put forth had already been circulating at some point prior, but the world just wasn't ready. Also, case in point: David Bowie).

Also, we're good at adapting. Our tools are ways of doing this. If things are moving faster, we're going to make ourselves go faster too.

Really what it boils down to is that we construct tools to better deal with the environment we've created. This alters our environment, so we construct more relevant tools. Rinse and repeat. Why would we end up with tools for an environment that doesn't exist?

We don't get a clean break to start anew in an Age of Utopia. We have to build it with the crap we've got now. Don't wait for some great Future. Because it's now. Choose what tech you use wisely. Make it work for you. Appreciate marketing for what it is - marketing - and then figure out what actually works, and don't buy into something you don't believe in.

discussion group, singularity
