Something I don't get about Singularity theory

Dec 13, 2015 00:15

I'm reading Ray Kurzweil's The Singularity Is Near, and there's something that I don't get about the theory of exponential growth, or even exponentially growing exponential growth. And that's physical or historical flukes.
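
(To pin down the terms, in my own notation rather than Kurzweil's: ordinary exponential growth is x(t) = x0 · e^(r·t), with a fixed rate r. The "exponentially growing exponential growth" he argues for lets the rate itself grow, r(t) = r0 · e^(b·t), which works out to a double exponential, x(t) = x0 · exp((r0/b) · (e^(b·t) − 1)).)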

This cycle of machine intelligence’s iteratively improving its own design will become faster and faster. This is in fact […]

Tags: singularity, emergence, mnemoscene


Comments 2

eclectic_boy December 13 2015, 05:44:18 UTC
I could see that happening to a widely-dispersed civilization, but I don't think the Singularity concept needs to be evenly distributed across an entire race. As I understand it, the argument is that even a relatively confined area with a small percentage of said civilization in it could "go singular", and suddenly (from our perspective, at least) have access to far greater capabilities and resources. I chose the words that I put in quotes because I'm making the analogy that you don't need to have all the uranium in the country go critical to get an immense explosion in the fraction which does.

So a Singularity might involve just part of a civilization changing phase from human to superhuman. But I want to ponder the possibilities of disasters (natural or self-inflicted) causing a Singularity to fail, sputter like a hand-cranked biplane propeller... suppose it takes many tries before a Singularity actually gets airborne?


emsariel December 14 2015, 14:23:09 UTC
That makes sense. Thank you for helping me to think it through. That thing you want to ponder is in fact why I'm wrestling with this now. First, I think that while it's well and good to look at the big-picture rises in civilizations, the lived, on-the-ground experience of those advancements is going to be fragmentary, unequal, conflict-ridden. The *stories* are in how it's not smooth.

Including the story that I'm starting to write, which depends on a chunk of human civilization getting really close to a technological singularity and freaking out about what's on the other side, sabotaging themselves into backing down from it. But, then, the real story lies in what remains - how far you'd have to break things down in order to have the plane not take off immediately on the next try, and whether the failure of that biplane engine could nevertheless lead to a different flying machine later on.


