I'm reading Ray Kurzweil's The Singularity Is Near, and there's something I don't get about the theory of exponential growth, or even exponentially growing exponential growth: physical or historical flukes.
This cycle of machine intelligence's iteratively improving its own design will become faster and faster.
So a Singularity might involve just part of a civilization changing phase from human to superhuman. But I want to ponder the possibility of disasters (natural or self-inflicted) causing a Singularity to fail, to sputter like a hand-cranked biplane propeller... suppose it takes many tries before a Singularity actually gets airborne?
Including the story that I'm starting to write, which depends on a chunk of human civilization getting really close to a technological singularity, freaking out about what's on the other side, and sabotaging itself into backing down. But then the real story lies in what remains: how far you'd have to break things down for the plane not to take off immediately on the next try, and whether the failure of that biplane engine could nevertheless lead to a different flying machine later on.