Aug 11, 2008 00:30
Why am I excited about the Singularity? Why do I want to be part of what makes it?
Leads me to:
The singularity is like the imagined, hoped-for heroic moment; after it, nothing is the same - but the crucial thing that struck me long, long ago about the Hero Moment is that, well, in all honesty, what did you do to deserve it?
Classic example: how often have you hoped something would happen to someone just so you could save them from it? The classic damsel in distress, knight to the rescue, true love ever after.
Only, the hero, in the stories that actually have length, never recognizes the moment in which they become the hero. The concept of having a fate to be a hero always struck me as odd: heroes aren't fated to become heroes; they are simply unable to do anything else and remain who they wish to be. Gandhi wasn't dragged along by an unseen hand; he stood there one day and said, I will do this, for to not do it is to become something abhorrent. And yet we could say it was fate, and really, how can you tell the difference? (On a side note, I am currently unconvinced that it's possible to determine what model of time we exist in, although I can see on the horizon some tests that might pan out.)
Which segues nicely back to the original topic of the singularity: that theoretical moment after which we cannot predict, thought to be brought about by the advent of strong AI - intelligence capable of making a more intelligent version of itself.
Only, and here's a thought - don't we already do that? Or, rather, try? If you assume (or only measure those places where) we are getting better at educating our children, then we are strong AI - we are creating more intelligent versions of ourselves. (Which leads to the idea that perhaps the "trigger" for the singularity is an intelligence that is either only capable of making a more intelligent version of itself - since we do seem so good at going "downhill," so to speak - or at least one that, most of the time, produces something more intelligent than itself, recursively, which leads nicely into - ) In one sense, haven't we been doing that since day one? The classic interpretation of "evolution" is that it results in intelligent life (Tau takes issue with this, and I'm inclined to trust him); more intelligent = more evolved (which is a fallacy - perhaps diversity results in intelligence). One could then model the singularity as occurring once there is an intelligent entity capable of reproduction such that entities more capable of making more intelligent versions of themselves are selected for. So, if we make nerds our sex symbols, do we have the singularity?
And on another note - if we start with the thought that perhaps "singularity" is a misnomer, we either go someplace interesting or end up with a better idea of what it really means. I have trouble imagining a point in time at which you cannot predict what the next moment will bring, especially if we assume it's universal (that is, that no intelligence taking part in the singularity will be able to see past it). I don't know... I just can't imagine that, but maybe later...
It also doesn't make a lot of sense if we take my position that the singularity actually occurred eons ago, when humanity first started irrevocably becoming more intelligent, but it does present the interesting idea that perhaps the singularity is an entity-local phenomenon - I don't think my grandmother can really predict what the next ten years will bring (Hell, I have doubts that I could), so for her you might say the singularity has occurred.
So, think about the horizon. That's what we're talking about, really. The horizon is the point past which you cannot see, and the singularity is when the distance to the horizon is zero. Only, that can't ever happen - which means, we can never really reach the singularity.
Even including transtemporal effects! Information from the future is only as accurate as it is testable, and I can't currently think of a way to avoid using non-transtemporal effects to do that testing.
This thought is less interesting than I at first supposed, and in the end I think I just want the option of reversible immortality, which is really me having this issue with the reality that I only live one life.
(Ahhh, good insanity is really when your model of reality fits all available data, and other people don't share that model. I could assume that this life is the life I'm living so that I know what it's like not to know I've lived more lives, and you'd call me crazy. But you couldn't disprove it.)