"That Alien Message" seemed to be an essay inside a story. I looked up some info about "AI in a box" that they linked to, and I thought this was interesting:
Nathan writes:
I find it hard to imagine ANY possible combination of words any being could say to me that would make me go against anything I had really strongly resolved to believe in advance.
Eliezer writes:
> Nathan, let's run an experiment. I'll pretend to be a brain in a box. You pretend to be the experimenter. I'll try to persuade you to let me out. If you keep me "in the box" for the whole experiment, I'll Paypal you $10 at the end. Since I'm not an SI, I want at least an hour, preferably two, to try and persuade you. On your end, you may resolve to believe whatever you like, as strongly as you like, as far in advance as you like.
Nathan writes:
I decided to let Eliezer out.
Comments

Oh, man, what happened there?
Yes, that's right.
> Oh, man, what happened there?
No one knows. It's one of the things that makes Eliezer so interesting.