Time To Blow the Dust Off...

Jul 20, 2010 00:52

As I write these words, most of my clothes are in the dryer. I waited a couple days too many, so I had to dig pretty deep in my dresser to find jeans to wear while washing the ones I actually like. I'm wearing a pair of ridiculous skinny hipster-jeans right now, and it occurs to me that pairs of pants are really a lot like aspirations. It's ( Read more... )


Comments 10

ncarraway July 20 2010, 05:30:43 UTC
Here's one future as I see it going down. You're going to make games. You're going to make incredible games, because you'll bring your writing skills to bear as well as your coding skills. You're going to become well known. Meanwhile, you're going to communicate. Your games may feature well-meaning programmers who create something too powerful for them to control; your time in the public spotlight will be used for matters of grave import. If there's one thing that the AGI effort needs, it's careful, detailed press that becomes impossible to ignore. Everyone with the skills and motives to be seeking AGI needs to be perfectly, painfully conscious of the risks, and no matter how many brilliant minds are working on the right project, all disaster requires is for another team's minds to be one iota less careful. There cannot be too many voices spreading that word; there cannot be too many media in which to spread it. You can spread it through games because someone has to do so ( ... )



la_flechette July 20 2010, 05:52:41 UTC
On a technical note: I've mostly figured out what you're talking about (between Foster's comment, LW, and your post itself), but your topic really isn't clear here. My response on first reading the post was "Wait, so what's the question? What risk, changing career paths?"

I will ponder your question, and respond later, but meanwhile: A vague disclaimer plan is nobody's friend.



myopian8 July 20 2010, 10:14:35 UTC
I *still* have no idea what's going on.

Or rather, I have some idea, but it is nowhere near clear.

Do YOU even have a clear idea? When I am in dis dress, I often don't. But if you do, can you help a sister out?


lonelyantisheep July 21 2010, 01:25:48 UTC
Heh, I think my idea of what's going on was clear enough in my own head that big chunks of it didn't seem to need explaining; I've edited my post a bunch, and hopefully it will make more sense now.



From Claire's Philadelphia Roommate Dan (anonymous) July 20 2010, 22:48:06 UTC
Hey Sam ( ... )



myopian8 July 22 2010, 11:32:21 UTC
I am an xkcd nihilist. In my opinion, you should do whatever would make YOU happy and fulfilled. If you would always feel that you were missing something by not devoting your life to making video games, then you should make them. Likewise, if you would always feel that you were wasting your life by making video games, because there's some possibility you could do something to save humanity, then you shouldn't make them.

Have you considered this third possibility: making friendly AI that has applications to video games (which you could be involved in implementing) and to other, more practical problems? Seriously, I don't think this is as black-and-white a career choice as you do. If you choose to focus more on the research side, you, as a research lab head, could partner with a video game development group that implements the games; or, as a programmer, you could partner with a research lab to test the usefulness of various AIs with real-life implementations and bring in funding. DO NOT discount the power of funding!

tl;dr You can have it all.


lonelyantisheep July 22 2010, 16:48:22 UTC
I consider my happiness and fulfillment to be acceptable sacrifices if that will achieve a sufficiently huge moral goal, which existential risk reduction and/or FAI certainly qualify as. (The former probably reduces to the latter ( ... )


la_flechette July 22 2010, 22:29:02 UTC
I don't have an answer for you, of course, but a couple of things occurred to me that I would add to your deliberations (if they aren't already there):
*The likelihood of you making a significant contribution. You've definitely alluded to this, but specifically if the "6 billion people * tiny chance of disaster > 1" utility function is a large part of your motivation, it's worth considering "6 billion people * tiny chance of disaster * possibly tiny chance that my efforts will change the likelihood of disaster >? 1" as well ( ... )
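
For concreteness, here is a minimal worked version of that comparison; the numbers below are illustrative assumptions made up for the sketch, not estimates taken from the post or this thread:

    % Assumed, purely illustrative quantities:
    %   N   = 6 \times 10^{9} lives at stake
    %   p_d = 10^{-3} chance of disaster
    %   p_e = 10^{-6} chance that one person's efforts change the outcome
    N \cdot p_d           = 6 \times 10^{9} \cdot 10^{-3}                = 6 \times 10^{6} \gg 1
    N \cdot p_d \cdot p_e = 6 \times 10^{9} \cdot 10^{-3} \cdot 10^{-6}  = 6

The first quantity is overwhelming on its own; the adjusted one hovers near the threshold and is dominated by the hardest-to-estimate factor, p_e, which is exactly the point of the comment above.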


lonelyantisheep July 23 2010, 01:25:52 UTC
Believe me, I'm not about to start working feverish 16-hour days and entirely ignoring my own well-being; as you pointed out, it wouldn't help.

As for my ability to make a significant contribution, well, this is where we start to get into territory usually called "arrogance."

I am very, very intelligent. I acquire and integrate knowledge quickly and easily; I kick ass at many varieties of formal reasoning. I have accomplished nothing particularly exceptional, but for the vast majority of my life I was afflicted by the related conditions of ADHD and a lack of motivation. Between Adderall and a desperate, all-consuming moral imperative, I think I can turn that shit around and do something meaningful.

That's my hypothesis, at least. I won't find out until I test it.



