I can't give you what you want...

Oct 23, 2011 20:01

This post is going to be a little scattershot but it ties in with enders_shadow's recent post and the monthly topic ( Read more... )

fiction, finance, opinion


Comments 28

mrbogey October 24 2011, 03:16:36 UTC
Gold standard vs. fiat currency is a bit different from the thrust of the argument you seem to be making, though I don't believe I know the nuances well enough to explain it. You seem to be talking about money as having an absolute value when used with a gold standard, when a gold standard does no such thing.

It goes back to the root of what "money" is and isn't. It's a common medium of exchange that has value either intrinsically or by fiat. But its value is never absolute.

Reply

sandwichwarrior October 24 2011, 04:42:15 UTC
But I'd be willing to bet that the average rank-and-file supporter would not be able to make that distinction.

Reply

Gold has NO "intrinsic value" montecristo October 24 2011, 05:18:59 UTC
Gold has objective, intrinsic properties to which people impute value. I think all we are seeing in your post is an incomplete understanding of the concept.

There is no intrinsic value. Value is entirely subjective and context dependent. Fiat currency is just a mechanism allowing the person or persons with fiat authority over it to in effect veto the value imputations of consumers and traders acting in the market. That is why statists want a "valueless" fiat currency, because it gives them power over others.

Read Hans-Hermann Hoppe's "Why the State Demands Control of Money" for an understanding of money, how it works, and why those who run the State always want control of it.

Reply

Re: Gold has NO "intrinsic value" sandwichwarrior October 25 2011, 18:56:38 UTC
"What the statists want" doesn't even enter into it.

Currency by its very nature is devoid of any objective or intrinsic value.

Reply


meus_ovatio October 24 2011, 03:38:35 UTC
If by "writing a sci-fi novel" you mean "penning an ideological fillibuster", you might want to rethink your project.

Reply

sandwichwarrior October 24 2011, 03:49:43 UTC
I'm not Ayn Rand.

Nor do I want to be.

Reply

a_new_machine October 24 2011, 03:56:18 UTC
Eh, sci-fi has a lot more room for ideology than many other genres, because it's so often about society and humanity in an era of [...]. This lends itself to navel-gazing and philosophy.

Reply

sandwichwarrior October 24 2011, 04:40:38 UTC
Well, that's kind of the point, ain't it?

Reply


Keep going; you're onto something montecristo October 24 2011, 05:28:54 UTC
The more I think about it, the more I have become convinced that goods or labor have no absolute value. Their value is entirely dependent on how badly someone wants them.

You are close to a fundamental truth here. "How badly someone wants" something is measured in terms of what they are willing to forgo in order to have it. Every last human action is an expression of a decision involving trade-off.

If you're exploring value in this direction, you may want to examine the concepts you've identified in comparison to what the Austrians have written; specifically, what Mises wrote about a study called praxeology. A good, very accessible tutorial on the subject is a series of YouTube videos, most less than five minutes in length, created by a user named Praxgirl. The series starts with a four-and-a-half-minute introduction and has thirteen episodes so far.

Reply


(The comment has been removed)

sandwichwarrior October 24 2011, 19:12:58 UTC
I agree that currency is "ultimately valued in other things," but I think that a lot of what people assume to be universal is in fact not. For instance, outside of certain industrial applications, what "inherent worth" does gold actually have?

Not a whole lot. It is only valuable because people want it.

The jumping-off point for the whole project was a series of conversations Chris and I had about what an artificial or alien intelligence would actually want. Would traditional human concepts of value even apply to them?

Reply

(The comment has been removed)

sandwichwarrior October 24 2011, 20:52:42 UTC
Can I ask how prevalent AI is in your world, and what kind of AI it is? A sentient computer, or an automaton?

Not very, and they are more like the "HAL 9000" variety of sentient program than Asimovian robots ( ... )

Reply


(The comment has been removed)

(The comment has been removed)

sandwichwarrior October 24 2011, 20:55:55 UTC
The Moon is a Harsh Mistress is one of my favorite books of all time and has provided a good deal of inspiration.

Reply

sandwichwarrior October 24 2011, 21:05:51 UTC
To get to the guts of this question I think you'll need to spend a while fleshing out what motivates an AI, and what it seeks to achieve by getting [whatever has value to it].

That is the question that led me down the tangents described in the OP.

I still haven't figured it out.

Is there some goal it would destroy itself to accomplish - or which, once achieved, it would simply shut itself off just like any other program exiting on a success condition?

I actually considered this as a way to disable more "militant" AIs: give 'em what they want and then zap 'em when they drop into standby mode. ;P

The first 3 or 4 chapters of Metamorphosis of Prime Intellect are really good explorations of this theme, and I bet it will inspire you to think about AI psychology and motivations in new ways.

I'll need to look into this.

Reply


