r6 and I discuss his theory that entropy is subjective

Jun 01, 2005 03:18

I've never been satisfied with the solutions I've seen to Maxwell's Demon.
I take r6's interpretation of entropy as an agent-dependent quantity related to the agent's knowledge, and a measure of what one can do with this knowledge: knowledge is power. According to his theory, an all-knowing being ( Read more... )

physics, phil.sci

gustavolacerda June 1 2005, 08:56:52 UTC
Thanks for the comment.

I briefly looked over the thread, and I noticed you asked "is entropy defined for a macroscopic particle". I think I know what you're asking here, but I'm not sure, so correct me if I misinterpreted. I think what you're asking is: does entropy apply to classical systems which have no constituents small enough for quantum mechanics to play a role? And the answer is yes. Entropy is an entirely classical concept;

What I meant was "macroscopic" in the sense of ping-pong balls: if we had a huge box in space (zero-gravity vacuum), with billions of ping-pong balls bouncing around in it, could we define entropy based on these observable states?
The problem for me is that it's not clear how you count states: at what level of detail do you look?

It seems to me that entropy obeys a "statistical" law rather than a "physical" one: the 2nd law says that any system will tend to end up in the more probable set of states, and that without knowledge of the system, one cannot bring it into a less probable set of states (these differences in probability being huge).

Does this make sense?

darius June 1 2005, 14:23:59 UTC
ping-pong balls: Yes, though in practice the entropy from microscopic variation would be much greater still.

level of detail: See http://en.wikipedia.org/wiki/Entropy#Counting_of_microstates
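
To make the counting concrete, a rough sketch in Python (my own toy version, with an arbitrary grid standing in for the "level of detail"): coarse-grain the box into cells, count balls per cell, and take the log of the number of ball arrangements consistent with those counts.

import math
import random

def boltzmann_entropy(cell_counts):
    # log W for a macrostate given by per-cell occupation numbers,
    # where W = N! / (n1! n2! ... nk!), in units where k_B = 1
    n_total = sum(cell_counts)
    return math.lgamma(n_total + 1) - sum(math.lgamma(n + 1) for n in cell_counts)

def coarse_grain(positions, box_size, cells_per_side):
    # assign 3D ball positions to cells on a regular grid; empty cells
    # contribute 0! = 1 to W, so they can be left out of the counts
    counts = {}
    for x, y, z in positions:
        cell = tuple(int(c / box_size * cells_per_side) for c in (x, y, z))
        counts[cell] = counts.get(cell, 0) + 1
    return list(counts.values())

# toy comparison: 1000 balls scattered through the box vs. crowded into one corner
random.seed(0)
scattered = [tuple(random.random() for _ in range(3)) for _ in range(1000)]
crowded = [tuple(0.05 * random.random() for _ in range(3)) for _ in range(1000)]
print(boltzmann_entropy(coarse_grain(scattered, 1.0, 10)))  # large
print(boltzmann_entropy(coarse_grain(crowded, 1.0, 10)))    # 0: only one set of counts fits

A finer grid gives different numbers, which is exactly your level-of-detail worry; the additive-constant point made below is the usual answer to it.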

spoonless June 1 2005, 18:48:22 UTC

What I meant was "macroscopic" in the sense of ping-pong balls: if we had a huge box in space (zero-gravity vacuum), with billions of ping-pong balls bouncing around in it, could we define entropy based on these observable states?

Yes, it's perfectly well defined. But only up to an additive constant which has to do with how finely grained things are. You can always add a constant to the amount of entropy in a system and it's not going to change the dynamics. As long as you're consistent about what constant you add, it doesn't matter. Kind of like defining a "ground voltage". In quantum mechanics, there's a natural definition for what that constant should be, but that need not be essential to the theory of statistical mechanics; it can be seen as just a convenience issue.
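
To illustrate (a rough sketch of my own, with made-up numbers): histogram particle positions in a 1D box at two different cell sizes. Refining the grid shifts the entropy of every smooth state by roughly the same amount, so differences between states don't care which grain you picked.

import math
import random

def shannon_entropy(samples, n_cells):
    # entropy of the coarse-grained position distribution, in nats
    counts = [0] * n_cells
    for x in samples:
        counts[min(int(x * n_cells), n_cells - 1)] += 1
    total = len(samples)
    return -sum(c / total * math.log(c / total) for c in counts if c > 0)

random.seed(1)
spread_out = [random.random() for _ in range(100_000)]        # fills the whole box
squeezed = [0.5 * random.random() for _ in range(100_000)]    # confined to the left half

for n_cells in (100, 1000):
    s1 = shannon_entropy(spread_out, n_cells)
    s2 = shannon_entropy(squeezed, n_cells)
    print(n_cells, round(s1, 3), round(s2, 3), "difference:", round(s1 - s2, 3))

# refining the grid 10x shifts both entropies by about log(10),
# but the difference between the two states stays about log(2) either way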

It seems to me that entropy obeys a "statistical" law rather than a "physical" one: the 2nd law says that any system will tend to end up in the more probable set of states, and that without knowledge of the system, one cannot bring it into a less probable set of states (these differences in probability being huge).

Yes, that's right... it's a law that comes entirely from statistics. There are some very basic physical assumptions that go into proving it, but other than that it's entirely a consequence of deductive reasoning, not something that just happens to be true in our physical world.
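
Here's a toy model of that, if it helps (my own illustration, nothing to do with real gases): the Ehrenfest urn. N balls sit in two boxes, and at each step one ball, chosen uniformly at random, hops to the other box. No force pushes anything toward 50/50; the system drifts there anyway, simply because the near-even macrostates contain overwhelmingly more microstates.

import random

def ehrenfest(n_balls=1000, steps=5000, seed=0):
    rng = random.Random(seed)
    left = n_balls  # start in the wildly improbable macrostate: every ball on one side
    history = [left]
    for _ in range(steps):
        # picking a random ball moves one left -> right with probability left / n_balls
        if rng.random() < left / n_balls:
            left -= 1
        else:
            left += 1
        history.append(left)
    return history

h = ehrenfest()
print(h[0], h[1000], h[-1])  # roughly 1000 -> ~570 -> hovering near 500

Nothing forbids it from drifting back up to 1000; it's just that for large N the odds are so lopsided that you'd never see it, which is the "huge differences in probability" you mention.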

gustavolacerda June 2 2005, 10:30:59 UTC
Yes, that's right... it's a law that comes entirely from statistics. There are some very basic physical assumptions that go into proving it

which physical assumptions?

spoonless June 2 2005, 15:41:44 UTC
Well, the main one is "ergodicity". There are a few other technical ones I think, but I wouldn't know off the top of my head. Only a few experts on the subject are concerned with what axioms are necessary to prove it, whereas most physicists are satisfied with it as long as it works. It wasn't proven rigorously until a long time after it was accepted as a standard law. And there are, I think, still debates on which axioms are the best ones to use.

Ergodicity is the main one though. Ergodicity means that you have to be dealing with a system which spends an equal amount of time in each "state" in the long run. Or, if its phase space is continuous, then the probability of it being in a particular region has to be proportional to the volume of that region of phase space. ("Phase space" is the 2n-dimensional space where position and momentum are the axes and n is the dimensionality of the regular "position" space--typically 3.) It might make sense to say that ergodicity is just saying you need to define what a state is in a sane way. You can construct systems which don't satisfy ergodicity, but you could always say that that's just because you haven't labelled the states correctly.
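
If it helps, here's a cartoon of what ergodicity buys you (my own toy example with a discrete random walk standing in for real Hamiltonian dynamics, not one of the rigorous axiom sets): a walk on a ring of states spends, in the long run, an equal fraction of time in each state, so time averages of any observable agree with averages over the state space.

import random

def ring_walk_occupation(m_states=10, steps=1_000_000, seed=0):
    # lazy random walk on a ring: step -1, 0, or +1 with equal probability
    rng = random.Random(seed)
    state = 0
    time_in_state = [0] * m_states
    for _ in range(steps):
        state = (state + rng.choice((-1, 0, 1))) % m_states
        time_in_state[state] += 1
    return [t / steps for t in time_in_state]

freqs = ring_walk_occupation()
print([round(f, 3) for f in freqs])  # each close to 1/10

# time average of an arbitrary observable vs. its average over the state space
def observable(s):
    return s * s

time_avg = sum(f * observable(s) for s, f in enumerate(freqs))
state_avg = sum(observable(s) for s in range(10)) / 10
print(round(time_avg, 2), round(state_avg, 2))  # nearly equal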
