The operating system of *culture* is language, but people are not wholly defined by their culture.
I guess I'm making a distinction between parts of what might be termed "the mind". I have a sense that when you say "a great deal of what we think of as reasoning is not linguistic" you're referring to things I'm not including when I talk of "mind", such as instinct, unconscious reaction, learned response, etc., which I see as tropic, albeit complex.
In low entropy, stable systems, small differences are forced back into the stable status quo. In high entropy systems, small differences are swamped by the statistical effect of many other small differences, ultimately evening out to be effectively quite predictable. It is the systems in between where the interesting, complex and chaotic behaviour lies.
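If it helps to make that concrete, here's a quick sketch of my own (the logistic map isn't an entropy model, so take it as a loose illustration, and note it only shows the stable extreme and the interesting middle; the "everything averages out" extreme would need an ensemble of many interacting parts):

    # Illustrative sketch only: the logistic map x -> r*x*(1-x) as a toy
    # for two of the regimes. Low r: a small perturbation is forced back
    # to the stable status quo. Higher r: nearby starts diverge wildly.
    def trajectory(r, x0, steps=60):
        xs = [x0]
        for _ in range(steps):
            xs.append(r * xs[-1] * (1.0 - xs[-1]))
        return xs

    for r, label in [(2.8, "stable"), (3.5, "periodic"), (3.9, "chaotic")]:
        a = trajectory(r, 0.4)
        b = trajectory(r, 0.4001)  # a tiny initial difference
        gap = abs(a[-1] - b[-1])
        print(f"r={r} ({label}): final gap between nearby starts = {gap:.6f}")

At r=2.8 the gap collapses to essentially zero; at r=3.9 it grows to order one.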
My argument is that the tendency for entropy to increase somehow drives the generation of complexity. I'm not attempting to formulate a mechanism to explain this (I lack the physics and the math, and probably the ontological clarity) but merely proffering it as a philosophical intuition, or at least a plausible plot element I've been playing with for a number of years. :)
Yes, I think you are conveniently excluding anything that doesn't fit into your theory from being part of the mind, possibly just because it doesn't fit into your theory :-) Why shouldn't unconscious reactions, learned responses, etc. be part of the mind?
Fundamentally, most of your memory and learning about the world is non-linguistic, and most of what your mind does is performed in a fundamentally non-linguistic way. We have quite a lot of interesting evidence for this, too. If memory is fundamentally non-linguistic (as surely it must be, given that non-linguistic animals have memory), that would seem to be a bit of a blow to your position, unless you are equally happy to reject memory as a necessary part of the mind as well. As a hard-core computationalist, you seem to reject connectionist approaches (whose models of memory are almost always associationist rather than linguistic), but I think you are on the losing side of a debate that was fought and lost in the scientific arena about a decade ago.
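To give a concrete toy of the associationist models I mean (a minimal sketch, nothing more; the pattern count, network size and random seed are arbitrary choices of mine): a Hopfield-style network stores patterns in its connection weights and recalls a whole memory from a corrupted fragment, with no symbols or language anywhere in the loop.

    # Minimal Hopfield-style associative memory. Illustrative only.
    import numpy as np

    rng = np.random.default_rng(0)
    patterns = rng.choice([-1, 1], size=(3, 64))   # three stored "memories"

    # Hebbian learning: units that fire together get a stronger link
    W = sum(np.outer(p, p) for p in patterns).astype(float)
    np.fill_diagonal(W, 0)

    cue = patterns[0].copy()
    cue[:20] = rng.choice([-1, 1], size=20)        # corrupt part of the cue

    state = cue
    for _ in range(10):                            # let the network settle
        state = np.where(W @ state >= 0, 1, -1)

    print("recalled the stored memory:", np.array_equal(state, patterns[0]))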
My argument is that the tendency for entropy to increase somehow drives the generation of complexity.
Well, I'm not saying that is necessarily wrong, just that it becomes wrong after a given point. Life exists somewhere between the extremes of the stagnation of Order and the dissipation of Chaos, not just in Michael Moorcock books, but in maths as well.
merely proffering it as a philosophical intuition
Philosophical intuition is notoriously unreliable. But I see no reason for that to stop it being a plausible plot element.
I think you are conveniently excluding anything that doesn't fit into your theory from being part of the mind.
No; my theory is specifically about consciousness. It may well be that the mysterious interaction between the "conscious mind" and the brain occurs via the "unconscious mind", which is the part of the brain in "closest contact" with the mind, but that doesn't make the unconscious part of the mind.
unless you are equally happy to reject memory as a necessary part of the mind as well
I am indeed.
Look, it's really not a difficult analogy: brain is hardware; mind is software. Memory is clearly in the hardware bit, although information (i.e. "mind stuff") can be retrieved/derived from the memory to be manipulated in the symbol-spaces of the mind.
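If a toy helps (a crude sketch of the analogy as I mean it, nothing more; the values are arbitrary):

    # Crude toy of the analogy: raw storage on the "hardware" side,
    # symbol manipulation on the "software" side. Illustrative only.
    hardware_memory = bytearray(b"\x03\x05")       # raw state, no meaning yet

    a, b = hardware_memory[0], hardware_memory[1]  # retrieve/derive into symbols
    result = a + b                                 # manipulate in symbol-space

    hardware_memory.append(result)                 # write back as raw state
    print(list(hardware_memory))                   # [3, 5, 8]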
[Changes in entropy drive complexification:] it becomes wrong after a given point
I'm not sure I understand. I think you're interpreting me to be saying that absolute levels of entropy are related to absolute levels of complexity, which I'm not.
AHHHHH, you aren't talking about *thinking* at all, but about consciousness. Well, that changes things a lot. I wish you'd made that a bit clearer. To me, what the mind does is primarily think. That changes the tenor (if not the details) of discussion quite a bit.
But I think, if anything, it makes this particular line of argument make even less sense. Consciousness seems pretty clearly not solely linguistic (and I'm pretty sure in previous discussions you rejected the idea that a human-like consciousness must include a self-awareness that comes as a natural consequence of linguistic capability). I am not self-aware because I keep up a linguistic running commentary to myself about what I am doing.
And actually, yes, it does become a difficult analogy as soon as you start trying to divide everything into hardware and software while putting most of the actual data storage and manipulation, and most of the actual information processing, etc. into the hardware side. If you want to have a hardware/software divide, fine, but it's not a simple analogy if you put almost all the things software does into the hardware side.
And as far as rejecting the unconscious as part of the mind -- if psychology has taught us nothing else, it is that introspection is a very poor tool for understanding what our mind is doing, yet you would appear to be treating our introspective analysis of what our mind is doing as both infallible and of deep philosophical import.
it's not a simple analogy if you put almost all the things software does into the hardware side.
By definition, whatever isn't done by the hardware is done by the software. You seem to be arguing that there is nothing that isn't done by the hardware, and that the software is just a conversational convention: a convenient way to talk about what the hardware is doing.
No, *I'm* saying that absolute levels of entropy are related to absolute levels of complexity, though not linearly. Low entropy, 'cold' things lack complexity - nothing happens. High entropy, 'hot' things lack complexity - lots of things happen that statistically converge on the same place. Things that are in-between have complexity (I note Rudy Rucker describing what I'm calling complexity here as 'gnarliness').
This is neither contradicting nor supporting the idea that transfer of entropy creates complexity (which seems to have a lot of interesting theorising around it, going by a quick flick through the Wikipedia article on entropy); it just puts an outer limit on it, and makes the model a bit more complicated.
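For a picture of that gnarly middle ground, a small sketch (my own illustration, using the usual textbook rules; it says nothing about entropy transfer as such): three elementary cellular automata, one rigidly ordered, one effectively noise, and one in between producing the complex structures Wolfram calls class 4.

    # Illustrative only: elementary cellular automata from a single seed.
    # Rule 250 settles into rigid order, rule 30 looks like noise, and
    # rule 110 sits in the gnarly middle (Wolfram's class 4).
    def step(cells, rule):
        n = len(cells)
        return [(rule >> (cells[(i - 1) % n] * 4
                          + cells[i] * 2
                          + cells[(i + 1) % n])) & 1
                for i in range(n)]

    for rule in (250, 30, 110):
        cells = [0] * 79
        cells[39] = 1                  # single live cell in the middle
        print(f"rule {rule}:")
        for _ in range(20):
            print("".join(".#"[c] for c in cells))
            cells = step(cells, rule)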
Yeah; haven't read his "Everything is Alive" stuff yet. Sometimes he can be a little gonzo for my tastes.