|
Entropy
Oct 15, 2014 13:04:04 GMT 1
Post by abacus9900 on Oct 15, 2014 13:04:04 GMT 1
When scientists talk of "entropy" what exactly are they referring to and how does this concept enable science to confidently assert that time can never travel backwards?
|
|
|
Post by rsmith7 on Oct 21, 2014 19:36:39 GMT 1
Move into a new house and from that point forward, the crap just keeps piling up. That's the best explanation I've heard.
|
|
|
Post by alancalverd on Oct 23, 2014 0:07:50 GMT 1
We tend to think of entropy as randomness or disorder, but it's actually defined as probability.
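To make that concrete: in statistical mechanics, the entropy of a macrostate is proportional to the logarithm of the number of microstates consistent with it (Boltzmann's S = k ln W), so "high entropy" literally means "most probable". A toy sketch in Python (coins standing in for particles; the choice of N = 10 is just for illustration):

```python
from math import comb, log

# Toy model: N coins, each heads or tails. A "macrostate" is the
# total number of heads; a "microstate" is one particular sequence.
N = 10
for heads in (0, 2, 5):
    W = comb(N, heads)   # microstates consistent with this macrostate
    S = log(W)           # Boltzmann entropy in units of k
    print(f"{heads} heads: W = {W}, S/k = {S:.3f}")
```

The 5-heads macrostate is realised by 252 of the 1024 possible sequences, so it has the highest entropy and is the one you'll almost always observe, which is exactly the sense in which entropy is "defined as probability".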
|
|
|
Post by abacus9900 on Oct 23, 2014 11:50:13 GMT 1
Move into a new house and from that point forward, the crap just keeps piling up. That's the best explanation I've heard. I see. So I guess what you are saying is that entropy is a measure of disorganisation.
|
|
|
Post by abacus9900 on Oct 23, 2014 11:52:44 GMT 1
We tend to think of entropy as randomness or disorder, but it's actually defined as probability. You mean like when an object is released from a given height it usually falls to the ground and doesn't rise up?
|
|
|
Post by mrsonde on Oct 25, 2014 6:43:33 GMT 1
No, he doesn't mean that - or if he did, he's mistaken. Neither did Mr. Smith - he was making a joke.
It refers to the tendency of energy in a closed system to dissipate. In thermodynamics this is measurable over time given more than one state in that system. Because of those multiple states - actual or possible - that measurement can be specified probabilistically. That's what Alan means, I should think.
In Information Theory it refers to something else, which is the tendency for coherent distinctions to get lost when they're transmitted through any particular medium over time. Again, because you're dealing with actual and possible states - of the coherence, and the ways it can dissipate - it can be measured using probability: not surprisingly, the equations derived are entirely congruent.
This mathematical congruence does not mean there is a thing in the universe that is being named by the same metrical marker in those equations. Any more than there's a thing called "ten" that you detect when you count ten apples or ten reasons not to get married.
Most physicists don't get this. They're not philosophers. Hence the nonsense about time, time's arrow, the inevitable heat death of the universe, and so on. They've completely forgotten that the measure of probability already includes the direction of time in its very construction. They think they can construct it without time, because the calculations seem entirely abstract - numbers and lines drawn in phase space - but this is like saying a dance exists without its steps. Energy or "order" tends to dissipate through time; time doesn't occur because energy or order tends to dissipate. Time is already there in the "tends" bit.
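The congruence mentioned above is easy to see: Shannon's information entropy H = -Σ pᵢ log pᵢ has exactly the same form as the Gibbs entropy of statistical mechanics, differing only by a constant factor and the base of the logarithm. A minimal Python sketch (the example distributions are mine, purely for illustration):

```python
from math import log2

def shannon_entropy(probs):
    """H = sum(-p * log2(p)) in bits; zero-probability terms contribute nothing."""
    return sum(-p * log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))    # fair coin: 1 bit of uncertainty
print(shannon_entropy([1.0]))         # certain outcome: 0 bits
print(shannon_entropy([0.9, 0.1]))    # biased coin: about 0.47 bits
```

The more predictable the source, the lower its entropy - the certain outcome carries no information at all, which is the information-theoretic face of the same probabilistic measure.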
|
|
|
Post by abacus9900 on Oct 26, 2014 8:57:33 GMT 1
mrsonde, you're being somewhat vague. Speak plainer please.
You cannot just give a textbook definition of a scientific concept and leave it there, because the idea is that people come here to learn, not to be confused. If you had to teach, you would have to structure your lectures better, so come on. I could simply Google entropy and be confused that way, TYVM.
|
|
|
Post by mrsonde on Oct 27, 2014 7:43:39 GMT 1
mrsonde, you're being somewhat vague. Speak plainer please. You want me to be more precise but speak plainer? Errr... can't be done. Beyond how I've described it, it's a concept that is entirely mathematical. I can give you the equations if you want, but that won't explain it any further. The fact is it is a "vague" concept.
As I say, it's not something that exists in the universe - it's a name given to a process at most, and moreover one that only applies in very specific circumstances. It is not applicable wherever the system is not closed, for instance; nor wherever an organising principle is at work, such as in living creatures; nor in any system complex enough to have a self-organising component (such as the atmosphere, the solar system of a star, probably the interior of any star, and so on). In fact, it is not applicable on any scale where gravitational attraction or any of the other fundamental forces of nature is a significant operational force - so you could argue nowhere at all, in the real world.
In practice, however, the equations work, because they're used in communications applications and in the thermodynamic analysis of working machines. The difficulties arise when this mathematical concept, satisfactorily derivable in such analyses, is assumed to be some sort of universal principle. It's understandable why this assumption was and is made, given its practical application in all these situations - but as I say you have to bracket self-organisation if you want to argue it is such an effective principle in the very working of the universe, including the gravitational field.
If you don't want to be confused, remember that every time you hear or read a physicist talking of the "Law of Entropy" or projecting it out into the universe as some sort of inescapable mechanical principle. Also - it might help you avoid such confusion if you remember that, as Thomas Merton used to say about God, it's not a noun but a verb.
|
|
|
Post by abacus9900 on Oct 27, 2014 11:45:22 GMT 1
In a nutshell, you might have said nature tends towards disorganisation, based on probabilities, and that in organised systems, such as you cited, the entropy is low or, put another way, the probability of disorganisation is low due to the ability of such a system to maintain a consistent function. In information theory it is the loss of meaningful information in transmission. But don't even organised systems eventually succumb to entropy?
|
|
|
Post by alancalverd on Oct 27, 2014 14:03:14 GMT 1
You might say it, but then someone will point out that gases (essentially random) and liquids (highly disorganised) tend to crystallise into a highly ordered state, so you would be wrong.
|
|
|
Post by abacus9900 on Oct 27, 2014 16:50:05 GMT 1
You might say it, but then someone will point out that gases (essentially random) and liquids (highly disorganised) tend to crystallise into a highly ordered state, so you would be wrong. Well, fine, but then you can't just say that without further explanation, otherwise we might think you're simply quoting from Wikipedia.
|
|
|
Post by alancalverd on Oct 28, 2014 0:53:50 GMT 1
Why would I bother to quote from an unattributed source when it's obvious to anyone who has seen snow fall, or grown a crystal from a saturated liquid, or even watched salt being extracted from brine? If you dispute the ordered nature of crystals, or the disordered nature of gases and liquids, you could resort to x-ray diffraction to investigate these states of matter, but I doubt that you will come to a different conclusion.
|
|
|
Post by abacus9900 on Oct 28, 2014 9:23:16 GMT 1
Crystallography isn't the subject we are discussing.
|
|
|
Post by mrsonde on Oct 28, 2014 15:09:19 GMT 1
I might have said that, if I wanted to make totally ungrounded postulations that have no empirical support whatever. But I didn't. I wanted to explain to you what "Entropy" means. That's what you asked, remember? No - it's what I told you it was. The presence or absence of meaning has nothing to do with it. You're talking as though it were a thing, an effective agent, acting in the universe. It isn't. It's an abstract mathematical component derived from the way that distinctions dissipate through time when unbounded by organising constraints. By definition, this does not and can not apply to organised systems - the organisation is the constraint. Whether all such organisation eventually breaks down - no evidence for that, is there? On the contrary, it seems to get more and more ubiquitous and robust.
|
|
|
Post by mrsonde on Oct 28, 2014 15:16:03 GMT 1
Crystallography isn't the subject we are discussing. It is. As I said, this is not a derivable mathematical concept in any situation where the organising and self-organising forces, such as the fundamental ones of nature, have any appreciable operation.
|
|