Post by Progenitor A on Mar 29, 2011 8:06:41 GMT 1
Personally I do not use Wiki as a reference source because I went to it once out of idle curiosity to look up something in my area of ex-expertise, and what I read was such rubbish that I vowed to avoid that infernal machine as much as possible.
It is difficult though, isn't it?
Type in almost anything, and as well as Amazon having a cut-price limited offer of ten of them, Wiki will inevitably come up.
During my now-cleared-up puzzlement over entropy as disorder-to-order (although philosophically there are problems still) I went to Wiki a couple of times and here is one extract of what I read:
... although life's dynamics may be argued to go against the tendency of second law, which states that the entropy of an isolated system tends to increase, it does not in any way conflict or invalidate this law, because the principle that entropy can only increase or remain constant applies only to a closed system which is adiabatically isolated, meaning no heat can enter or leave.
This ambiguity implies that entropy of the disorder-to-order type can only arise in 'open' thermodynamic systems.
That is probably not what the authors meant, but it is so badly written that this could be (and often is) the conclusion drawn when disorder-to-order type entropy is experienced.
The mantra cry of bewilderment goes up: 'it cannot happen in a closed system, so it must be an open system'. But it can: it happens all the time in 'closed' thermodynamic systems.
Beware the wicked Wiki!
Post by speakertoanimals on Mar 29, 2011 13:40:17 GMT 1
First, you should really state that DECREASES in entropy (from disorder to order) can only arise in open systems. So the passage is not badly written, and it means precisely what you want to believe it doesn't mean. If we try other texts on the same point, we find the same thing: entropy can only decrease in an open system, and the larger closed system of which that open system is a part has, overall, either constant or increasing entropy.

Another point that arose: entropy ISN'T actually about order or disorder, although some popular science accounts phrase it like that! It is actually about probability. It is just that, for simple stuff, the probabilities go as our intuitive notions of order and disorder.

If you don't like Wiki, we have instead this page from George Mason university: physics.gmu.edu/~roerter/EvolutionEntropy.htm It's a nice short article (including some computations!) on why the anti-evolution faction have got it wrong.

I really don't know why you persist in maintaining that common statements about open and closed systems are wrong, or insisting that 'order-disorder' entropy is somehow different to thermodynamic entropy. But given the deep relation between order-disorder and information theory (the SAME probabilistic mathematical formulation for Shannon information and entropy, after all), that at least puts your confusion and misunderstanding down as a repeated one.

We also have hyperphysics: hyperphysics.phy-astr.gsu.edu/hbase/therm/entrop.html Plus a particularly nice linked page: it's this multiplicity bit that links to information content.

In those terms, a state of bits that is just one amongst MANY such states (i.e. any binary string of length N amongst the other 2^N - 1 such strings) means more states overall (2^N in fact), hence each individual state has a smaller probability, hence a higher information content. The information content is related to the probability of ONE particular string amongst the ensemble (1 out of 2^N gives p = 2^(-N), and information = -log p = N bits for any such string). In entropy terms, you have 2^N possible microstates, which gives entropy proportional to log(2^N), i.e. proportional to N.

Hence TWO strings, each of length N, contain TWICE as much information as one such string; and in terms of entropy, two systems, either of which can be in one of 2^N possible microstates, give a multiplicity of 2^N TIMES 2^N = 2^(2N), hence entropy proportional to 2N, the sum of the individual entropies.

SO, the link is: multiplicity is M, the probability of an individual state is then 1/M, so information is -log p = -log(1/M) = log(M), which is proportional to the entropy k log(M), where k is the Boltzmann constant. That gives the link between this statistical definition of entropy and the thermodynamic one.

So, in short, if someone doesn't understand why a string of all zeros (from a system which can produce ANY possible binary string of the same length) contains AS MUCH information as any other binary string of the same length, no wonder they don't get statistical entropy either, because it's all the same stuff.
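To make that arithmetic concrete, here is a minimal Python sketch of the multiplicity argument (purely illustrative; it assumes a uniform ensemble over all 2^N binary strings, and the function names are my own): the self-information of any one string is -log2(1/2^N) = N bits, and when multiplicities multiply, the entropies add.

import math

def self_information_bits(p):
    # Shannon self-information of one outcome with probability p: -log2(p)
    return -math.log2(p)

def statistical_entropy_bits(multiplicity):
    # Entropy (in bits) of a system with M equally likely microstates: log2(M)
    return math.log2(multiplicity)

N = 8
M = 2 ** N        # multiplicity: 2^N possible strings/microstates
p = 1.0 / M       # probability of any ONE particular string (all-zeros included)

print(self_information_bits(p))         # 8.0 bits, the same for EVERY length-N string
print(statistical_entropy_bits(M))      # 8.0, i.e. entropy proportional to N

# Two independent systems: multiplicities multiply (2^N * 2^N = 2^(2N)),
# so the entropies add (N + N = 2N).
print(statistical_entropy_bits(M * M))  # 16.0 == 2 * statistical_entropy_bits(M)

In thermodynamic units you would multiply by the Boltzmann constant (S = k log M); only the proportionality matters for the argument here.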