|
Post by Progenitor A on Jan 26, 2011 14:09:12 GMT 1
Irrelevant? The very paper and the very lines where Shannon defines information content? And you seriously think ANYONE is going to be fooled by your claim that this is irrelevant to a question involving information content? I see that you haven't actually attempted to SHOW that it is irrelevant, just hope that if you keep slinging the mud, others will be so wowed by your intellectual brilliance that they won't bother to read it and judge for themselves. Thankfully, not everyone is as stupid as you (apart from abacus, if he even is a separate entity). The problem is not that you're so stupid that it isn't worth arguing with you; the problem is that you're either so stupid that it is worth posting against you to try and stop others being fooled by your drivel, or so vindictive that you know it is drivel but post it anyway.

I have no 'intellectual brilliance'. I just happen to know a little about information theory, because that has been my profession for the last 40 years. I have no intention of arguing with a person who maintains that an empty information store contains as much information as one filled with encoded data. You totally misread Shannon in the most stupid way imaginable.

If anyone is puzzled (and surely they must be) as to why a 'physicist' can assert that a store empty of information contains as much information as a store in which we are storing, for example, the encoded Encyclopaedia Britannica, then I am quite willing to explain how Shannon treats that empty store and to show that it requires, at most, just 1 bit to transmit its contents, no matter how big the empty store is. Please ask. I can also, if it is wished for, show how many bits are required to transmit the contents of a store filled with encoded data. I will not argue with ignorant foolishness.
|
|
|
Post by speakertoanimals on Jan 26, 2011 15:07:42 GMT 1
You're still wrong, and unable to admit it!
The distinction is very simple -- when I stated it originally, I was talking about arbitrary binary strings (no MEANING, just arbitrary, random, binary strings). As Shannon clearly states, that is the first case he considers, and the information content of ANY such string is the SAME, whether it is all zeros or a random sequence of 0's and 1's.
The example you cite is a more complicated case, and you're too stupid to see it, because you probably have no idea whatsoever how Shannon goes from the totally random case to that case. As I said before, a string that is encoded english is NOT random. A memory store that is being used is NOT the same as a random string of bits, since the nature of what you will find in memory stores in use is not random; there are correlations which are absent in the initial Shannon case.
I have explained all this in detail on the other thread -- and you are too daft to see that you don't get the basics! What I said (a random string of length M that contains all zeros contains as much information as any other string) is TRUE, and you can't derive the results for correlated strings, or non-iid strings (such as real memory stores, telephone messages etc etc), unless you understand that.
I don't care a tinker's cuss about your claims that it has been your profession for the last 40 years -- academics know that those who use a thing, and claim to be able to teach it, often don't understand the basics of the subject in any rigorous mathematical or scientific sense. Which you seem to have shown is the case with information theory as well. So, you may USE Shannon's results, but you don't understand where they come from, or the mathematical basics of the theory; that is obvious, else you wouldn't keep claiming my statement about random strings is nonsense, a misunderstanding etc etc.
Well, for starters you have glossed over the fact that empty is not quite the same as all zeros! Naughty, naughty!
Second, you have to explain the difference between random strings and memory stores designed to store (say) encoded english text. As I KEEP SAYING, cloth-ears, the stuff in such a memory store is DIFFERENT to the random strings case, in that you have a different context -- english as opposed to random strings or random sets of letters.
So, if we are storing random strings of letters and punctuation marks (including spaces), then a string of all spaces (which some would call empty) contains as much information as ANY other random character sequence of the same length.
But if we specify further that we are storing english text, then the all spaces case can be dismissed, because that is a sequence that does not occur in english. Hence probability zero. So all you need, in effect, is one bit to say -- nothing in store/something in store, text follows.
Which is NOT the same as saying -- amongst the set of all possible binary random strings, 00000 contains as much information as any other binary string of the same length.
I have explained at length, with numerous arguments and examples, quotes from Shannon etc etc -- and all you have to offer is the usual -- I've been doing this for 40 years and you're wrong -- no proper attempt at argument, no attempt to engage, no effort to try and understand the position of the other person. No willingness to admit your own knowledge may be limited. In short, all the hallmarks of a crap teacher -- I pity your supposed students; they may get some sort of rough and ready working knowledge, but as regards the fundamentals of the subject they'll be totally screwed!
|
|
|
Post by carnyx on Jan 26, 2011 15:29:36 GMT 1
STA
Your error was to assume that NM was stupid, whereas in fact it was you who were actually meta-stupid. You failed (and it is an observed failing of yours) to work out where NM was coming from ... in other words you failed to understand properly what was being meant. Context is all, as I know you appreciate.
And as it is rare IRL to find someone with that kind of righteous blindness who is still 'at large', as it were, it may be that you are in fact playing a sophisticated game of 'gotcha'.
Predators play this game out of necessity; so what is your need?
|
|
|
Post by abacus9900 on Jan 26, 2011 15:29:46 GMT 1
You see, maybe STA is saying something important, who knows? The big problem is it's put over in such an incomprehensible way there's no way of knowing!
Normally, a string of zeros, no matter how many there are, is still zero so what STA has in mind remains a mystery.
|
|
|
Post by speakertoanimals on Jan 26, 2011 15:57:39 GMT 1
You misunderstand.
Let me give you an example. Suppose the message I am sending is the position of a point on the real line between 0 and 1. I am encoding this position as a binary number.
So first I send 0
this DOESN'T mean the position is zero, just that it is LESS THAN a half.
Then I send another zero. Again, this doesn't mean the position is zero, just that it is between 0 and 1/4
Another 0 sent narrows it down to between 0 and 1/8.
And so on. After N zeros, what you now know is that it is between 0 and 1/2^N.
Now think about sending the number 0.000000000000000000000001
I can't just send the 1; that on its own could mean 0.1 or 0.001. I need to know WHERE the 1 is, which is the job the leading zeros do.
I can't just send the NUMBER of leading zeros. Why not?
In binary 1/8 is 0.001 -- 3 bits to send, 001, since we take the starting point as agreed to beforehand. 8 = 2^3, which also gives 3 bits as the message length.
If I want to send (2 leading zeros then a 1), I have to first send 2 in binary (10), then the 1, hence still 3 bits. Hence we might as well send the number itself (001) as above.
And I have to send the last bit as well, since I want to make sure I distinguish 001 from 000.
Hence a string of zeros is NOT the same as ZERO, especially when it comes to binary or decimal fractions -- how many 0's come before the non-zero digits matters, as I hope the initial example makes clear.
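The halving scheme above can be sketched in a few lines of Python (my own illustration, not code from the thread or from Shannon; the function name is mine): each transmitted bit halves the interval that could contain the position, so the leading zeros carry real information.

```python
# Sketch: encode a position x in [0, 1) as a binary fraction by
# repeated halving, as in the example above.
def encode_position(x, n_bits):
    """Return the first n_bits of the binary expansion of x in [0, 1)."""
    bits = []
    lo, hi = 0.0, 1.0
    for _ in range(n_bits):
        mid = (lo + hi) / 2
        if x < mid:            # position is in the left half -> send 0
            bits.append('0')
            hi = mid
        else:                  # position is in the right half -> send 1
            bits.append('1')
            lo = mid
    return ''.join(bits)

# 1/8 = 0.001 in binary: three bits, and the leading zeros matter.
print(encode_position(0.125, 3))   # -> 001
print(encode_position(0.0, 3))     # -> 000: still narrows x to [0, 1/8)
```

Note that the all-zero message 000 is doing the same work as any other three-bit message: it narrows the position to one of eight equal intervals.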
|
|
|
Post by speakertoanimals on Jan 26, 2011 16:06:31 GMT 1
To give another example which doesn't use zeros -- suppose I want to send a message telling someone where I am on a line, from A to B. They know which line I'm talking about; they just need to know where on it I am. I am equally likely to be anywhere on it.
First question : left half or right half of the whole line?
I send L or R
Second question: of that piece, the left or the right side of it?
I send L or R
And so on.
As you can hopefully see, the more letters I send, the more closely they can pin down my position -- but they need ALL the letters to find me. And as the previous case shows, sending (6 Lefts then a Right) uses just as many bits as sending LLLLLLR.
But any string of lefts and rights contains as much information as any other -- they ALL allow my position on the line to be determined to the SAME degree of accuracy, even if they are all lefts or all rights.
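The claim above can be checked mechanically (a sketch of my own; the function name is mine): any string of N Lefts/Rights, all-Lefts included, narrows the position down to an interval of width 1/2^N.

```python
# Sketch: follow a string of 'L'/'R' halvings on the unit interval and
# return the final interval that must contain the position.
def locate(moves, a=0.0, b=1.0):
    for m in moves:
        mid = (a + b) / 2
        if m == 'L':
            b = mid            # keep the left half
        else:
            a = mid            # keep the right half
    return a, b

for s in ['LLLLLLL', 'LRLRLRL', 'RRRRRRR']:
    lo, hi = locate(s)
    print(s, hi - lo)          # every 7-letter string gives width 1/128
```

The intervals land in different places, but every 7-letter string pins the position down to the same accuracy, 1/2^7 of the line.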
|
|
|
Post by Progenitor A on Jan 26, 2011 16:06:59 GMT 1
You see, maybe STA is saying something important, who knows? The big problem is it's put over in such an incomprehensible way there's no way of knowing! Normally, a string of zeros, no matter how many there are, is still zero, so what STA has in mind remains a mystery.

There is no mystery. She took a stance that the amount of information conveyed does not depend on change (this is her normal knee-jerk reaction to any scientific statement that anyone makes on this board - WRONG), extended this, quite stupidly, to saying a binary store full of zeros contains as much information as a similar store full of encoded data, and is making an utter fool of herself by attempting to defend that position. One thing is quite clear: she is not a physicist. She relies on the ignorance of others (myself included) to attempt to get away with paraphrasing scientific papers that she does not understand. Naturally the paraphrasing contains many facts, but her writing is normally incomprehensible simply because she does not comprehend it herself. She then seeks to suppress criticism or valid questions by hurling insults at anyone who questions what she has written. She is an anti-scientist, a fraud. This is harsh, but her treatment of others is appalling.
|
|
|
Post by speakertoanimals on Jan 26, 2011 16:44:56 GMT 1
The only fools are those not prepared to go back to the original material (Shannon's paper, which I provided a link to and quoted from), and those not prepared to consider the many examples I have given, all of which show quite clearly that a string of repeated zeros (or repeated rights) contains just as much information as a string of mixed symbols.
Thinking this is not the case is an elementary mistake that many people make when they first come across the subject. But they usually go away and think a bit, and finally understand why this is the case.
Why you seem to think this has anything to do with science is a mystery, since this is either maths or computer science, not physics as such.
Quoting is not paraphrasing, and I note that no one has so far tried to actually explain what Shannon meant if he didn't mean what I said he meant. Plus, where is your evidence that what I have provided above is paraphrased from ANYWHERE, rather than being examples that I thought up on the spot (as I did, actually)?
But where they came from makes no difference -- the point is do they prove what I claim they do, or not?
Expecting no attempts at argument, just the usual tired insults.................
|
|
|
Post by Progenitor A on Jan 26, 2011 16:52:29 GMT 1
The only fools are those not prepared to go back to the original material (Shannon's paper, which I provided a link to and quoted from), and those not prepared to consider the many examples I have given, all of which show quite clearly that a string of repeated zeros (or repeated rights) contains just as much information as a string of mixed symbols. Thinking this is not the case is an elementary mistake that many people make when they first come across the subject. But they usually go away and think a bit, and finally understand why this is the case. Why you seem to think this has anything to do with science is a mystery, since this is either maths or computer science, not physics as such. Quoting is not paraphrasing, and I note that no one has so far tried to actually explain what Shannon meant if he didn't mean what I said he meant. Plus, where is your evidence that what I have provided above is paraphrased from ANYWHERE, rather than being examples that I thought up on the spot (as I did, actually)? But where they came from makes no difference -- the point is do they prove what I claim they do, or not? Expecting no attempts at argument, just the usual tired insults.................

No-one that I care to discuss it with has asked me. I will not discuss it with an obstinate fool.
|
|
|
Post by speakertoanimals on Jan 26, 2011 16:56:46 GMT 1
Ah, so we are forced to conclude you CANNOT disprove my examples then. So you are taking your ball back and running home. Fair enough, but the rest of us will conclude from that what we will.
|
|
|
Post by abacus9900 on Jan 26, 2011 17:02:21 GMT 1
I don't pretend to know anything about the transmission of information but on a quick look it seems to have something to do with sending the maximum amount of information (originally over the telephone) in terms of 'bits' while reducing the random interference or 'noise.' Did Shannon find the optimum method for achieving this?
|
|
|
Post by speakertoanimals on Jan 26, 2011 17:11:20 GMT 1
Yes. It describes how to get maximum information down a noisy channel. Also, how much information is actually in a signal or data, which is why lossless data compression software such as Winzip works. And why data compression for pictures, video files, and music tracks works. All rather significant in today's world; we'd be stuffed without it. So hopefully SOME people do understand it, even if some who think they do don't.........................
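As a rough illustration of the compression point (my own sketch using Python's standard zlib, not anything from the thread): a lossless compressor exploits whatever statistical structure the data has, so highly repetitive data shrinks dramatically while structureless data barely shrinks at all.

```python
# Sketch: compare zlib's output size on repetitive vs random-looking data.
import random
import zlib

random.seed(0)
repetitive = bytes(10_000)                                   # 10,000 zero bytes
noisy = bytes(random.randrange(256) for _ in range(10_000))  # pseudorandom bytes

print(len(zlib.compress(repetitive)))   # a few dozen bytes: huge redundancy
print(len(zlib.compress(noisy)))        # close to 10,000: little redundancy
```

Which of these counts as the "information content" depends entirely on the source model you assume for the data, which is exactly what the argument in this thread turns on.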
|
|
|
Post by abacus9900 on Jan 26, 2011 17:34:33 GMT 1
Yes. It describes how to get maximum information down a noisy channel. Also, how much information is actually in a signal or data, which is why lossless data compression software such as Winzip works. And why data compression for pictures, video files, and music tracks works. All rather significant in today's world; we'd be stuffed without it. So hopefully SOME people do understand it, even if some who think they do don't.........................

Right, fine, so far so good. But how can information be represented by just a string of zeros?
|
|
|
Post by Progenitor A on Jan 26, 2011 17:37:20 GMT 1
I don't pretend to know anything about the transmission of information, but on a quick look it seems to have something to do with sending the maximum amount of information (originally over the telephone) in terms of 'bits' while reducing the random interference or 'noise.' Did Shannon find the optimum method for achieving this?

Yes, basically that is it. It is about optimising a transmission channel (by suitable encoding) to get the maximum amount of information down a channel of capacity C, with the minimum of errors, in a 'noisy' environment.

In order to do that, Shannon analysed the nature of information, and from that analysis we know just how much 'information' is contained in a message source. He performs this analysis on binary messages. Most message sources are not binary, but we can easily convert them, as any information source can be changed into a binary signal. So by examining a binary sequence of digits we can calculate the information content of that sequence, and hence know the channel capacity C necessary to transfer that information between two (or more) places. Thus an information string of all zeros, or all 1's, or any variation between these two, can be analysed for its information content.

Now, if you wish me to go further I can do so, and then you will be able to calculate the information content of any binary sequence of any finite length. Note that 'information' in this context is devoid of meaning - for example, if an encrypted message is converted to a binary sequence, we will be able to calculate the amount of information in the sequence, but we will not have a clue what the information is saying or means!
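The conversion step mentioned above can be sketched in one line of Python (my own illustration; the helper name is mine): spelling out each byte of a text message as 8 bits turns it into a binary sequence we can then analyse.

```python
# Sketch: reduce a text message to a binary sequence by writing each
# byte of its UTF-8 encoding as 8 binary digits.
def to_bits(message):
    return ''.join(format(byte, '08b') for byte in message.encode('utf-8'))

# 'H' is 0x48 = 01001000, 'i' is 0x69 = 01101001.
print(to_bits('Hi'))   # -> 0100100001101001
```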
|
|
|
Post by Progenitor A on Jan 26, 2011 18:00:05 GMT 1
Right, fine, so far so good. But how can information be represented by just a string of zeros?

Let's consider a computer store holding eight bytes of information - 64 bits. How much information is contained in the 64 bits?

Shannon introduced the concept of the entropy of information, which is very close in concept to the entropy of thermodynamics. Information entropy varies on a scale of 0 to 1, where 1 indicates a maximum of information and 0 an absence of information.

Taking our binary sequence of 64 bits, Shannon tells us that we can calculate the entropy with the equation:

H = -(p0 log p0 + p1 log p1) bits per symbol [note: logarithms to base 2 are used]

How do we apply this to our binary sequence of 64 bits? Well, count up the number of 0's. Let's say this comes to 32. Of course, there must then also be 32 1's. The probability p0 of a 0 is 0.5, and the probability p1 of a 1 is also 0.5. The entropy of our signal is then:

H = -(0.5 log(0.5) + 0.5 log(0.5)) = -(-0.5 + -0.5) = 1 bit per symbol

The information content of the signal is the number of symbols x entropy. So the information content of our 64-bit signal with equal numbers of 1's and 0's is 64 bits, and if we read them out in 1 sec the required channel capacity C is 64 bits per sec.

I will leave it here, and if you are interested we can examine the information content when all the bits are 0, or 1, or any desired combination. (Hope this answers your question - re-reading your question, I do not think that it does!)
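The arithmetic above can be sketched in Python (my own illustration; as in the post, the symbol probabilities are estimated from the counts in the stored sequence, which is itself one of the points in dispute in this thread):

```python
# Sketch: entropy of a binary sequence from its 0/1 counts,
# H = -(p0*log2(p0) + p1*log2(p1)) bits per symbol.
import math

def entropy_per_symbol(bits):
    n = len(bits)
    counts = (bits.count('0'), bits.count('1'))
    # Equivalent form sum(p * log2(1/p)) over symbols that actually
    # occur, which avoids taking log2 of zero for an all-0 store.
    return sum((c / n) * math.log2(n / c) for c in counts if c)

balanced = '01' * 32                       # 64 bits: 32 zeros, 32 ones
print(entropy_per_symbol(balanced))        # 1.0 bit per symbol
print(64 * entropy_per_symbol(balanced))   # 64.0 bits in the whole store
print(entropy_per_symbol('0' * 64))        # 0.0 for a store of all zeros
```

On this frequency-based reading an all-zero store has zero entropy; on the equiprobable-strings reading defended earlier in the thread, every 64-bit string carries the same 64 bits, which is precisely where the two posters part company.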
|
|