|
Post by carnyx on Jan 24, 2011 15:45:29 GMT 1
STA,
By not reading what you quoted properly, you produced one of your immoderate Wiki-fuelled rants worthy of Hypatia/Toad at her worst!
Had you worked out under what conditions the quote would be valid, you would have agreed that higher frequencies have a greater information capacity.
And maybe you would have realised that 'white noise' represents the ultimate in information capacity.
(And that this holds in spatial terms as well.)
|
|
|
Post by speakertoanimals on Jan 24, 2011 15:46:35 GMT 1
Utter bollocks! Go read something on information theory, for starters!
And easily disproved -- because when transmitting EITHER string, if I'm saying 'change' then I also have to say 'no change' when appropriate, either of which takes 1 bit.
Hence 0 (no change) (no change) (no change)... is N bits, just as 1 (change) (no change) (change) (change) (no change) (no change) and so on is also N bits.
Simple really: you forgot about sending 'no change' -- you do have to SEND it; it can't be assumed.
You have absolutely NO IDEA what you are talking about, and instead have just made up what seems plausible to you, and got it totally wrong. It's the difference between information as defined in information theory, and 'information' in the everyday commonsense sense (which gets it totally wrong, as commonsense often does!).
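To spell the counting out, here's a toy sketch (the function is mine, purely illustrative):

def change_encode(bits):
    # Encode as: first bit, then 1 for 'change', 0 for 'no change'.
    out = [bits[0]]
    for prev, cur in zip(bits, bits[1:]):
        out.append(1 if cur != prev else 0)
    return out

print(change_encode([0, 0, 0, 0, 0, 0, 0, 0]))  # [0, 0, 0, 0, 0, 0, 0, 0] -- still 8 bits
print(change_encode([1, 0, 1, 1, 0, 0, 1, 0]))  # [1, 1, 1, 0, 1, 0, 1, 1] -- also 8 bits

Either way, N input bits cost N transmitted bits.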
|
|
|
Post by Progenitor A on Jan 24, 2011 16:06:57 GMT 1
> Utter bollocks! Go read something on information theory, for starters! And easily disproved -- because when transmitting EITHER string, if I'm saying 'change' then I also have to say 'no change' when appropriate, either of which takes 1 bit. Hence 0 (no change) (no change) (no change)... is N bits, just as 1 (change) (no change) (change) (change) (no change) (no change) and so on is also N bits. Simple really: you forgot about sending 'no change' -- you do have to SEND it; it can't be assumed. You have absolutely NO IDEA what you are talking about, and instead have just made up what seems plausible to you, and got it totally wrong. It's the difference between information as defined in information theory, and 'information' in the everyday commonsense sense (which gets it totally wrong, as commonsense often does!).

Well my dear, I have been working with communication systems all my life, and it is fundamental to information theory that information is contained in changing signals. The sampling theorem for converting an analogue signal to a digital signal must measure (or limit) the changes in order to work properly.

And if you think that modern sophisticated communication systems bother sending a digital stream when the encoding (of an eight-bit number, for example) is 00000000, then you are demonstrating (once again) your ignorance. In modern communication systems, when there is no change in the encoded signal, nothing is sent. Because then, my dear, the communication system is more efficient (that is what Shannon was seeking -- how to send information in the smallest possible bandwidth), and because nothing is sent when the encoding is 00000000, more than one telephone conversation can be sent on one transmission line. Indeed, some systems disconnect the user from a telephone line when the user is not speaking, reconnecting them as soon as they start to speak. You will not know of that system. It is based on Shannon's information theory.

Telemetry SCADA systems would be overloaded if they sent when the encoded signal was zero. Telemetry systems only send when there is a change in the monitored parameter.

So do not address me with your ignorant bluster and insults. Once more you are demonstrating an ignorance that shows that you are not a physicist.
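For what it's worth, that report-by-exception idea can be sketched like this (hypothetical code, mine, not any real SCADA product):

def report_by_exception(samples, send):
    # Transmit a reading only when it differs from the last one sent.
    last = object()          # sentinel, so the first sample is always sent
    for s in samples:
        if s != last:
            send(s)
            last = s

sent = []
report_by_exception([0, 0, 0, 5, 5, 2, 2, 2], sent.append)
print(sent)  # [0, 5, 2] -- the unchanged readings are never transmitted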
|
|
|
Post by speakertoanimals on Jan 24, 2011 16:26:19 GMT 1
Still more nonsense, from someone who can't distinguish between digital information and analog bandwidth!
That isn't what is meant by 'information' within the context of information theory! Your first mistake: taking what you know that uses the word 'information' and assuming it means the same as the 'information' in 'information theory'. It isn't the same.
Except we were only talking about the information in DIGITAL signals.
And you have totally failed to understand that sending NOTHING is the same as sending something.
Let's take a simple example. Suppose we have agreed beforehand that we are sending two bits -- either 00, 01, 10 or 11.
Okay, I'll send 0 then stop sending to mean 00 ('no change' in your jargon). Then 1 then stop sending means 11.
But what about 01? I have to send 0, to say that it starts with 0 (as opposed to 10), and then the change.
But if I receive 0, and nothing else, how do I know that this was what was meant, or whether someone just cut me off? I don't, unless when you send 0 (and nothing else) you also say -- message ends, that's your lot. Which you might as well send as 00 (2 bits), to ensure it is distinguishable from 01, or from 0 then getting cut off!
Hence your 'send nothing else' argument is nonsense.
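To make the ambiguity mechanical, a toy check (the code dictionary is mine):

# 'Go silent' after 1 bit for 00 and 11, send both bits otherwise.
code = {"00": "0", "11": "1", "01": "01", "10": "10"}

def is_prefix_free(codewords):
    # Uniquely decodable by prefix: no codeword may be a prefix of another.
    codewords = list(codewords)
    return not any(a != b and b.startswith(a)
                   for a in codewords for b in codewords)

print(is_prefix_free(code.values()))  # False: "0" prefixes "01", "1" prefixes "10"

Without an explicit end-of-message marker, the silence trick breaks unique decodability.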
Because you have forgotten that what Shannon was talking about was probabilities of various events, which are going to be different for telephones than for truly random bit strings. Plus you are talking about a different case entirely, where you have more than one message going down the same line.
As good as you may be when it comes to actual systems, you have totally failed to understand the mathematics that Shannon was dealing with, as your repeated claim that 0000 contains no information shows.
Go read the webpage I suggested; it will show you you're wrong, since you seem disinclined to listen to a word I say, but just disagree on principle!
|
|
|
Post by carnyx on Jan 24, 2011 16:26:59 GMT 1
STA
An interesting slip?
|
|
|
Post by speakertoanimals on Jan 24, 2011 16:41:52 GMT 1
Not at all -- I think you'll find I quite frequently refer to recourse to 'commonsense' as often misleading, often totally wrong.
To take the telephony case, sending something as opposed to silence is a different case to a simple N-bit string.
So imagine I have two senders, who may send, or who may go silent. Obviously, if one goes silent (sends, in effect, 'message ends'), then I can use all my capacity for sender 2, if they are speaking. But when sender 1 resumes (sends 'message starts'), I then have to reduce capacity for sender 2, so both get sent.
Which is a totally different case to the simple N-bit string that HAS to be sent: I need to know the difference between 00000, 00001, and 10000. Taking 0 to be the same as 'message ends' or 'no message', what do we then do with 11111?
Anyway, all this 'message ends' stuff, or 'no change', is just trying various ways of ENCODING the possible bit strings. And if, mathematically, the task is sending a string of N bits, all possible strings equally likely, then there is no better way than just sending the N bits. When we have multiple senders, and senders going silent or not, that is a DIFFERENT case, and not to be confused with sending the string 00000000.
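In symbols, for that equally-likely case: each of the $2^N$ strings has probability $2^{-N}$, so the source entropy is

$$H = -\sum_{s} 2^{-N} \log_2 2^{-N} = N \ \text{bits},$$

and no uniquely decodable encoding can average fewer than N bits, whatever 'no change' or 'message ends' convention is dressed over it.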
|
|
|
Post by speakertoanimals on Jan 24, 2011 16:54:39 GMT 1
Except we were talking about digital information, not analog.
And we were talking about the information content of a specific digital message, where you have confused the 00000 digital case with the 'no send' analog case.
I really should get paid for tracking down the sources of your mistakes and misunderstandings...
And you shouldn't confuse Shannon's source coding theorem with the noisy-channel coding theorem...
|
|
|
Post by Progenitor A on Jan 24, 2011 17:10:13 GMT 1
> Still more nonsense, from someone who can't distinguish between digital information and analog bandwidth!

You really are a complete fool. My work entailed lecturing postgraduates (PhDs) on information theory in telecommunications systems! Perhaps you would like to explain to me what analogue bandwidth is necessary to send a CCITT 3.1 kHz bandwidth standard digitised analogue telephone signal, and outline some common methods of reducing the required analogue bandwidth, such as in GSM and 3G systems, including optimised encoding and error-correcting techniques such as Forward Error Correction, block encoding and redundancy techniques, as covered by Shannon and others?

> That isn't what is meant by 'information' within the context of information theory! Your first mistake: taking what you know that uses the word 'information' and assuming it means the same as the 'information' in 'information theory'. It isn't the same.

Idiotic, ignorant ranting gobbledygook. Shannon was directly concerned with information in telecommunications systems. That was the purpose of his research!

> Except we were only talking about the information in DIGITAL signals.

Are WE? Shannon wasn't -- he was talking about the information content of analogue and digital signals. And the information in most digital signals sent over the telecommunications system depends upon the characteristics of a digitised analogue signal, such as the voice signal, or digitised analogue images, or text messages.

> And you have totally failed to understand that sending NOTHING is the same as sending something.

IDIOT. In the post you are quoting from, I have gone to some length to explain to you that modern communications systems (that is what concerned Shannon) do just that -- send nothing when there is nothing to send, that is, when the encoder output is 00000000.

> Hence your 'send nothing else' argument is nonsense.

Modern communication systems based upon Shannon's information theory must then be nonsense, mustn't they? You really are exposing your absolute ignorance and stupidity.

> Because you have forgotten that what Shannon was talking about was probabilities of various events, which are going to be different for telephones than for truly random bit strings. Plus you are talking about a different case entirely, where you have more than one message going down the same line.

More ignorant ranting gobbledygook! Shannon was a telecommunications engineer at Bell Laboratories, and he was concerned with optimising the transmission of information over telecommunications systems by suitable encoding, to minimise and correct errors in transmission and minimise the necessary bandwidth for transmission.

> As good as you may be when it comes to actual systems, you have totally failed to understand the mathematics that Shannon was dealing with, as your repeated claim that 0000 contains no information shows.

You have no idea how good I am with systems. You do not have a clue about telecommunications systems, which are designed upon the principles that Shannon laid down in his information theory. You are a waffling buffoon who exposes your ignorance whenever you venture into fields where others have some expertise. Like right now. You are no more a physicist than I am an astronaut. You simply spout nonsense, paraphrasing what you have just read without understanding it.
|
|
|
Post by speakertoanimals on Jan 24, 2011 17:34:45 GMT 1
Which is slightly DIFFERENT to Shannon's source coding theorem, go look it up!

Nope, because we were talking about the information content of binary strings.

Except to do that, he needed to be able to compute the information content of a binary string (the result I used), as well as talking about probabilities in order to compute the optimum codeword length. Neither of which gives zero information content, or zero codeword length, for transmitting the string of N zeros (as compared to any other binary string of length N).

You are getting increasingly desperate, maybe because you HAVE looked stuff up, and realised that there is more to information theory than the bit you knew about...

Except to do that (optimum encoding), he had to be able to compute the information content of a binary string -- and he DIDN'T get zero for a string of N zeros!

As I keep telling you, because that is where we came in: what is the information content of an arbitrary binary string of length N? Shannon says N, whatever the string; you see fit to disagree with him. Your problem, not mine.

Oh, go read Shannon: plan9.bell-labs.com/cm/ms/what/shannonday/shannon1948.pdf
First page, third paragraph, where Shannon takes the logarithm of the number of possible messages as the measure of information. Which is to say, for strings of N binary digits (2^N different ones), Shannon's measure of information for any one such message is log2(2^N) = N. Hence 0000...0000 contains N bits of information ACCORDING TO SHANNON himself. I rest my case.

Perhaps you should make sure you've read and understood it before you teach your next course; it wouldn't do to confuse the students and contradict Shannon himself, would it...

-log p: that's the information content according to Shannon, as I said right at the start.
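Putting numbers to that last line, under the same equiprobable assumption: for any particular N-bit string $s$, $P(s) = 2^{-N}$, so

$$I(s) = -\log_2 P(s) = -\log_2 2^{-N} = N \ \text{bits},$$

the same for the all-zeros string as for any other.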
|
|
|
Post by Progenitor A on Jan 24, 2011 17:59:19 GMT 1
> Which is slightly DIFFERENT to Shannon's source coding theorem, go look it up!

IDIOT AGAIN. Coding is part of information theory. The source coding used in communications systems today is very complex, involving statistical predictive algorithms based upon Shannon's theories of information transfer. You just say the first thing that comes into your head.

> Nope, because we were talking about the information content of binary strings.

More idiocy! A digitised analogue signal is a binary string, where the string length is 2 × f_max (the maximum analogue frequency) × the number of encoded bits per sample.

> Except to do that, he needed to be able to compute the information content of a binary string (the result I used), as well as talking about probabilities in order to compute the optimum codeword length. Neither of which gives zero information content, or zero codeword length, for transmitting the string of N zeros (as compared to any other binary string of length N).

Sheer ignorant waffle.

> You are getting increasingly desperate, maybe because you HAVE looked stuff up, and realised that there is more to information theory than the bit you knew about...

Never in my professional career have I been faced by such dumbness, such ignorance. But that is because the people I taught were post-graduate PhD types, not fraudulent imposters.

> Except to do that (optimum encoding), he had to be able to compute the information content of a binary string -- and he DIDN'T get zero for a string of N zeros! As I keep telling you, because that is where we came in: what is the information content of an arbitrary binary string of length N? Shannon says N, whatever the string; you see fit to disagree with him. Your problem, not mine.

Shannon was intimately concerned with the redundancy contained in information, and how removing redundancy can increase the efficiency of transmission. When a digital encoder (typically RPE/LTP) recognises redundant information, it does not send it. Typically the first redundant information discarded is when the encoder detects 00000000, the crossover point that contains no information. This it does not send. That is using Shannon's principle of redundancy. That is how modern communications systems work. You know nothing of modern communications systems.

> I rest my case. Perhaps you should make sure you've read and understood it before you teach your next course; it wouldn't do to confuse the students and contradict Shannon himself, would it... -log p: that's the information content according to Shannon, as I said right at the start.

You are the total idiot that judges professionals without having a clue about the systems -- in my case information transmission systems, the very thing Shannon was concerned with -- that they work upon. I note that in all of your buffoonish responses, you have not once asked any questions. You are a fraud. You are the most ignorant buffoonish idiot to post in these MBs.
|
|
|
Post by speakertoanimals on Jan 24, 2011 18:10:34 GMT 1
So, more bluster to try and cover up the fact that you can't actually DISAGREE with what Shannon actually said -- the information content of a string of N binary digits (ANY such string) is N bits according to Shannon, whether that string is 0111000110000 or all zeros.
Perhaps you need to go re-read Shannon before you teach your next course, or explain to me why Shannon is wrong about the information content of the string 00000000000?
So, do you still (in direct contradiction to Shannon) maintain that the information content of 00000 is zero?
Perhaps you really do need to brush up on the fundamentals of the subject you purport to teach, else those poor bloody students are getting a bum deal -- a lecturer who doesn't actually understand what he is supposed to be teaching...
|
|
|
Post by speakertoanimals on Jan 24, 2011 18:16:15 GMT 1
Actually, it isn't, provided you understand the basics!
I think you need to go re-read Shannon entropy and information content...
Or go read the first page of the Shannon paper I gave the link to earlier.
|
|
|
Post by Progenitor A on Jan 24, 2011 18:16:29 GMT 1
> So, more bluster to try and cover up the fact that you can't actually DISAGREE with what Shannon actually said -- the information content of a string of N binary digits (ANY such string) is N bits according to Shannon, whether that string is 0111000110000 or all zeros. Perhaps you need to go re-read Shannon before you teach your next course, or explain to me why Shannon is wrong about the information content of the string 00000000000? So, do you still (in direct contradiction to Shannon) maintain that the information content of 00000 is zero? Perhaps you really do need to brush up on the fundamentals of the subject you purport to teach, else those poor bloody students are getting a bum deal -- a lecturer who doesn't actually understand what he is supposed to be teaching...

I have said repeatedly that in a digital encoder the output 00000000 is redundant, contains no information and can be -- actually is -- discarded. In other contexts it may have value as an information string, but if we have an 8-bit repetitive string such as 00000000 00000000 00000000 00000000, in other words if there is no change between the 8-bit strings, then the repeated 00000000s contain no new information, are redundant and can be discarded. Not only can be, but actually are discarded.

But enough of your ignorant half-digested ranting. You are simply a fraud.
|
|
|
Post by Progenitor A on Jan 24, 2011 18:25:08 GMT 1
> Actually, it isn't, provided you understand the basics!

Perhaps you would like to explain the simple basics of an RPE/LTP encoder as used in GSM systems? Stop waffling, you ignoramus. Your efforts to explain the calculus showed you to be a fool. I look forward to more waffle after you have googled it.
|
|
|
Post by speakertoanimals on Jan 24, 2011 21:15:44 GMT 1
The only fraud is you, because you are failing to THINK!
The basic Shannon result applies (if you read Shannon) to the case where successive bits are independently and identically distributed. For a specific signal (such as a telephone signal) this is not the case.
Let's consider the following. Suppose we have 3 detectors or meters or some such, A, B and C, whose outputs can be 0 or 1.
We hence have eight possible events, 000, 001, 010, 100, 011, 101, 110, and 111.
IF these are independently and identically distributed, then the probability of any one is 1/8, which can be encoded in 3 bits (the original result, of course!).
If we have a signal where 000 occurs a lot of the time, then we have to revise these probabilities accordingly. The more frequent 000 is, the shorter the codeword we can use.
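For instance (toy numbers, mine): if 000 occurs half the time and the other seven events share the remaining probability equally, the ideal codeword lengths are

$$\ell(000) = -\log_2 \tfrac{1}{2} = 1 \ \text{bit}, \qquad \ell(\text{other}) = -\log_2 \tfrac{1}{14} \approx 3.81 \ \text{bits},$$

for an average of about $0.5 \times 1 + 0.5 \times 3.81 \approx 2.4$ bits per event instead of 3.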
Now add a second complication -- if 000 occurred 50% of the time but with no temporal correlation, then using -log p would be the best we could do. But suppose instead it occurred in chunks of all zeros (no signal, in your sense). Then we can do better than the -log p result: in fact, we end up using 'end message' and 'start message', and only encoding the other 7 possible configurations.
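A toy sketch of exploiting those chunks (illustrative only, not any real codec):

def run_length_encode(bits):
    # Collapse each run of identical bits into a (value, length) pair.
    runs, prev, count = [], bits[0], 0
    for b in bits:
        if b == prev:
            count += 1
        else:
            runs.append((prev, count))
            prev, count = b, 1
    runs.append((prev, count))
    return runs

signal = [0] * 20 + [1, 0, 1, 1] + [0] * 20
print(run_length_encode(signal))  # [(0, 20), (1, 1), (0, 1), (1, 2), (0, 20)]

The long runs of zeros collapse to almost nothing -- but only because of the temporal correlation.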
This is in effect more like your telephony signal, but the difference between the raw information content of 00000000 and this case arises because the real signal doesn't obey the basic constraints of the original Shannon result -- that successive states are independently and identically distributed, with no temporal correlation.
So, we can conclude that correlations, as we might expect, REDUCE the actual information content of the signal. But that doesn't remove the fact that in the iid case, with no correlations, the information content of a length-N binary message is just N, whether the bits are all zero or not.
And I see you have grudgingly almost admitted this, with your statement that:

> In other contexts it may have value as an information string

You have, I think, allowed an engineer's bias to enter, and forgotten the basics of information from a mathematical point of view. And I think you have allowed your personal bias against me to stop you considering what I actually said.
So, I return to my original statement. If we were sending a message such as the successive tosses of a fair coin (no temporal correlation), then N is the real information content, even if those N tosses all come up zero. Just because, for other types of messages with correlations, 000000 may mean something else (the existence of correlations reduces the information content), that doesn't negate the result.
Another good example is binary images -- based on the Shannon result, we would say the information is the number of pixels, N. Except if you generate truly random images, they don't look much like actual binary images, because binary images derived from real images (by processes such as thresholding a grayscale image) have significant spatial correlations: real images tend to be of objects, which are blocks of 1's. Hence the actual information content can be less than N.
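A quick illustration of that spatial-correlation point (toy code, mine):

import random

def transitions(row):
    # Count 0<->1 changes along a row -- a rough proxy for spatial correlation.
    return sum(a != b for a, b in zip(row, row[1:]))

random.seed(0)
random_row = [random.randint(0, 1) for _ in range(64)]   # truly random: no correlation
blocky_row = [0] * 20 + [1] * 24 + [0] * 20              # an 'object': one block of 1s

print(transitions(random_row))  # around 32, on average
print(transitions(blocky_row))  # 2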
|
|