|
Post by Progenitor A on Feb 5, 2011 21:38:50 GMT 1
I hope that wasn't too much in one go Abacus.
Let's stay with the example that STA gave earlier, just for a short while: Coffee (probability 3/8), Cereal (1/4), Nothing (1/4), Tea (1/8)
Here is a Huffman coding for each word:
Coffee 1, Cereal 01, Nothing 001, Tea 000 (the way Huffman coding is done is fascinating and I can show you that separately if you like)
Now here is how breakfast might be ordered over 8 days:
Day 1 Coffee, code 1
Day 2 Coffee, code 1
Day 3 Coffee, code 1
Day 4 Cereal, code 01
Day 5 Cereal, code 01
Day 6 Nothing, code 001
Day 7 Tea, code 000
Day 8 Nothing, code 001
Note how the lowest-probability word needs the highest number of information digits to be transmitted. We can work out the average number of digits per word by multiplying the number of digits needed to transmit each item by its probability. Thus:
Coffee: 1 digit x probability 3/8 = 3/8
Cereal: 2 digits x probability 1/4 = 1/2
Nothing: 3 digits x probability 1/4 = 3/4
Tea: 3 digits x probability 1/8 = 3/8
And adding the right-hand column gives 16/8 = 2
That is, we need 2 digits per symbol to transmit this information. This is close to the Shannon absolute entropy of 1.9.. we found above, and the Huffman average gets nearer to it as we encode longer blocks of symbols (more words encoded together).
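If you want to check the arithmetic above for yourself, here is a minimal sketch (Python, standard library only) that takes the code given in this post and compares its average length against the Shannon entropy:

```python
# Check the breakfast example: average code length vs Shannon entropy.
from fractions import Fraction as F
from math import log2

menu = {            # symbol: (probability, code from the post above)
    "Coffee":  (F(3, 8), "1"),
    "Cereal":  (F(1, 4), "01"),
    "Nothing": (F(1, 4), "001"),
    "Tea":     (F(1, 8), "000"),
}

# Average digits per symbol: sum of (probability x code length).
avg = sum(p * len(code) for p, code in menu.values())

# Shannon entropy: -sum of p * log2(p), the theoretical minimum.
entropy = -sum(float(p) * log2(p) for p, _ in menu.values())

print(avg, round(entropy, 2))   # 2 digits per symbol, entropy about 1.91
```

The codes 1, 01, 001, 000 are also prefix-free (no codeword begins another), which is what lets the kitchen decode the stream of days without any separators.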
There are other fascinating aspects of the Huffman code, if you wish to delve further
|
|
|
Post by abacus9900 on Feb 5, 2011 22:22:36 GMT 1
|
|
|
Post by Progenitor A on Feb 6, 2011 10:07:46 GMT 1
Hi Abacus, I guess that's enough of this information theory for you? I don't blame you; I am a bit difficult to shut up on this. If you want to continue after a rest, let me know. Anyway, I will shut up until you ask me a question.
|
|
|
Post by abacus9900 on Feb 6, 2011 12:03:00 GMT 1
I think I'd better have a rest Naymissus, because it's a lot to take in and really I should be writing this down. I will come back with points that confuse me, OK, but thank you for your time and effort.
|
|
|
Post by Progenitor A on Feb 6, 2011 13:14:18 GMT 1
I think I'd better have a rest naymissus because it's a lot to take in and really I should be writing this down. I will come back with points that confuse me, ok, but thank you for your time and effort. No problem. A bit heavy for those that only have a casual interest. It has been nice to practice any explanatory skills that I have. Thank you!
|
|
|
Post by principled on Feb 7, 2011 13:26:46 GMT 1
Naymissus and STA, very interesting discussion; I only wish I'd gone down the electrical/electronic route rather than mechanical! Anyway, coming back to entropy, errors etc., I got to thinking about VOIP. I use a certain VOIP company (S..PE) frequently. It is common for the picture to freeze but the voice to continue. Now is this because, as you (or STA) said, you can have errors in a voice message and still relay the essential information, thus allowing the voice to continue (which presumably wouldn't be the case with a picture), or is it more to do with bandwidth? Anyway, an explanation of what is going on would be good. P
|
|
|
Post by Progenitor A on Feb 7, 2011 13:29:31 GMT 1
Just a final sign-off on this subject that might just whet your appetite Abacus. No need to respond
If we go to a more expensive hotel that offers more choice for breakfast, and stay there for 16 days (we must stay for 16 days if one of the choices has a probability of 1/16), then we get into a more interesting scene (the more choices there are, the more interesting the coding).
If there are 8 choices, for example, then common sense tells us that we need 3 bits to send any one choice down to the kitchen, since 2^3 = 8. We must (common sense tells us) send no more than 3 bits per choice. However, if we want the choices in the proportions below, is it still 3 bits per choice, or can we send fewer?
Choice & probability (1/16 = 1 day's breakfast):
A 1/8, B 1/16, C 1/4, D 3/16, E 1/16, F 3/16, G 1/16, H 1/16
Now Shannon tells us that we will need only 2.78 bits per choice (on average) to send this information to the kitchen if the choices do not have equal probability but have the probabilities shown.
Then we can devise a Huffman coding for each choice based on its frequency of occurrence. (ask if you want the Huffman coding)
The Huffman code will get near to the absolute Shannon value of 2.78 bits per choice, and the longer the code words become (if we had more choices), the nearer we get to the Shannon value!
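For anyone who does want the Huffman coding, here is a sketch (Python, standard library only) that builds one possible code for the hotel menu. Note the hedge: Huffman codes are not unique, as ties can be broken either way, so the particular codewords depend on the tie-breaking, but the average length is always optimal.

```python
# Build a Huffman code for the 8-choice hotel menu and compare its
# average length (bits per choice) with the Shannon entropy.
import heapq
from math import log2

# Probabilities in sixteenths, as in the post: A 1/8, B 1/16, C 1/4, ...
weights = {"A": 2, "B": 1, "C": 4, "D": 3, "E": 1, "F": 3, "G": 1, "H": 1}

# Heap entries: (weight, tie-break counter, {symbol: codeword so far}).
heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(weights.items())]
heapq.heapify(heap)
tick = len(heap)
while len(heap) > 1:
    w1, _, c1 = heapq.heappop(heap)   # lightest subtree gets prefix 0
    w2, _, c2 = heapq.heappop(heap)   # next lightest gets prefix 1
    merged = {s: "0" + b for s, b in c1.items()}
    merged.update({s: "1" + b for s, b in c2.items()})
    heapq.heappush(heap, (w1 + w2, tick, merged))
    tick += 1
code = heap[0][2]

avg = sum(weights[s] * len(code[s]) for s in weights) / 16
entropy = -sum((w / 16) * log2(w / 16) for w in weights.values())
print(code)
print(round(avg, 4), round(entropy, 2))   # entropy is about 2.78
```

Running this gives an average of 2.8125 bits per choice, already close to Shannon's 2.78; encoding several days' choices as one longer block would close the gap further.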
The question is, could this coding be used by any room in the hotel?
|
|
|
Post by speakertoanimals on Feb 7, 2011 13:58:39 GMT 1
So, with more options, is this Naymissus finally conceding that a string of zeros CAN carry information........................
Waiting, expecting no answer. Glad to see that Naymissus has finally got the hang of the entropy of the source though; let's see if he can get as far as using the entropy of a message to estimate the entropy of the source.........................
|
|
|
Post by Progenitor A on Feb 7, 2011 14:31:07 GMT 1
Naymissus and STA I got to thinking about VOIP. I use a certain VOIP company (S..PE), frequently. It is common for the picture to freeze but the voice to continue. Now is this because- as you (or STA) said- you can have errors in a voice message and still relay the essential information thus allowing the voice to continue, which presumably wouldn't be the case with a picture, or is it more to do with bandwidth? Anyway, an explanation of what is going on would be good. P
Two things are going on here.
1. You are right about bandwidth. The picture does, in general, require more bandwidth than the voice signal. That should not matter in itself (indeed, in some cases bandwidth is increased to improve the reliability of communications). But in VOIP the bandwidth is not always there! The basis of the VOIP protocol is to 'grab' the necessary bandwidth for as long as it is available. This is called statistical multiplexing (multiplexing is putting more than one customer on one channel), and Internet VOIP cannot guarantee that the bandwidth will be available when it is needed. If that happens, your computer retains the previous picture and continues showing it until new information comes along (when bandwidth is available to send it). It is a limitation of the Internet. Professional communications companies also use VOIP, but they guarantee that the bandwidth will be there when it is needed, and hence no 'freezing' will occur. Of course, as a voice signal demands less bandwidth, it is more likely to get it!
2. The second point you make is also a reason. If interference occurs on the voice channel then, in general, the noisy information is passed on anyway and we get a distorted/noisy voice channel. The brain-ear combination is so remarkable that a high degree of noise and distortion can be tolerated in a voice channel.
But with video information we attempt to correct any received errors before displaying the picture (this can lead to out-of-sync voice and picture), because if we do not, the picture will look horrible, and the eye is not as tolerant as the ear. So there we have the second reason: errored voice signals are passed on anyway; errored video signals, if they cannot be corrected, are rejected, and the previous 'good' picture is shown until a new picture without errors is received.
|
|
|
Post by striker16 on Nov 19, 2011 12:19:40 GMT 1
Without going into all the mathematical complexity (I'm not much good at maths), are you saying that when a signal is corrupted by noise during transmission you can calculate what the probable signal should have been, based on probabilities? And that information (in the context of information theory) is information about unexpected events? But what if the information is badly corrupted? Surely, in that case, it is much more difficult to work out what it should have been, isn't it?
|
|