
Entropy (information theory)

Friday, August 19, 2011

From Wikipedia, the free encyclopedia

In information theory, entropy is a measure of the uncertainty associated with a random variable. The term usually refers to the Shannon entropy, which quantifies the expected value of the information contained in a message, typically in units such as bits. Here, a 'message' means a specific realization of the random variable.

Equivalently, the Shannon entropy is a measure of the average information content one is missing when one does not know the value of the random variable. The concept was introduced by Claude E. Shannon in his 1948 paper "A Mathematical Theory of Communication".
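To make the definition concrete, the short Python sketch below computes the Shannon entropy of a discrete distribution in bits using the standard formula H = -sum(p * log2 p); the function name and the example distributions are illustrative, not part of the original article.

import math

def shannon_entropy(probs):
    # Shannon entropy in bits: sum over outcomes with nonzero probability.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A four-symbol source with unequal probabilities vs. a uniform one.
print(shannon_entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0 bits (uniform is maximal)

The uniform distribution gives the largest entropy for a given number of outcomes, matching the idea that it represents the greatest uncertainty about the message.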

Shannon's entropy represents an absolute limit on the best possible lossless compression of any communication, under certain constraints. Treating the messages to be encoded as a sequence of independent and identically distributed random variables, Shannon's source coding theorem shows that, in the limit, the average length of the shortest possible representation to encode the messages in a given alphabet is their entropy divided by the logarithm of the number of symbols in the target alphabet.
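The compression bound can be read off directly from that statement. The sketch below (hypothetical helper name, chosen here for illustration) divides the source entropy, in bits, by the base-2 logarithm of the target alphabet size to obtain the shortest achievable average code length, in target-alphabet symbols per source symbol, in the limit of long sequences.

import math

def min_code_length(probs, alphabet_size):
    # Source coding bound: entropy (bits) / log2(alphabet size)
    # = minimum average number of output symbols per source symbol.
    entropy_bits = -sum(p * math.log2(p) for p in probs if p > 0)
    return entropy_bits / math.log2(alphabet_size)

# A source with entropy 1.75 bits/symbol, encoded into binary vs. ternary symbols.
print(min_code_length([0.5, 0.25, 0.125, 0.125], 2))  # 1.75 binary digits per symbol
print(min_code_length([0.5, 0.25, 0.125, 0.125], 3))  # about 1.10 ternary digits per symbol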

A single toss of a fair coin has an entropy of one bit. Two tosses have an entropy of two bits. The entropy rate for the coin is one bit per toss. However, if the coin is not fair, then the uncertainty is lower (if asked to bet on the next outcome, we would bet preferentially on the most frequent result), and thus the Shannon entropy is lower. Mathematically, a single coin flip (fair or not) is an example of a Bernoulli trial, and its entropy is given by the binary entropy function. A series of tosses of a two-headed coin will have zero entropy, since the outcomes are entirely predictable. The entropy rate of English text is between 1.0 and 1.5 bits per letter,[1] or as low as 0.6 to 1.3 bits per letter, according to estimates by Shannon based on human experiments.[2]
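The coin examples in the preceding paragraph can be checked with the binary entropy function; the short sketch below (illustrative only) evaluates it for a fair coin, a biased coin, and a two-headed coin.

import math

def binary_entropy(p):
    # Entropy of a Bernoulli trial with success probability p, in bits.
    if p in (0.0, 1.0):
        return 0.0  # the outcome is certain, so there is no uncertainty
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

print(binary_entropy(0.5))  # 1.0 bit (fair coin)
print(binary_entropy(0.9))  # about 0.47 bits (biased coin: less uncertainty)
print(binary_entropy(1.0))  # 0.0 bits (two-headed coin: fully predictable)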
