Information content and entropy

A fundamental problem in information theory is to find the minimum average number of bits needed to represent a particular message selected from a set of possible messages. Shannon solved this problem using the notion of entropy. The word entropy is borrowed from physics, where entropy is a measure of the disorder of a group of particles. In information theory, disorder implies uncertainty and, therefore, information content, so entropy describes the amount of information in a given message. Entropy also describes the average information content of all the potential messages of a source. This value is useful when, as is often the case, some messages from a source are more likely to be transmitted than others.
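The following short Python sketch makes the idea concrete. It assumes Shannon's standard definition of entropy, H = -sum(p * log2(p)) over the probabilities p of the source's messages, giving the average information content in bits per message; the function name and example probabilities are illustrative, not taken from the text.

import math

def entropy_bits(probabilities):
    """Average information content (in bits) of a source whose messages
    occur with the given probabilities (assumed to sum to 1)."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A source whose four messages are not equally likely:
# frequent messages carry less information, rare ones more.
print(entropy_bits([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits per message on average

# Compare with four equally likely messages: 2 bits each.
print(entropy_bits([0.25] * 4))                 # 2.0

Under this definition, a skewed source such as the first one above can, on average, be represented with fewer bits per message than a uniform source with the same number of messages.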

