Definition of Information entropy

1. Noun. (context: information theory) A measure of the uncertainty associated with a random variable; a measure of the average information content one is missing when one does not know the value of the random variable (usually in units such as bits); the amount of information (measured in, say, bits) contained, on average, per character in a stream of characters. ¹



¹ Source: wiktionary.com
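
The definition above corresponds to Shannon entropy, H = -Σ p(x) log₂ p(x), measured in bits. As an illustrative sketch (the function name and estimation-from-frequencies approach are this example's own choices, not part of the dictionary entry), the average information content per character of a stream can be estimated from the observed character frequencies:

```python
from collections import Counter
from math import log2

def shannon_entropy(stream):
    """Estimate the entropy of a character stream, in bits per character,
    using the empirical frequency of each character as its probability."""
    counts = Counter(stream)
    total = len(stream)
    # H = -sum over characters of p * log2(p)
    return -sum((c / total) * log2(c / total) for c in counts.values())
```

For example, a stream in which two characters appear equally often has an entropy of 1 bit per character (`shannon_entropy("aabb")` returns 1.0), while a stream of a single repeated character has an entropy of 0 bits, since its next character carries no uncertainty.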

Lexicographical Neighbors of Information Entropy

informant
informants
informatic
informatical
informatically
informatician
informaticians
informaticist
informaticists
informatics
information
information age
information and communications technology
information bulletin
information centres
information entropy (current term)
information gathering
information integrity
information management
information market
information measure
information model
information modeling
information overload
information processing
information processing system
information retrieval
information return
information science
information sciences
