Entropy is a concept in physics and information theory that measures the amount of disorder or uncertainty in a system. In information theory, entropy quantifies the average amount of information contained in a message or data set: the more unpredictable the content, the higher its entropy.
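As a rough illustration, here is a minimal Python sketch (not from the original text) that estimates the Shannon entropy of a string from its symbol frequencies, using the standard formula H = -sum p(x) log2 p(x):

```python
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """Shannon entropy in bits per symbol, estimated from symbol frequencies."""
    counts = Counter(message)
    total = len(message)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# A message with a single repeated symbol carries no information (entropy 0),
# while a message of equally likely distinct symbols has maximal entropy.
print(shannon_entropy("aaaa"))  # 0.0
print(shannon_entropy("abcd"))  # 2.0 bits per symbol (four equally likely symbols)
```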
Information Theory
Information theory is a branch of mathematics that deals with the quantification, storage, and communication of information. It was developed by Claude Shannon in 1948 and has since been applied in many fields, including communication systems, data compression, cryptography, and machine learning. The central concept in information theory is entropy, which measures the amount of uncertainty or randomness in a system. Other important concepts include mutual information, channel capacity, and error-correcting codes.
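To make the related concept of mutual information concrete, the following sketch (a hypothetical example, not part of the original text) computes I(X; Y) = H(X) + H(Y) - H(X, Y) for a small made-up joint distribution of two binary variables:

```python
import numpy as np

def entropy(p: np.ndarray) -> float:
    """Entropy in bits of a discrete distribution (zero-probability terms contribute 0)."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def mutual_information(joint: np.ndarray) -> float:
    """Mutual information I(X; Y) = H(X) + H(Y) - H(X, Y) from a joint probability table."""
    px = joint.sum(axis=1)   # marginal distribution of X (rows)
    py = joint.sum(axis=0)   # marginal distribution of Y (columns)
    return entropy(px) + entropy(py) - entropy(joint.ravel())

# Hypothetical joint distribution of two binary variables that usually agree.
joint = np.array([[0.4, 0.1],
                  [0.1, 0.4]])
print(mutual_information(joint))  # ~0.278 bits: observing one variable reduces uncertainty about the other
```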