Information Theory

Information theory is a branch of mathematics that deals with the quantification, storage, and communication of information. It was developed by Claude Shannon in 1948 and has since been applied in many fields, including communication systems, data compression, cryptography, and machine learning. The central concept in information theory is entropy, which measures the amount of uncertainty or randomness in a system. Other important concepts include mutual information, channel capacity, and error-correcting codes.
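
To make the entropy concept concrete, here is a minimal sketch in Python of Shannon entropy, H(X) = -Σ p(x) log2 p(x), measured in bits. The function name `shannon_entropy` is illustrative, not a standard library API.

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy in bits: H(X) = -sum(p * log2(p)).

    `probabilities` is a sequence of outcome probabilities summing to 1.
    Zero-probability outcomes contribute nothing, by convention.
    """
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin carries 1 bit of uncertainty; a biased coin carries less.
print(shannon_entropy([0.5, 0.5]))   # 1.0
print(shannon_entropy([0.9, 0.1]))   # ~0.469
```

The fair coin maximizes uncertainty over two outcomes, which is why it attains the maximum entropy of 1 bit; any bias makes the outcome more predictable and lowers the entropy.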