Entropy in information theory
Entropy is a concept in physics and information theory that measures the amount of disorder or uncertainty in a system. In information theory, entropy quantifies the average amount of information contained in a message or data set: for a discrete random variable X, the Shannon entropy is H(X) = -Σ p(x) log2 p(x), measured in bits when the logarithm is taken base 2.
Example: In a coin toss, if the coin is fair (i.e., the probability of heads and of tails is each 0.5), the entropy is H = -(0.5 log2 0.5 + 0.5 log2 0.5) = 1 bit, meaning that on average one bit of information is needed to represent the outcome of the toss.
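To make the calculation concrete, here is a minimal Python sketch of the entropy formula above; the function name shannon_entropy is an illustrative choice, not something from the original text.

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy, in bits, of a discrete distribution given as a list of probabilities."""
    # Terms with p = 0 contribute nothing, so they are skipped to avoid log(0).
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Fair coin: two equally likely outcomes -> 1 bit, matching the example above.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# Biased coin: a more predictable outcome carries less information on average.
print(shannon_entropy([0.9, 0.1]))   # ~0.47 bits
```

The biased-coin call illustrates the general point in the definition: the less uncertain the outcome, the lower the entropy and the fewer bits are needed on average to describe it.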