Example: In a coin toss, if the coin is fair (i.e., the probability of getting heads or tails is 0.5), the entropy is H = −(0.5·log₂0.5 + 0.5·log₂0.5) = 1 bit, which means that one bit of information is needed to represent the outcome of the coin toss.
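This calculation can be sketched numerically; a minimal example using Python's standard library (the function name `entropy` is illustrative, not from any particular package):

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution.

    Terms with probability 0 contribute nothing (lim p*log p = 0).
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Fair coin: two equally likely outcomes -> exactly 1 bit.
print(entropy([0.5, 0.5]))  # 1.0

# Biased coin: the outcome is more predictable, so the
# entropy drops below 1 bit (about 0.47 bits here).
print(entropy([0.9, 0.1]))
```

A maximally biased coin (`[1.0, 0.0]`) gives 0 bits: the outcome is certain, so no information is needed to represent it.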