11.3k views
1 vote
For example, X is a random integer from 1 to 16. Now I get a piece of information: X is 3, 5, 9, or 14. This knowledge carries 2 bits of information about X. But if the list of options is random enough, I'd have to list all four integers to describe this knowledge, which takes much more than 2 bits. Even more may be needed if the probability distribution is not uniform and I have to attach a probability to each option. Is there a term for this phenomenon?

asked by User Efel (8.4k points)

1 Answer

5 votes

Final answer:

The phenomenon being described is related to the concept of entropy in information theory. The more uncertain or unpredictable a random variable is, the higher its entropy.

Step-by-step explanation:

In information theory, entropy measures the uncertainty in a random variable: the more unpredictable the variable, the higher its entropy, and the more bits are needed on average to specify its value.

In this case, X starts with log2(16) = 4 bits of entropy. Learning that X is one of four equally likely values leaves log2(4) = 2 bits of remaining uncertainty, so the knowledge conveys 4 − 2 = 2 bits of information. But writing the knowledge down, by listing four arbitrary integers from 1 to 16, takes about 4 × 4 = 16 bits, far more than the 2 bits of information it conveys. And as the questioner notes, if the probability distribution over the options is not uniform, even more data is needed to describe the knowledge, since each option's probability must be encoded as well.
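The arithmetic above can be sketched in a few lines. This is just an illustration of the gap described in the question, not a standard library routine:

```python
import math

# Uniform prior: X is one of 16 equally likely integers.
prior_bits = math.log2(16)           # 4.0 bits of initial uncertainty

# Learning "X is 3, 5, 9, or 14" leaves 4 equally likely options.
posterior_bits = math.log2(4)        # 2.0 bits of remaining uncertainty

# Information conveyed by the message about X.
info_gained = prior_bits - posterior_bits   # 2.0 bits

# Cost of *describing* the knowledge itself: listing four arbitrary
# integers from 1..16 at log2(16) = 4 bits each.
description_bits = 4 * math.log2(16)        # 16.0 bits

print(info_gained, description_bits)
```

The message is worth 2 bits about X, yet its shortest obvious encoding costs 16 bits, which is exactly the gap the question asks about.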

Therefore, the term for this phenomenon is the concept of entropy in information theory.

answered by User Enfyve (8.3k points)
