How do you measure information entropy?
Entropy can be calculated for a random variable X that takes discrete states k from a set K as follows: H(X) = -sum over k in K of p(k) * log(p(k))
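A minimal sketch of this formula in Python; the example probabilities and the choice of log base 2 (entropy in bits) are illustrative assumptions:

```python
from math import log2

def entropy(probs):
    """Shannon entropy H(X) = -sum(p(k) * log2(p(k))) over all states k."""
    # Terms with p = 0 contribute nothing, so they are skipped to avoid log2(0).
    return -sum(p * log2(p) for p in probs if p > 0)

# Fair coin: maximum uncertainty over two states -> 1 bit
print(entropy([0.5, 0.5]))   # 1.0

# Heavily biased coin: nearly deterministic -> close to 0 bits
print(entropy([0.9, 0.1]))   # ~0.469
```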
What is information entropy?
Information entropy is a concept from information theory. It tells how much information there is in an event. In general, the more certain or deterministic an event is, the less information it contains. Put differently, the more surprising or improbable an event is, the more information it carries, and entropy measures the average information across all possible outcomes of the variable.
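A small illustration of that point, assuming log base 2: the information content (surprisal) of a single event with probability p is -log2(p), so near-certain events carry little information and rare events carry a lot.

```python
from math import log2

for p in (0.99, 0.5, 0.01):
    print(f"p = {p:>4}: surprisal = {-log2(p):.3f} bits")
# p = 0.99: surprisal = 0.014 bits  (almost certain -> little information)
# p =  0.5: surprisal = 1.000 bits
# p = 0.01: surprisal = 6.644 bits  (rare -> a lot of information)
```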
Can entropy be higher than 1?
For a binary variable (two classes), entropy measured in bits lies between 0 and 1. With more than two classes, entropy can be greater than 1, but it means the same thing: a higher value indicates a higher level of disorder.
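A quick check of that claim, assuming entropy in bits (log base 2) and equally likely classes:

```python
from math import log2

def entropy(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))                # 1.0  (two classes: maximum is 1 bit)
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0  (four classes: maximum is 2 bits)
```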