Definition:

  • Entropy measures impurity: the more mixed a set of values, the higher its entropy

Sample Entropy:

  • Measures the impurity of a sample
    • the expected number of bits needed to encode a randomly drawn value of X (under the most efficient code)
    • −log2 P(X=i) is the number of bits needed to encode the event “X=i”
      • For example, if P(X=i) = 1/4, then −log2(1/4) = 2, so 2 bits are needed to encode this event
    • weighting by P(X=i) gives the expected number of bits contributed by the event “X=i”
    • summing over all possible values of X gives the sample entropy: H(X) = −Σ_i P(X=i) log2 P(X=i)
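The sum above can be sketched in Python; the function name and the coin-flip samples below are illustrative, not from the source:

```python
import math
from collections import Counter

def sample_entropy(values):
    """H(X) of a sample, in bits: sum over values i of -P(X=i) * log2 P(X=i)."""
    counts = Counter(values)
    n = len(values)
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

# A fair coin is maximally impure (1 bit); a constant sample is pure (0 bits).
print(sample_entropy(["H", "T", "H", "T"]))  # 1.0
print(sample_entropy(["H", "H", "H", "H"]))  # 0.0
```

Note that only observed values contribute: outcomes with count 0 are skipped, which also avoids evaluating log2(0).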

Conditional Entropy:

  • The entropy of a variable Y conditioned on a random variable X
    • by definition: H(Y|X) = Σ_x P(X=x) · H(Y|X=x), the expected remaining entropy of Y after observing X
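A minimal sketch of this definition, assuming the sample is given as two parallel lists (the weather/play example data is made up for illustration):

```python
import math
from collections import Counter

def entropy(values):
    """H(Y) in bits: sum over values i of -P(Y=i) * log2 P(Y=i)."""
    counts = Counter(values)
    n = len(values)
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

def conditional_entropy(xs, ys):
    """H(Y|X) = sum over x of P(X=x) * H(Y | X=x)."""
    groups = {}
    for x, y in zip(xs, ys):
        groups.setdefault(x, []).append(y)  # partition Y by the value of X
    n = len(xs)
    return sum(len(g) / n * entropy(g) for g in groups.values())

xs = ["sunny", "sunny", "rain", "rain"]
ys = ["yes",   "yes",   "no",   "yes"]
# H(Y|X=sunny) = 0, H(Y|X=rain) = 1, each weighted by P = 1/2:
print(conditional_entropy(xs, ys))  # 0.5
```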

Information Gain:

  • Decrease in entropy (less randomness) after knowing X: IG(Y, X) = H(Y) − H(Y|X)
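Combining the two quantities above gives information gain; a sketch under the same assumptions (parallel lists, illustrative data):

```python
import math
from collections import Counter

def entropy(values):
    counts = Counter(values)
    n = len(values)
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

def conditional_entropy(xs, ys):
    groups = {}
    for x, y in zip(xs, ys):
        groups.setdefault(x, []).append(y)
    n = len(xs)
    return sum(len(g) / n * entropy(g) for g in groups.values())

def information_gain(xs, ys):
    """IG(Y, X) = H(Y) - H(Y|X): entropy removed by knowing X."""
    return entropy(ys) - conditional_entropy(xs, ys)

xs = ["sunny", "sunny", "rain", "rain"]
ys = ["yes",   "yes",   "no",   "yes"]
# H(Y) ~ 0.811 bits, H(Y|X) = 0.5 bits, so IG ~ 0.311 bits.
print(information_gain(xs, ys))
```

In decision-tree learning, this is the quantity maximized when choosing which attribute X to split on.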