Intro

Cross-entropy is a measure from information theory that quantifies how well a predicted probability distribution (q) approximates the true distribution (p). Intuitively, it is the average number of bits needed to encode outcomes drawn from (p) when using a code optimized for (q); the closer (q) is to (p), the lower the cross-entropy.
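
For discrete distributions over outcomes (x), it is defined as:

```latex
H(p, q) = -\sum_{x} p(x) \log q(x)
```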

  • Applies to both binary and multi-class problems, depending on how (p) and (q) are defined.
  • Commonly used as a loss function in neural network training (see the sketch after this list).
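
A minimal NumPy sketch of the formula above; the function name `cross_entropy` and the example probabilities are illustrative, not from any particular library:

```python
import numpy as np

def cross_entropy(p, q, eps=1e-12):
    """Cross-entropy H(p, q) between two discrete distributions.

    p: true distribution (e.g. a one-hot label vector)
    q: predicted distribution (should sum to 1)
    eps guards against log(0) when q assigns zero probability.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return -np.sum(p * np.log(q + eps))

# Multi-class example: the true class is index 1 (one-hot p)
p = [0.0, 1.0, 0.0]
q = [0.1, 0.7, 0.2]          # model's predicted probabilities
print(cross_entropy(p, q))   # ~0.357, i.e. -log(0.7)

# Binary case: the same formula with just two outcomes
p_bin = [1.0, 0.0]
q_bin = [0.9, 0.1]
print(cross_entropy(p_bin, q_bin))  # ~0.105, i.e. -log(0.9)
```

Note that when (p) is one-hot, the sum collapses to a single term, `-log q(true class)`, which is why this loss is also called negative log-likelihood in the classification setting.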