Machine Learning

Entropy, Cross-Entropy

Naranjito 2021. 3. 31. 11:56
  • Entropy
Entropy measures the level of uncertainty in a probability distribution. For a binary variable (with a base-2 logarithm) it ranges between 0 and 1.

0      <      entropy      <      1
certain                          uncertain

The greater the entropy, the greater the uncertainty in the probability distribution; the smaller the entropy, the less the uncertainty.
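As a minimal sketch of the standard definition H(p) = -Σ p(x) log₂ p(x) (the function and variable names below are illustrative, not from this post), the snippet computes the entropy of a coin flip:

```python
import numpy as np

def entropy(probs, eps=1e-12):
    """Shannon entropy (base 2) of a discrete probability distribution."""
    probs = np.asarray(probs, dtype=float)
    return -np.sum(probs * np.log2(probs + eps))

# A fair coin is maximally uncertain: entropy = 1 bit.
print(entropy([0.5, 0.5]))    # ~1.0
# A heavily biased coin is nearly certain: entropy close to 0.
print(entropy([0.99, 0.01]))  # ~0.08
```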

Reference: towardsdatascience.com/cross-entropy-loss-function-f38c4ec8643e

 

  • Cross-Entropy

Cross-entropy measures how far the predicted probability distribution is from the actual (expected) distribution. It is used as the loss when adjusting model weights during training: the smaller the loss, the better the model.
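A minimal sketch of the standard cross-entropy H(p, q) = -Σ p(x) log q(x), assuming a one-hot true label and predicted class probabilities (names such as y_true and y_pred are illustrative; natural log is used here, as in most frameworks):

```python
import numpy as np

def cross_entropy(y_true, y_pred, eps=1e-12):
    """Cross-entropy H(p, q) = -sum p(x) * log q(x) for discrete distributions."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return -np.sum(y_true * np.log(y_pred + eps))

# True class is index 1 (one-hot). A confident, correct prediction gives a small loss...
print(cross_entropy([0, 1, 0], [0.05, 0.90, 0.05]))  # ~0.105
# ...while a confident, wrong prediction gives a large loss.
print(cross_entropy([0, 1, 0], [0.80, 0.10, 0.10]))  # ~2.303
```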