Machine Learning

Entropy, Cross-Entropy

Naranjito 2021. 3. 31. 11:56
  • Entropy
The level of uncertainty in a probability distribution. For a binary outcome measured in bits, it ranges between 0 and 1.

0 (certain)   ≤   entropy   ≤   1 (most uncertain)

The greater the entropy, the greater the uncertainty of the probability distribution; the smaller the entropy, the less the uncertainty, as sketched below.
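
A minimal sketch (assuming Python with NumPy; binary_entropy is an illustrative helper, not from the referenced article) showing that binary entropy is near 0 for an almost certain outcome and peaks at 1 bit when p = 0.5:

import numpy as np

def binary_entropy(p):
    # H(p) = -p*log2(p) - (1-p)*log2(1-p), measured in bits
    p = np.clip(p, 1e-12, 1 - 1e-12)  # avoid log(0)
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

print(binary_entropy(0.5))   # 1.0   -> most uncertain
print(binary_entropy(0.99))  # ~0.08 -> almost certain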

Reference: https://towardsdatascience.com/cross-entropy-loss-function-f38c4ec8643e

 

  • Cross-Entropy

Cross-entropy measures how far the predicted probability distribution is from the actual (expected) distribution. It is used as the loss when adjusting model weights during training: the smaller the loss, the better the model.
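
A minimal sketch (assuming Python with NumPy and one-hot encoded true labels; the function name and example probabilities are illustrative) of how cross-entropy rewards a confident, correct prediction with a small loss:

import numpy as np

def cross_entropy(y_true, y_pred):
    # CE = -sum_i y_true[i] * log(y_pred[i])
    y_pred = np.clip(y_pred, 1e-12, 1.0)  # avoid log(0)
    return -np.sum(y_true * np.log(y_pred))

y_true = np.array([0, 1, 0])        # one-hot: actual class is index 1
good   = np.array([0.1, 0.8, 0.1])  # confident, correct prediction
bad    = np.array([0.6, 0.2, 0.2])  # wrong prediction

print(cross_entropy(y_true, good))  # ~0.22 (small loss -> better model)
print(cross_entropy(y_true, bad))   # ~1.61 (large loss -> worse model)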
