Relation

loss function/negative log likelihood loss

Assume that only one class is the correct one and that y has one output unit per class, so y is a one-hot vector. The cross-entropy loss is then simply the negative log of the output probability corresponding to the correct class.

$L_{CE}(\hat{y}, y) = -\log \hat{y}_i = -\log \frac{\exp(z_i)}{\sum_{j=1}^{k} \exp(z_j)}$
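A minimal sketch of this loss in plain Python (the function name and example logits are illustrative, not from the note): compute the softmax probability of the correct class i from the logits z, then take its negative log.

```python
import math

def cross_entropy(z, i):
    """Cross-entropy loss for one example with one-hot target.

    z: list of raw logits, one per class.
    i: index of the correct class.
    Returns -log(softmax(z)[i]).
    """
    denom = sum(math.exp(zj) for zj in z)
    return -math.log(math.exp(z[i]) / denom)

# Two classes with equal logits: softmax gives 0.5 to each,
# so the loss is -log(0.5) = log(2).
loss = cross_entropy([0.0, 0.0], 0)
```

Note that in practice libraries combine the softmax and the log into one numerically stable step rather than exponentiating the raw logits directly.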


Updated 2021-11-04

Tags

Data Science