Learn Before
Relation
loss function/negative log likelihood loss
Assume that only one class is the correct one and that there is one output unit in y for each class, so y is a one-hot vector. The cross-entropy loss is then simply the negative log of the output probability corresponding to the correct class.
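A minimal NumPy sketch of this special case (the probabilities and the correct class here are made up for illustration):

```python
import numpy as np

# Hypothetical model output: softmax probabilities over 3 classes
y_hat = np.array([0.1, 0.7, 0.2])

# One-hot target vector: class 1 is the correct class
y = np.array([0.0, 1.0, 0.0])

# Negative log likelihood: -log of the probability assigned
# to the correct class. The dot product with the one-hot
# vector picks out exactly that probability.
loss = -np.log(np.dot(y, y_hat))

print(loss)  # -log(0.7) ≈ 0.357
```

Because y is one-hot, the full cross-entropy sum -Σ y_k log(ŷ_k) collapses to this single term.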
Updated 2021-11-04
Tags
Data Science