Learn Before
Concept
Perplexity
Perplexity is a standard way to measure the quality of a language model. The perplexity of a model θ on an unseen test set is the inverse of the probability that θ assigns to the test set, normalized by the length of the test set (the number of tokens).
Minimizing perplexity is therefore equivalent to maximizing the probability the model assigns to the test set.
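The definition above can be sketched in a few lines of Python. This is a minimal illustration, assuming we already have the per-token probabilities the model assigned to a test sequence; computing in log space avoids numerical underflow on long sequences.

```python
import math

def perplexity(token_probs):
    """Perplexity of a test set given the probability the model
    assigned to each of its N tokens:
        PP = (p_1 * p_2 * ... * p_N) ** (-1/N)
           = exp(-(1/N) * sum(log p_i))
    Computed in log space to avoid underflow on long sequences."""
    n = len(token_probs)
    total_log_prob = sum(math.log(p) for p in token_probs)
    return math.exp(-total_log_prob / n)

# A model that assigns each of 4 test tokens probability 0.25 has
# perplexity 4: it is as "confused" as a uniform choice among 4 options.
print(round(perplexity([0.25, 0.25, 0.25, 0.25]), 6))  # → 4.0

# A model that assigns higher probability (0.5) to each token is better:
# lower perplexity.
print(round(perplexity([0.5, 0.5, 0.5, 0.5]), 6))  # → 2.0
```

Note that because perplexity is the inverse of the (length-normalized) test-set probability, a lower perplexity corresponds to a higher probability assigned to the test set, matching the equivalence stated above.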
Updated 2022-06-28
Tags
Data Science