Learn Before
Concept

General equations of the n-gram model

conditional probability of the next word: P(w_n \mid w_{1:n-1}) \approx P(w_n \mid w_{n-N+1:n-1})

probability of a complete word sequence: P(w_{1:n}) \approx \prod_{k=1}^{n} P(w_k \mid w_{k-N+1:k-1})
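The two formulas above can be sketched as a minimal bigram model (N = 2), where each conditional probability is estimated by maximum likelihood from counts: P(w_k | w_{k-1}) = C(w_{k-1} w_k) / C(w_{k-1}). The sentence-boundary markers `<s>` and `</s>` are an assumption added here for illustration; the formulas leave boundary handling implicit.

```python
from collections import defaultdict

def train_bigram(corpus):
    """Count bigrams and their left-context unigrams over tokenized
    sentences, padding each with <s> and </s> (an illustrative choice)."""
    bigram_counts = defaultdict(int)
    context_counts = defaultdict(int)
    for sentence in corpus:
        tokens = ["<s>"] + sentence + ["</s>"]
        for prev, cur in zip(tokens, tokens[1:]):
            bigram_counts[(prev, cur)] += 1
            context_counts[prev] += 1
    return bigram_counts, context_counts

def sequence_probability(sentence, bigram_counts, context_counts):
    """P(w_{1:n}) ~ product over k of P(w_k | w_{k-1}), with each factor
    estimated as C(w_{k-1} w_k) / C(w_{k-1})."""
    tokens = ["<s>"] + sentence + ["</s>"]
    prob = 1.0
    for prev, cur in zip(tokens, tokens[1:]):
        if context_counts[prev] == 0:
            return 0.0  # unseen context: probability is undefined/zero here
        prob *= bigram_counts[(prev, cur)] / context_counts[prev]
    return prob

corpus = [["i", "am", "sam"], ["sam", "i", "am"]]
bc, cc = train_bigram(corpus)
print(sequence_probability(["i", "am"], bc, cc))  # → 0.25
```

With this toy corpus, P(i | &lt;s&gt;) = 1/2, P(am | i) = 2/2, and P(&lt;/s&gt; | am) = 1/2, so the sequence probability is 0.25. A real model would add smoothing (e.g. add-one) to avoid zero probabilities for unseen bigrams.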


Updated 2022-07-04

Tags

Deep Learning

Data Science
