Learn Before
Lack of lexical conditioning in PCFGs
CFG rules don't model syntactic facts about specific words, leading to problems with subcategorization ambiguities, prepositional phrase (PP) attachment, and coordination ambiguities. Words do play some role in a PCFG, since the parse probability includes the probability of each word given its part-of-speech. But in a PP-attachment ambiguity, the two competing parses differ only in which grammar rules they use (for example, VP → V NP PP versus VP → V NP together with NP → NP PP), so depending on how those rule probabilities are set, a PCFG will always prefer either NP attachment or VP attachment, regardless of the actual words involved. As it happens, NP attachment is slightly more common in English, so rule probabilities trained on a corpus might always prefer NP attachment, misparsing every sentence that actually requires VP attachment. Probabilistic context-free grammars are therefore incapable of modeling these important structural and lexical dependencies.
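To make this concrete, here is a minimal sketch using NLTK. The toy grammar and all of its rule probabilities are invented for illustration, and the sentence is a classic PP-attachment example; the point is only to show that the word probabilities appear identically in both parses and cancel, so the attachment decision comes down to the competing rule probabilities alone.

```python
# A minimal sketch of how a bare PCFG decides PP attachment from rule
# probabilities alone.  The grammar and its probabilities are toy values
# invented for this illustration.
from nltk import PCFG
from nltk.parse.pchart import InsideChartParser

toy_grammar = PCFG.fromstring("""
    S   -> NP VP      [1.0]
    VP  -> V NP PP    [0.3]
    VP  -> V NP       [0.7]
    NP  -> NP PP      [0.4]
    NP  -> Det N      [0.3]
    NP  -> N          [0.3]
    PP  -> P NP       [1.0]
    N   -> 'workers'  [0.4]
    N   -> 'sacks'    [0.3]
    N   -> 'bin'      [0.3]
    V   -> 'dumped'   [1.0]
    P   -> 'into'     [1.0]
    Det -> 'a'        [1.0]
""")

parser = InsideChartParser(toy_grammar)
sentence = "workers dumped sacks into a bin".split()

# The two parses differ only in VP -> V NP PP (0.3) versus
# VP -> V NP times NP -> NP PP (0.7 * 0.4 = 0.28); every word's
# probability occurs in both parses and cancels out.  So the same
# attachment wins no matter which verbs and nouns we substitute.
for tree in parser.parse(sentence):
    print(f"p = {tree.prob():.6f}")
    tree.pretty_print()
```

With these toy numbers the verb-attachment parse happens to win (0.3 versus 0.28 on the distinguishing rules), but swapping in any other words leaves the ranking unchanged, which is exactly the lack of lexical conditioning described above.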
Tags
Deep Learning (in Machine learning)
Data Science