Learn Before
Compositional Generalization in NLP
Compositional generalization is the ability of an NLP system to understand and generate novel combinations of familiar components by applying learned composition rules to unseen inputs. This ability is crucial for robust language understanding, yet it remains a significant challenge: models often perform well on combinations seen during training but fail on new combinations of the same parts. The need for improved compositional generalization in models like LLMs drives research into more sophisticated problem decomposition methods.
Tags
Ch.3 Prompting - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Related
Semantic Parsing as an Example of Compositionality
Evaluating a Language Processing Approach
An engineering team is building a system to interpret complex user requests for a smart home assistant. For the request, 'Set the bedroom thermostat to 72 degrees and then dim the lights,' which of the following system designs most closely follows the principle that a complex expression's meaning is determined by the meanings of its parts and the rules for combining them?
Semantic Parsing
Diagnosing a System Failure in Language Understanding
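The smart-home design question above can be illustrated with a hedged sketch (the clause patterns and function names are hypothetical, not a real assistant API): a compositional design assigns each clause a meaning via its own rule, then combines the clause meanings with a sequencing rule for 'and then'.

```python
def parse_clause(clause):
    """Meaning of one clause, from the meanings of its words."""
    words = clause.split()
    if words[0] == "set" and "thermostat" in words:
        room = words[2]                            # e.g. "bedroom"
        temp = int(words[words.index("to") + 1])   # e.g. 72
        return ("SET_TEMP", room, temp)
    if words[0] == "dim" and "lights" in words:
        return ("DIM_LIGHTS",)
    raise ValueError(f"unknown clause: {clause!r}")

def interpret_request(request):
    """Meaning of the whole = meanings of the parts + the combining rule."""
    clauses = request.lower().rstrip(".").split(" and then ")
    return [parse_clause(c) for c in clauses]

interpret_request("Set the bedroom thermostat to 72 degrees and then dim the lights")
# -> [('SET_TEMP', 'bedroom', 72), ('DIM_LIGHTS',)]
```

Because the sequencing rule is independent of the clause rules, the same design handles new orderings and pairings of known commands without retraining.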
Learn After
Compositional Reasoning Tasks for LLMs
Diagnosing a Model's Language Limitation
An NLP model is trained on a dataset of commands. The training data includes 'walk left', 'walk right', 'run left', 'run right', 'jump twice', and 'jump three times'. The model performs perfectly on these commands. However, when tested on the new, unseen command 'jump left', the model fails. What does this failure most likely indicate about the model?
The Challenge of Novelty in Language Models