A natural language processing research team in the early 2010s is deciding how to allocate a significant budget increase. They can either use it to increase their training dataset size by a factor of ten or to fund a project to design a more complex model architecture. Based on the conventional wisdom of that era, which of the following arguments would most persuasively justify choosing the architectural project over the massive data increase?
Tags
Ch.2 Generative Models - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Evaluation in Bloom's Taxonomy
Cognitive Psychology
Psychology
Social Science
Empirical Science
Science
Related
AI Research Funding Decision
The conventional wisdom in natural language processing, prior to the advent of very large models, held that simply scaling up a model and its training data would yield diminishing returns, and that meaningful progress depended instead on designing better, more sophisticated model architectures. It was only later, with the empirical discovery of scaling laws, that large-scale increases in model size and data were shown to produce consistent and predictable performance improvements.