In the context of language modeling, the terms 'word' and 'token' are sometimes used interchangeably as a simplification, but they are not equivalent: modern tokenizers (e.g., BPE or WordPiece) split text into subword units, so a single token may be a whole word, a fragment of a word, punctuation, or whitespace.
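As a toy illustration of why tokens need not be complete words, the sketch below implements a greedy longest-match subword tokenizer over a made-up vocabulary (the vocabulary and function name are illustrative only, not taken from any real model or library):

```python
def subword_tokenize(word, vocab):
    """Greedy longest-match tokenization: repeatedly take the longest
    vocabulary entry that is a prefix of the remaining text."""
    tokens = []
    i = 0
    while i < len(word):
        for j in range(len(word), i, -1):
            piece = word[i:j]
            if piece in vocab:
                tokens.append(piece)
                i = j
                break
        else:
            # No vocabulary entry matches: fall back to a single character.
            tokens.append(word[i])
            i += 1
    return tokens

# Toy vocabulary -- purely illustrative, not from a real tokenizer.
vocab = {"un", "believ", "able", "token", "ization"}

print(subword_tokenize("unbelievable", vocab))  # ['un', 'believ', 'able']
print(subword_tokenize("tokenization", vocab))  # ['token', 'ization']
```

Here one "word" becomes several tokens, which is why treating the two terms as interchangeable is only a convenient simplification.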
Tags
Ch.2 Generative Models - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Comprehension in Revised Bloom's Taxonomy
Cognitive Psychology
Psychology
Social Science
Empirical Science
Science
Related
Analyzing a Technical Miscommunication
In a technical blog post introducing a new language model, the author writes: 'For simplicity in this overview, we will treat 'words' and 'tokens' as interchangeable units of text.' What is the most accurate interpretation of the author's statement?
The author is making a deliberate simplification for an introductory audience: in practice, language models operate on tokens produced by subword tokenizers (such as BPE or WordPiece), so a token is not always a single, complete word.