A researcher is preparing the sentence 'Neural networks learn patterns' for input into a sequence model. The model's design specifies that a special token must be placed at the very beginning of every input sequence to act as a start-of-sequence marker. Given this requirement, how should the researcher format the tokenized input?
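The required format prepends the start-of-sequence marker, conventionally written ⟨s⟩ (also called the BOS token), before the sentence tokens. A minimal sketch, assuming a toy whitespace tokenizer and the literal string "&lt;s&gt;" as the marker (real tokenizers use subword units and a vocabulary-defined BOS token):

```python
def tokenize_with_bos(sentence: str, bos_token: str = "<s>") -> list[str]:
    """Whitespace-tokenize a sentence and prepend the start-of-sequence token."""
    return [bos_token] + sentence.split()

tokens = tokenize_with_bos("Neural networks learn patterns")
print(tokens)  # ['<s>', 'Neural', 'networks', 'learn', 'patterns']
```

The key point is only the position of the marker: it must be the first element of the token sequence, before any content tokens.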
Tags
Ch.1 Pre-training - Foundations of Large Language Models
Ch.2 Generative Models - Foundations of Large Language Models
Application in Bloom's Taxonomy
Computing Sciences
Related
Analyzing a Preprocessing Step
When preparing an input sequence for certain neural network architectures, a special token is conventionally placed at the very beginning to serve as a start-of-sequence marker. This token is denoted as ____.