Learn Before
Debugging Model Input Formatting
Based on the case study below, identify the critical error in the input formatting that is likely causing the model's poor performance and explain why it is an error.
Tags
Ch.1 Pre-training - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Analysis in Bloom's Taxonomy
Cognitive Psychology
Psychology
Social Science
Empirical Science
Science
Related
Example of Token Masking in a BERT Input Sequence
Example of an Unchanged Token in a BERT Input Sequence
Example of Random Token Replacement in a BERT Input Sequence
A language model is designed to process pairs of sentences by concatenating them into a single sequence. This model requires a special token at the beginning of the entire sequence to be used for classification tasks, and another special token to mark the boundary between the two sentences and to signify the end of the sequence. Given the two sentences 'The sky is blue.' and 'The grass is green.', which of the following options correctly formats them as a single input sequence for this model?
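The formatting rule described above matches the BERT convention, where the classification token is `[CLS]` and the separator token is `[SEP]`. A minimal sketch, assuming those are the special tokens in question (the helper function name is illustrative, not from any particular library):

```python
# Hypothetical helper illustrating BERT-style sentence-pair formatting:
# [CLS] starts the sequence (its final hidden state is used for
# classification), and [SEP] both separates the two sentences and
# terminates the sequence.

def format_sentence_pair(sentence_a: str, sentence_b: str) -> str:
    """Concatenate two sentences into a single BERT-style input sequence."""
    return f"[CLS] {sentence_a} [SEP] {sentence_b} [SEP]"

result = format_sentence_pair("The sky is blue.", "The grass is green.")
print(result)
# [CLS] The sky is blue. [SEP] The grass is green. [SEP]
```

Note that `[SEP]` appears twice: once at the sentence boundary and once at the end of the sequence. Omitting either occurrence, or placing `[CLS]` anywhere other than the start, would be an incorrectly formatted input.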
Analyzing Input Sequence Structure