Constructing a Sentence-Pair Input Sequence
Given two sentences, Sentence A: 'The model was pre-trained.' and Sentence B: 'It requires fine-tuning.', construct the complete, single input sequence that combines them for a sentence-pair task. Your answer should include all necessary special tokens in their correct positions.
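As a sketch of the expected structure (assuming the standard BERT convention, in which a `[CLS]` token opens the sequence and a `[SEP]` token closes each segment), the combination can be expressed as:

```python
def build_pair_input(sentence_a: str, sentence_b: str) -> str:
    """Join two sentences into one BERT-style sequence for a sentence-pair task.

    Assumes the common [CLS]/[SEP] convention; actual tokenizers also map
    the text to token IDs and segment IDs, which is omitted here.
    """
    return f"[CLS] {sentence_a} [SEP] {sentence_b} [SEP]"

print(build_pair_input("The model was pre-trained.", "It requires fine-tuning."))
# [CLS] The model was pre-trained. [SEP] It requires fine-tuning. [SEP]
```

In practice a library tokenizer would produce this layout automatically when given both sentences; the helper above only illustrates the token positions.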
Tags
Ch.1 Pre-training - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Application in Bloom's Taxonomy
Cognitive Psychology
Psychology
Social Science
Empirical Science
Science
Related
A language model is being prepared for a task that involves understanding the relationship between two sentences. Given Sentence A: 'The model learns patterns.' and Sentence B: 'It then makes predictions.', which of the following represents the correctly formatted single input sequence for the model, using special tokens to delineate the structure?
Input Sequence Formatting Analysis
Constructing a Sentence-Pair Input Sequence
Illustration of Transformer Encoding for Sequence Classification