Example of a Multi-Turn Conversation for LLM Fine-Tuning

A sample multi-turn conversation illustrates how a Large Language Model maintains context across turns. The dialogue proceeds as follows:

  • User: 'Who won the FIFA World Cup 2022?'
  • Assistant: 'Argentina won the FIFA World Cup 2022.'
  • User: 'Where was it held?'
  • Assistant: 'The 2022 FIFA World Cup was held in Qatar.'
  • User: 'How many times has Argentina won the World Cup?'
  • Assistant: 'Argentina has won the FIFA World Cup three times.'

For the model to resolve 'it' in the second question ('Where was it held?') to the 2022 World Cup, it must draw on the context of the previous turn rather than treat each question in isolation.
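For fine-tuning, a dialogue like this is typically serialized as an ordered list of role-tagged messages, so that each assistant reply is conditioned on the full preceding history. The sketch below shows one way to do this in Python; the `role`/`content` message layout follows the common OpenAI-style chat format, and the `<|user|>`/`<|assistant|>` delimiter tokens are an assumption for illustration, since the exact chat template varies by model.

```python
# The example dialogue as role-tagged messages (OpenAI-style layout;
# the delimiter tokens below are illustrative, not model-specific).
conversation = [
    {"role": "user", "content": "Who won the FIFA World Cup 2022?"},
    {"role": "assistant", "content": "Argentina won the FIFA World Cup 2022."},
    {"role": "user", "content": "Where was it held?"},
    {"role": "assistant", "content": "The 2022 FIFA World Cup was held in Qatar."},
    {"role": "user", "content": "How many times has Argentina won the World Cup?"},
    {"role": "assistant", "content": "Argentina has won the FIFA World Cup three times."},
]

def build_training_text(messages):
    """Flatten a multi-turn dialogue into a single training string.

    Turns are kept in order, so when the model learns to predict an
    assistant reply it sees the entire history — which is what lets it
    resolve 'it' in the second question to the 2022 World Cup.
    """
    parts = [f"<|{m['role']}|> {m['content']}" for m in messages]
    return "\n".join(parts)

training_text = build_training_text(conversation)
```

Because earlier turns are concatenated before later ones, the token "Qatar" in the second assistant reply only appears after the ambiguous question "Where was it held?", mirroring the left-to-right context the model conditions on.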

Updated 2026-04-20

Ch.2 Generative Models - Foundations of Large Language Models