Concept

Extending k-NN Datastore Context with a Training Dataset

An alternative to using only the current sequence for context is to populate the k-NN datastore with key-value pairs drawn from a larger collection of sequences, such as an entire training dataset. This lets a large language model draw on a broader, more general context when making predictions, since the nearest neighbors retrieved at inference time can come from any sequence the datastore has seen, not just the one currently being generated.
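The idea above can be sketched in a minimal way: walk over every token position in every training sequence, store a (prefix representation, next token) pair, and at inference time turn the k nearest stored prefixes into a next-token distribution. The `encode` function here is a toy stand-in of my own devising (a recency-weighted bag-of-tokens vector); a real kNN-LM would use the model's hidden state at that position, and would typically interpolate the retrieved distribution with the model's own softmax output.

```python
import numpy as np

VOCAB_SIZE = 16
DIM = 8

def encode(prefix):
    # Toy stand-in for the LM's context representation: a recency-weighted
    # bag-of-tokens vector. A real kNN-LM uses transformer hidden states.
    v = np.zeros(DIM)
    for age, tok in enumerate(reversed(prefix)):
        v[tok % DIM] += 1.0 / (age + 1)
    return v

def build_datastore(sequences):
    # One (key, value) pair per token position across the whole training set:
    # key = representation of the prefix, value = the token that followed it.
    keys, values = [], []
    for seq in sequences:
        for t in range(1, len(seq)):
            keys.append(encode(seq[:t]))
            values.append(seq[t])
    return np.stack(keys), np.array(values)

def knn_next_token_probs(prefix, keys, values, k=3):
    # Retrieve the k stored contexts closest to the current prefix and
    # convert their recorded next tokens into a probability distribution,
    # weighting closer neighbors more heavily.
    query = encode(prefix)
    dists = np.linalg.norm(keys - query, axis=1)
    nearest = np.argsort(dists)[:k]
    weights = np.exp(-dists[nearest])
    weights /= weights.sum()
    probs = np.zeros(VOCAB_SIZE)
    for idx, w in zip(nearest, weights):
        probs[values[idx]] += w
    return probs

# Three short training sequences over a small vocabulary; the first two
# share the prefix [1, 2, 3] but continue differently (4 vs. 5).
training_set = [[1, 2, 3, 4], [1, 2, 3, 5], [7, 8, 9, 3]]
keys, values = build_datastore(training_set)
p = knn_next_token_probs([1, 2, 3], keys, values, k=3)
```

Because the datastore holds two exact matches for the prefix `[1, 2, 3]`, the retrieved distribution places most of its mass on tokens 4 and 5, the two continuations observed in training. In practice, the brute-force nearest-neighbor search here would be replaced by an approximate index (e.g. FAISS) so the datastore can scale to an entire training corpus.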


Updated 2026-04-23

Tags

Ch.2 Generative Models - Foundations of Large Language Models
