Learn Before
k-NN as a Popular Retrieval-Based External Memory Method
The k-nearest neighbors (k-NN) algorithm is a popular retrieval-based method for implementing external memory. It operates by creating a datastore of key-value pairs, with each pair representing a specific context state. The concept of 'context' is flexible and not restricted to a single sequence's history; it can be as broad as an entire dataset. This allows for retrieving the most relevant context from a large collection of sequences, not just the current one. When presented with a new query, the model uses k-NN to find the most similar context representations from the datastore to enhance its predictions.

References
Reference of Foundations of Large Language Models Course
Tags
Ch.2 Generative Models - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Related
Using Retrieved Context to Improve Attention
Retrieval-Based Methods as a Solution for Long-Context Processing
Unsuitability of External Memory for Streaming Contexts
k-NN as a Popular Retrieval-Based External Memory Method
Computational Cost of External Memory Models
Architectural Design for a Real-Time Chat Application
A company is building a question-answering system to help employees query a massive, static knowledge base of over 100,000 internal documents. The core language model has a fixed input size that is much smaller than the total size of the knowledge base. Which approach is the most effective and scalable for ensuring the model can access the necessary information to answer specific user queries accurately?
Evaluating the Use of External Memory Systems for LLMs
Augmented Input Formula for External Memories
Comparison of External Memories in LLMs
Learn After
k-NN Memory Retrieval
Integrating k-NN Memory with Local Memory in Attention
Populating a k-NN Datastore for Language Modeling
Equivalence Between k-NN and Sparse Attention Models
k-NN Language Modeling (k-NN LM)
Vector Database
A language model is designed to be a question-answering assistant for a large corporate knowledge base containing thousands of separate project documents. A user asks a question about 'Project Alpha,' but the most relevant technical detail needed to answer it is located in a document for 'Project Zeta,' a completely unrelated past project. Which statement best explains the unique advantage of using a k-nearest neighbors (k-NN) based external memory system in this scenario?
Analyzing Long-Range Consistency in Language Models
In a k-NN based external memory system, the datastore of key-value pairs is limited to representing only the context states from the current, single sequence being processed.