Learn Before
Recurrent Network as a Cache Mechanism
A recurrent network can serve as a memory cache by maintaining a fixed-size memory state that is updated at each time step. As illustrated, at step i, the current key-value pair, denoted as (k_i, v_i), is combined with the previous memory state, Mem_{i-1}, through an Update function. This function, which can be a recurrent neural network, produces a new memory state, Mem_i. This mechanism compresses the entire history of key-value pairs into a constant-size memory component (e.g., size 1x2), making it an efficient caching strategy. The process is defined by the recurrent formula:

Mem_i = Update(k_i, v_i, Mem_{i-1})
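As a minimal sketch of this mechanism, the code below implements one concrete choice of Update function: a cumulative average of all key-value pairs seen so far (the variant named in the Related items). The function name `update`, the 2 x d memory layout (row 0 for averaged keys, row 1 for averaged values), and the dimension d are illustrative assumptions, not taken from the original; the point is only that the memory stays the same size no matter how long the sequence grows.

```python
import numpy as np

def update(memory, step, key, value):
    # Cumulative-average Update function (one possible choice): the new
    # memory is the running mean of all (k_i, v_i) pairs seen so far.
    # `memory` is a 2 x d array (row 0 averages the keys, row 1 the
    # values), so its size is constant regardless of sequence length.
    pair = np.stack([key, value])            # current (k_i, v_i) as 2 x d
    return memory + (pair - memory) / (step + 1)

d = 4
memory = np.zeros((2, d))                    # Mem_0: empty fixed-size cache
rng = np.random.default_rng(0)
pairs = [(rng.standard_normal(d), rng.standard_normal(d)) for _ in range(100)]

for i, (k, v) in enumerate(pairs):
    memory = update(memory, i, k, v)         # Mem_i = Update(k_i, v_i, Mem_{i-1})

# After 100 steps the memory is still 2 x d, and row 0 holds the mean key.
assert memory.shape == (2, d)
assert np.allclose(memory[0], np.mean([k for k, _ in pairs], axis=0))
```

The trade-off raised in the questions below is visible here: the cache never grows, but averaging is lossy, so individual pairs from early in the sequence cannot be recovered from the compressed state.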
Tags
Ch.2 Generative Models - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Related
General Formula for Recurrent Memory Update
Cumulative Average of Keys and Values for Memory Component
Recurrent Network as a Cache Mechanism
A system is designed to process an extremely long, continuous sequence of information. To manage this, it uses a memory cache that is updated at each step: a new key-value pair is combined with the entire compressed memory from the previous step to form a new, equally compressed memory state. What is the primary trade-off inherent in this design?
A system maintains a fixed-size memory cache by processing a sequence of key-value pairs one at a time. Arrange the following events in the correct chronological order for a single update step.
Memory Cache State Calculation
Learn After
Critique of a Compressive Memory System
A memory system is designed to process a long sequence of key-value pairs. At each step, it updates a single, fixed-size memory state using the formula:
New_Memory = Update_Function(Current_Pair, Previous_Memory). What is the most significant trade-off inherent in this design?
Applying a Recurrent Cache to User Session Tracking