Short Answer

Comparing Architectures for Real-Time Data Streams

A company is building a system to perform real-time analysis on a continuous, high-volume data stream. It is considering two architectural approaches:

  • Approach 1: For each new piece of data, the system re-processes the entire sequence of all data received up to that point to maintain context.
  • Approach 2: The system maintains a condensed, fixed-size summary of the information from past data and updates this summary with each new piece of data.

Which approach is more suitable for this task, and why? Explain your reasoning in terms of computational cost as the stream of data grows over time.
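The cost difference the question is probing can be made concrete with a small sketch. The code below is illustrative only (it is not part of the exercise): it uses a running mean as a stand-in for the "context" each architecture maintains, with Approach 1 re-scanning the full history on every update and Approach 2 keeping a fixed-size summary (a count and a running total).

```python
def approach1_updates(stream):
    """Approach 1: re-process the entire history on each new item.
    O(n) work per update, so O(n^2) total over a stream of length n."""
    history, summaries = [], []
    for x in stream:
        history.append(x)
        summaries.append(sum(history) / len(history))  # full re-scan each step
    return summaries

def approach2_updates(stream):
    """Approach 2: maintain a condensed, fixed-size summary (count, total).
    O(1) work per update, so O(n) total over the whole stream."""
    count, total, summaries = 0, 0.0, []
    for x in stream:
        count += 1
        total += x                      # constant-time summary update
        summaries.append(total / count)
    return summaries

stream = [4, 8, 15, 16, 23, 42]
# Both approaches produce the same answers; they differ only in cost.
print(approach1_updates(stream))
print(approach2_updates(stream))
```

For this toy statistic the outputs match exactly, which highlights that the trade-off is purely computational: as the stream grows, Approach 1's per-update work grows with the history length, while Approach 2's stays constant.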

Updated 2025-10-07

Tags

  • Ch.2 Generative Models - Foundations of Large Language Models
  • Foundations of Large Language Models
  • Foundations of Large Language Models Course
  • Computing Sciences
  • Analysis in Bloom's Taxonomy
  • Cognitive Psychology
  • Psychology
  • Social Science
  • Empirical Science
  • Science