Short Answer

Interpreting Training Metrics

A machine learning practitioner is debugging a training process. They notice that the error value printed after every single data point fluctuates wildly, but the error value printed at the end of each complete pass through the dataset is steadily decreasing. Explain why these two error values behave differently, using the correct terminology for each.
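The scenario above can be reproduced with a minimal sketch: stochastic gradient descent on a tiny synthetic regression problem (the data, learning rate, and epoch count here are illustrative assumptions, not from the question). The loss printed per data point jumps around because each sample carries its own noise, while the per-epoch average smooths that noise out and trends downward.

```python
import random

random.seed(0)

# Hypothetical synthetic data: y = 2x + Gaussian noise.
data = [(x, 2.0 * x + random.gauss(0, 0.5)) for x in [i / 10 for i in range(1, 21)]]

w = 0.0          # single weight, fit by SGD
lr = 0.05        # illustrative learning rate
epoch_means = []

for epoch in range(5):
    random.shuffle(data)
    sample_losses = []
    for x, y in data:
        pred = w * x
        loss = (pred - y) ** 2            # per-sample loss: fluctuates wildly
        grad = 2 * (pred - y) * x         # gradient of squared error w.r.t. w
        w -= lr * grad                    # one SGD update per data point
        sample_losses.append(loss)
    epoch_mean = sum(sample_losses) / len(sample_losses)
    epoch_means.append(epoch_mean)        # per-epoch mean loss: decreases steadily
    print(f"epoch {epoch}: mean loss {epoch_mean:.4f}")
```

Printing `sample_losses` within any epoch shows large jumps between consecutive data points, while `epoch_means` declines monotonically in this setup, mirroring the two behaviors the question describes.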


Updated 2025-10-05


Tags

Data Science

Ch.2 Generative Models - Foundations of Large Language Models

Foundations of Large Language Models

Foundations of Large Language Models Course

Computing Sciences

Analysis in Bloom's Taxonomy

Cognitive Psychology

Psychology

Social Science

Empirical Science

Science