Root Mean Square (RMS) Layer Normalization

Root mean square (RMS) layer normalization is an alternative to standard layer normalization that focuses solely on re-scaling the input vector, entirely omitting the re-centering step. This streamlined normalization technique is widely implemented in large language models (LLMs), notably including the LLaMA series.
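The idea above can be sketched in a few lines of NumPy: divide each input vector by its root mean square (with a small epsilon for numerical stability) and apply a learned gain, with no mean-subtraction step. The function name, epsilon value, and gain initialization here are illustrative assumptions, not a specific library's API.

```python
import numpy as np

def rms_norm(x, weight, eps=1e-6):
    # RMS(x) = sqrt(mean(x_i^2) + eps), computed over the feature axis.
    rms = np.sqrt(np.mean(np.square(x), axis=-1, keepdims=True) + eps)
    # Re-scale only: no mean is subtracted, unlike standard LayerNorm.
    return (x / rms) * weight

# Example: after normalization the vector's RMS is (approximately) 1.
x = np.array([1.0, 2.0, 3.0, 4.0])
weight = np.ones_like(x)  # learnable gain, initialized to 1 here
y = rms_norm(x, weight)
```

Skipping the re-centering step saves one reduction over the feature dimension, which is part of why RMSNorm is attractive at LLM scale.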

Updated 2026-05-02
