Learn Before
A sub-layer in a neural network processes an input tensor using a specific architectural pattern. The process involves three key operations: 1) applying the sub-layer's primary function (e.g., self-attention), 2) applying a normalization function, and 3) adding the original input tensor to the result of the primary function (a residual connection). Arrange these three operations in the correct sequence that corresponds to the formula: output = LNorm(F(input) + input).
Tags
Ch.2 Generative Models - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Analysis in Bloom's Taxonomy
Cognitive Psychology
Psychology
Social Science
Empirical Science
Science
Related
A sub-layer in a neural network processes an input tensor using a specific architectural pattern. The process involves three key operations: 1) applying the sub-layer's primary function (e.g., self-attention), 2) applying a normalization function, and 3) adding the original input tensor to the result of the primary function (a residual connection). Arrange these three operations in the correct sequence that corresponds to the formula: output = LNorm(F(input) + input).
Analyzing a Sub-Layer Implementation
A developer is implementing a sub-layer (e.g., self-attention) within a Transformer block. They need to apply the sub-layer's function F, a residual connection (adding the original input), and a layer normalization LNorm operation. Which of the following expressions correctly represents the post-norm architectural pattern?
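The post-norm pattern above, output = LNorm(F(input) + input), can be sketched in plain NumPy. This is a minimal illustration, not a framework implementation: layer_norm and post_norm_sublayer are hypothetical helper names, and the simple linear map stands in for a real sub-layer function F such as self-attention.

```python
import numpy as np

def layer_norm(x, eps=1e-5):
    # Normalize over the last axis to zero mean and unit variance
    # (omits the learned scale/shift parameters of a full LayerNorm).
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

def post_norm_sublayer(x, F):
    # Post-norm order: 1) apply F, 2) add the residual x, 3) normalize.
    return layer_norm(F(x) + x)

# Example: a random linear map as a stand-in for self-attention.
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 4))
x = rng.standard_normal((2, 4))

out = post_norm_sublayer(x, lambda t: t @ W)
```

Note that the residual addition happens inside the normalization, which is what distinguishes post-norm from the pre-norm variant output = input + F(LNorm(input)).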