Learn Before
Analysis of a Custom Activation Unit
Based on the implementation described in the case study below, identify the specific name of the activation unit and justify your identification by explaining how the components and their arrangement fit the definition of that unit.
Tags
Ch.2 Generative Models - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Analysis in Bloom's Taxonomy
Cognitive Psychology
Psychology
Social Science
Empirical Science
Science
Related
GeGLU (GELU-based Gated Linear Unit) Formula
Applications of GeGLU in Large Language Models
An activation function is constructed by taking an input, applying two separate linear transformations to it, and then combining the results. One transformed output is passed through a non-linear 'gating' function, and the result is then multiplied element-wise with the other transformed output. For this entire structure to be correctly identified as a GeGLU, what must be true about the gating function?
Analyzing a Gating Mechanism
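The gated structure described in the question above can be sketched in pure Python. This is an illustrative sketch only, not the question's answer key: the function and parameter names (`geglu`, `W`, `b`, `V`, `c`) are assumptions, and real implementations use tensor libraries rather than Python lists. It shows the general pattern of two separate linear transformations of the same input, one passed through a non-linear gate (here the exact GELU) and multiplied element-wise with the other.

```python
import math

def gelu(x):
    # Exact GELU: 0.5 * x * (1 + erf(x / sqrt(2)))
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))

def geglu(x, W, b, V, c):
    """Gated linear unit with a GELU gate (illustrative names).

    x      : input vector
    W, b   : weights (list of columns) and bias for the gate path
    V, c   : weights (list of columns) and bias for the value path
    """
    # Two separate linear transformations of the same input.
    gate = [sum(xi * wij for xi, wij in zip(x, col)) + bj
            for col, bj in zip(W, b)]
    value = [sum(xi * vij for xi, vij in zip(x, col)) + cj
             for col, cj in zip(V, c)]
    # The gate path goes through the non-linearity; the result is
    # combined with the value path by element-wise multiplication.
    return [gelu(g) * v for g, v in zip(gate, value)]

# Usage: with 1-D identity weights, the output is gelu(x) * 2x.
out = geglu([1.0], [[1.0]], [0.0], [[2.0]], [0.0])
```

Swapping the GELU gate for a sigmoid would yield the original GLU, and for a Swish/SiLU would yield SwiGLU; the GELU gate is what makes this specific arrangement a GeGLU.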