GeGLU (GELU-based Gated Linear Unit)

GeGLU is a specific variant within the Gated Linear Unit (GLU) family of activation functions. It is formed when the internal non-linear activation function, σ(·), in the general GLU structure is defined as the Gaussian Error Linear Unit (GELU) function.
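Concretely, the GLU structure computes σ(xW + b) ⊗ (xV + c), so GeGLU(x) = GELU(xW + b) ⊗ (xV + c), where ⊗ is element-wise multiplication. A minimal NumPy sketch of this definition (the parameter names W, V, b, c are illustrative, and GELU is written with its common tanh approximation):

```python
import numpy as np

def gelu(x):
    # Tanh approximation of GELU (Hendrycks & Gimpel, 2016):
    # 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

def geglu(x, W, V, b, c):
    # GeGLU(x) = GELU(xW + b) * (xV + c)
    # The GELU branch acts as a data-dependent gate on the linear branch.
    return gelu(x @ W + b) * (x @ V + c)
```

In Transformer feed-forward layers, GeGLU is typically used in place of the first linear projection plus activation, with the two projections W and V mapping to the same hidden width so the element-wise product is well defined (Shazeer, 2020, "GLU Variants Improve Transformer").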

Updated 2026-04-21

Tags

Ch.2 Generative Models - Foundations of Large Language Models

Foundations of Large Language Models

Foundations of Large Language Models Course

Computing Sciences