Concept

ReLU (Rectified Linear Unit)

The Rectified Linear Unit (ReLU) is a common choice for the activation function $\sigma(\cdot)$ in the hidden layers of neural networks. It outputs the positive part of its argument. Applied element-wise to an input vector $\mathbf{h}$, it is defined as $\sigma_{\mathrm{relu}}(\mathbf{h}) = \max(\mathbf{0}, \mathbf{h})$.
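To make the definition concrete, here is a minimal NumPy sketch; the function name `relu` and the sample vector are illustrative, not from the original note.

```python
import numpy as np

def relu(h: np.ndarray) -> np.ndarray:
    """Element-wise ReLU: keep positive entries, set the rest to zero."""
    return np.maximum(0.0, h)

# Illustrative hidden-state vector (hypothetical values)
h = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(h))  # -> [0.  0.  0.  1.5 3. ]
```

Negative components are clipped to zero while positive components pass through unchanged, which is exactly the behaviour described by $\max(\mathbf{0}, \mathbf{h})$.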


Updated 2026-04-21

Tags

Data Science

Foundations of Large Language Models Course

Computing Sciences

Ch.2 Generative Models - Foundations of Large Language Models

Foundations of Large Language Models

Related