Concept

Mathematical Notation in Neural Networks

In neural networks, specific mathematical notation is used to represent different components. Key variables include:

  • X: The input features.
  • W: The weight parameter. Its dimensions vary; it can be a scalar (W ∈ ℝ) in simple models or a matrix in multi-neuron layers.
  • b: The bias parameter. It can be a scalar (b ∈ ℝ) or a vector.
  • a: The activation value. For example, a^{[l]}_i denotes the activation of the i-th neuron in the l-th layer.
  • ŷ: The predicted output value.
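A minimal single-neuron forward pass can make the notation concrete. The sketch below (assuming a sigmoid activation and illustrative values for X, W, and b, none of which come from the card itself) shows how the variables relate:

```python
import numpy as np

def sigmoid(z):
    # Sigmoid activation: squashes the pre-activation into (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

X = np.array([0.5, -1.2, 3.0])  # input features (illustrative values)
W = np.array([0.1, 0.4, -0.2])  # weight vector, here W ∈ R^3
b = 0.05                        # scalar bias, b ∈ R

z = W @ X + b    # pre-activation (weighted sum plus bias)
a = sigmoid(z)   # activation value a
y_hat = a        # for a single-neuron model, ŷ is the activation itself
```

In a multi-layer network, W becomes a matrix per layer and a^{[l]} a vector, but the pattern of "weighted sum, add bias, apply activation" is the same.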
Updated 2025-10-12

Tags

Data Science
