Vectorized Minibatch Softmax Regression
To maximize computational efficiency, the forward pass of a softmax regression model is typically vectorized across minibatches. For a minibatch of inputs $\mathbf{X} \in \mathbb{R}^{n \times d}$ containing $n$ examples with $d$ features, and parameters consisting of weights $\mathbf{W} \in \mathbb{R}^{d \times q}$ and biases $\mathbf{b} \in \mathbb{R}^{1 \times q}$, the unnormalized logits are computed using the affine transformation $\mathbf{O} = \mathbf{X}\mathbf{W} + \mathbf{b}$. The softmax function is then applied row-wise to $\mathbf{O}$ to yield the normalized class probabilities $\hat{\mathbf{Y}} = \mathrm{softmax}(\mathbf{O})$ for the entire batch simultaneously.
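The vectorized forward pass above can be sketched in NumPy as follows. The shapes ($n = 4$ examples, $d = 5$ features, $q = 3$ classes) and the random inputs are illustrative assumptions, not values from the source; the row-wise max subtraction is the standard numerical-stability trick and does not change the result.

```python
import numpy as np

# Hypothetical shapes chosen for illustration: n examples, d features, q classes.
n, d, q = 4, 5, 3
rng = np.random.default_rng(0)
X = rng.normal(size=(n, d))      # minibatch of inputs
W = rng.normal(size=(d, q))      # weight matrix
b = np.zeros((1, q))             # bias row vector, broadcast across the batch

O = X @ W + b                    # affine transformation: unnormalized logits, shape (n, q)

# Row-wise softmax; subtracting each row's max before exponentiating
# avoids overflow without changing the resulting probabilities.
O_shifted = O - O.max(axis=1, keepdims=True)
expO = np.exp(O_shifted)
Y_hat = expO / expO.sum(axis=1, keepdims=True)

# Each row of Y_hat is a probability distribution over the q classes.
print(Y_hat.sum(axis=1))
```

Because the whole minibatch is processed as one matrix product, the per-example loop disappears and the computation maps directly onto optimized BLAS routines.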
Tags
D2L
Dive into Deep Learning @ D2L
Related
Softmax Function Definition
A vector of raw, unnormalized scores
[1000, 1002, 999] is passed as input to a computational function that converts these scores into a probability distribution. A common technique to prevent numerical errors is to first subtract the maximum value of the vector from every element before applying the main transformation (exponentiation). Why is this subtraction step crucial for handling large input values?
Calculating Output Probabilities from Model Scores
A model outputs the following raw, unnormalized scores for three classes:
[2.0, 1.0, 0.1]. If a constant value of 5.0 is added to each of these scores, resulting in a new score vector of [7.0, 6.0, 5.1], how will the resulting probability distribution calculated by the function that converts these scores to probabilities change?
Order Preservation of the Softmax Function
Energy-Based View of Softmax
Output Layer of Softmax Regression
Partition Function in Softmax
Vectorized Minibatch Softmax Regression
What is Softmax Regression and How is it Related to Logistic Regression?
The Softmax Function, Simplified
Implementation of Softmax Regression Using Numpy
Implementation of Softmax Regression Using Tensorflow
Cross-Entropy Loss for Softmax Regression