Learn Before
Computational Cost of Fully Connected Layers
For any fully connected neural network layer with $d$ inputs and $q$ outputs, the parametrization and computational cost is $\mathcal{O}(d \times q)$; when $d$ and $q$ are comparable, this grows quadratically with the layer width. This high cost can become prohibitive in practical deep learning applications.
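A minimal sketch (not part of the card) of how the parameter count of a fully connected layer scales with its width; the helper `fc_params` is a hypothetical name introduced here for illustration:

```python
# Sketch: parameter count of a fully connected layer with d inputs, q outputs.
def fc_params(d, q, bias=True):
    """Weight matrix is d x q, plus q bias terms if present."""
    return d * q + (q if bias else 0)

# Doubling both the input and output width roughly quadruples the cost:
print(fc_params(1000, 1000))  # 1001000
print(fc_params(2000, 2000))  # 4002000
```

This quadratic growth in layer width is what makes fully connected layers prohibitive for wide inputs such as images.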
Tags
D2L
Dive into Deep Learning @ D2L
Related
Sample Calculation of Softmax Output Layer
Example of a SoftMax activation transformation
Maximum Probability Decision Rule
Consider a 4-class classification problem where the final layer of a model produces the following pre-activation scores for a single input:
[1.0, 2.0, 1.5, 5.0]. The model then applies an activation function that exponentiates each score and normalizes the results to produce a probability distribution. Without performing the full calculation, which of the following statements best describes the resulting probability distribution?
Calculating an Output Probability
Classifier Output Analysis
Computational Cost of Fully Connected Layers
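The quiz question above hinges on how exponentiation amplifies the largest pre-activation score; a minimal softmax sketch (not part of the card) makes this concrete:

```python
import math

def softmax(scores):
    """Exponentiate each score, then normalize so the results sum to 1."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# The pre-activation scores from the quiz question:
probs = softmax([1.0, 2.0, 1.5, 5.0])
print(probs)  # the class with score 5.0 dominates, taking over 90% of the mass
```

Because exp(5.0) is far larger than the other exponentiated scores, the resulting distribution is heavily concentrated on the last class even though 5.0 is only a few units above the rest.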