Learn Before
Relation
List of Common Hyperparameters in Deep Learning
Hyperparameters related to neural network structure:
- Number of hidden layers (depth)
- Number of hidden units per layer (width)
- Dropout method and rate
- Activation function for each layer
- Weight initialization scheme
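The structural hyperparameters above are often collected into a single configuration object before building the network. A minimal sketch (the names below are illustrative, not from the source):

```python
# Hypothetical config dict grouping the structure-related hyperparameters.
structure_hparams = {
    "num_hidden_layers": 3,         # depth of the network
    "hidden_units": [128, 64, 32],  # width of each hidden layer
    "dropout_rate": 0.5,            # fraction of units dropped during training
    "activation": "relu",           # activation function for hidden layers
    "weight_init": "he_normal",     # weight initialization scheme
}

# Sanity check: one width entry per hidden layer.
assert len(structure_hparams["hidden_units"]) == structure_hparams["num_hidden_layers"]
```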
Hyperparameters related to training algorithm:
- Learning rate
- Momentum parameter
- Number of gradient-descent iterations
- Mini-batch size
- Optimizer algorithm
- Learning-rate decay
- Regularization rate
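The training-algorithm hyperparameters can be sketched the same way, together with one common way learning-rate decay is applied (a 1/t decay schedule; names and values below are illustrative assumptions, not from the source):

```python
# Hypothetical training hyperparameters.
train_hparams = {
    "learning_rate": 0.01,      # initial step size
    "momentum": 0.9,            # momentum parameter
    "num_iterations": 10_000,   # number of gradient-descent iterations
    "mini_batch_size": 64,      # examples per gradient update
    "optimizer": "sgd_momentum",
    "lr_decay": 0.001,          # learning-rate decay constant
    "reg_rate": 1e-4,           # regularization rate (e.g. L2 lambda)
}

def decayed_lr(epoch: int, lr0: float, decay: float) -> float:
    """1/t learning-rate decay: lr = lr0 / (1 + decay * epoch)."""
    return lr0 / (1 + decay * epoch)

print(decayed_lr(0, train_hparams["learning_rate"], train_hparams["lr_decay"]))     # 0.01
print(decayed_lr(1000, train_hparams["learning_rate"], train_hparams["lr_decay"]))  # 0.005
```

With this schedule the learning rate halves after `1 / decay` epochs, which is why the decay constant is tuned jointly with the initial learning rate.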
Updated 2021-10-23
Tags
Data Science
Learn After
Depth and Width for Neural Networks
Dropout
Neural Network Learning Rate
Epochs in Machine Learning
Activation Functions in Neural Networks
Deep Learning Optimizer Algorithms
Batch Normalization in Deep Learning
Deep Learning Weight Initialization
Hyperparameters Tuning Methods in Deep Learning
Difference between Model Parameter and Model Hyperparameter
Regularization Constant