Number of Layers to Freeze When Using a Pre-trained Model for Transfer Deep Learning
The number of layers to freeze depends mainly on the size of your dataset, because every new or unfrozen layer adds parameters that must be trained. If your dataset is:
- Small: freeze the entire pre-trained network and replace only the last layer with a new one sized for your task.
- Large: replace the last several layers with new layers and train those, keeping the earlier layers frozen.
- Extremely large (and you have plenty of computational power): use the pre-trained network only for weight initialization and fine-tune all the weights in the whole network on your data.