Implementing Freezing Layers When Using a Pre-trained Model for Transfer Deep Learning
Deep learning libraries expose different parameters for freezing selected layers of a network so that only the remaining layers are updated during training. For example, in Keras you set a layer's trainable attribute to False, while some other libraries use a flag such as freeze = 1.
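As a concrete illustration of the Keras approach, here is a minimal sketch of freezing a pre-trained base and training only a new classification head. It assumes TensorFlow/Keras is installed; the choice of MobileNetV2 and the 15-class output size are illustrative, not prescribed by the text (weights=None is used here so the example runs without downloading pre-trained weights; in practice you would pass weights="imagenet").

```python
# Hedged sketch: freeze a pre-trained base in Keras via trainable = False.
# Model choice and class count are illustrative assumptions.
from tensorflow import keras

base = keras.applications.MobileNetV2(
    weights=None,              # in practice: weights="imagenet"
    include_top=False,
    input_shape=(96, 96, 3),
)
base.trainable = False         # freeze every layer of the pre-trained base

model = keras.Sequential([
    base,
    keras.layers.GlobalAveragePooling2D(),
    keras.layers.Dense(15, activation="softmax"),  # new task-specific head
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# Only the new Dense head's weights (kernel + bias) remain trainable.
print(len(model.trainable_weights))
```

Setting trainable = False on the base model propagates to all of its layers, so a subsequent model.fit call updates only the head; to fine-tune the last few base layers instead, you would set trainable = False layer by layer.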