Concept

Number of Layers to Freeze When Using a Pre-trained Model for Transfer Deep Learning

How many layers to freeze mainly depends on the size of your dataset, because every layer you retrain adds parameters that must be fit to your data. If your dataset is:

  • Small, freeze the pre-trained layers and only replace the last layer.
  • Large, replace the last several layers with new layers and train them on your data.
  • Extremely large, and you have a lot of computational power, use the pre-trained network only as a weight initialization and train all the weights in the whole network on your data.
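The small-dataset case above can be sketched in PyTorch. The network here is a stand-in for a real pre-trained model (layer sizes and the 5-class new task are assumptions for illustration); in practice you would load pre-trained weights, e.g. from torchvision:

```python
import torch.nn as nn

# Stand-in for a pre-trained network (hypothetical; in practice load a
# model with pre-trained weights, e.g. from torchvision).
model = nn.Sequential(
    nn.Linear(32, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 10),  # original classification head
)

# Small dataset: freeze every pre-trained parameter ...
for param in model.parameters():
    param.requires_grad = False

# ... and replace only the last layer with a new, trainable head.
model[-1] = nn.Linear(64, 5)  # new task with 5 classes (assumption)

trainable = sum(1 for p in model.parameters() if p.requires_grad)
frozen = sum(1 for p in model.parameters() if not p.requires_grad)
print(trainable, frozen)  # → 2 4 (new head's weight+bias vs. frozen tensors)
```

Only the new head's parameters reach the optimizer, so training is fast and the pre-trained features are preserved. For the large-dataset case you would replace and unfreeze several of the last layers instead of just one.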

Updated 2021-04-21

Tags

Data Science