Concept

Implementing Freezing Layers When Using a Pre-trained Model for Transfer Deep Learning

Different libraries expose different parameters for freezing some layers of a network so that only the remaining layers are trained. For example, in Keras you set a layer's trainable attribute to False, while some other libraries use a flag such as freeze = 1.


Updated 2021-04-21

Tags

Data Science