Concept

Parameter Sharing

While a parameter norm penalty is one way to regularize parameters to be close to one another, a more popular approach is to use constraints: to force sets of parameters to be equal. This method of regularization is often referred to as parameter sharing, because we interpret the various models or model components as sharing a unique set of parameters. A significant advantage of parameter sharing over regularizing the parameters to be close (via a norm penalty) is that only a subset of the parameters (the unique set) needs to be stored in memory. In certain models, such as the convolutional neural network, this can lead to a significant reduction in the memory footprint of the model.

- Parameter sharing: force sets of parameters to be equal

- Main application: Convolutional Neural Network (CNN)

- Advantage: significant reduction in the memory footprint of the model
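The memory advantage can be made concrete with a minimal sketch (a hypothetical example, not from the source): a single small convolution kernel is reused at every spatial position of an input image, so only the kernel's few weights are stored, whereas an unconstrained dense mapping would need a separate weight for every (input pixel, output pixel) pair.

```python
import numpy as np

def conv2d_valid(image, kernel):
    """Slide one shared kernel over the image ('valid' padding):
    the same 9 weights are applied at every position."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

rng = np.random.default_rng(0)
image = rng.standard_normal((28, 28))
kernel = rng.standard_normal((3, 3))   # the unique, shared parameter set

out = conv2d_valid(image, kernel)      # output is 26 x 26

shared_params = kernel.size            # 9 parameters, reused everywhere
dense_params = image.size * out.size   # one weight per input-output pair
print(shared_params, dense_params)     # 9 vs 529984
```

Here the constraint "every position uses the same weights" is what turns hundreds of thousands of nominal parameters into a single set of nine, which is exactly the CNN case mentioned above.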


Updated 2021-07-15

Tags

Data Science