Learn Before
Concept
Parameter Tying
Parameter tying is a regularization technique that exploits prior knowledge that two models, A and B, are related. For example, both models may perform similar classification tasks on related data. Based on this prior knowledge, we can assume that the parameters of model A and model B should be close to each other, and we can encode this assumption as a regularization term, typically a penalty on the squared L2 distance between the two parameter vectors, Ω(w_A, w_B) = ||w_A − w_B||², added to the training objective.
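The sketch below illustrates one way this could look in PyTorch: two classifiers are trained on their own losses plus an L2 penalty on the difference between their corresponding parameters. All model shapes, the toy data, and the penalty weight are illustrative assumptions, not part of the original text.

```python
# Minimal sketch of parameter tying: two related classifiers whose
# parameters are pulled toward each other by an L2 penalty.
import torch
import torch.nn as nn

model_a = nn.Linear(10, 2)   # model A: classifier for task A (assumed shapes)
model_b = nn.Linear(10, 2)   # model B: classifier for a related task B

def tying_penalty(m_a, m_b):
    """Squared L2 distance between corresponding parameters of the two models."""
    return sum(((p_a - p_b) ** 2).sum()
               for p_a, p_b in zip(m_a.parameters(), m_b.parameters()))

# Toy data for both tasks (illustrative only).
x_a, y_a = torch.randn(32, 10), torch.randint(0, 2, (32,))
x_b, y_b = torch.randn(32, 10), torch.randint(0, 2, (32,))

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(
    list(model_a.parameters()) + list(model_b.parameters()), lr=0.1)
lam = 0.01  # strength of the tying regularizer

for _ in range(100):
    optimizer.zero_grad()
    # Each model is fit to its own task, while the penalty keeps
    # their parameters similar.
    loss = (criterion(model_a(x_a), y_a)
            + criterion(model_b(x_b), y_b)
            + lam * tying_penalty(model_a, model_b))
    loss.backward()
    optimizer.step()
```

Increasing `lam` ties the two parameter sets more tightly; `lam = 0` recovers independent training of the two models.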
Updated 2021-06-24
Tags
Data Science
Related
Why does regularization prevent overfitting?
Popular Regularization Techniques in Deep Learning
Human Level Performance: Based on the evidence below, which two of the following four options seem the most promising to try?
Local Constancy and Smoothness Priors
Parameter Sharing
Parameter Tying
L1 regularization and L2 regularization
MTL as a Regularizer
Parameter Penalties in Classical Regularization