Learn Before
  • Popular Regularization Techniques in Deep Learning

Tangent Distance Algorithm

  • An early attempt to take advantage of the manifold hypothesis
  • A non-parametric nearest-neighbor algorithm whose metric is derived from the manifolds near which probability concentrates
  • Assumes that examples on the same manifold all share the same category
  • The classifier should be invariant to the local factors of variation that correspond to movement on the manifold, so the nearest-neighbor distance between two points is defined as the distance between the manifolds they belong to
  • A computationally cheap local alternative is to approximate each manifold by its tangent plane at a point and measure the distance between the two tangent planes, or between a tangent plane and a point, by solving a low-dimensional linear system (see the sketch after this list)
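The two-sided case reduces to a small least-squares problem: minimize ||(x1 + T1·a) − (x2 + T2·b)|| over the tangent-plane coefficients a and b. Below is a minimal NumPy sketch under the assumption that tangent vectors (e.g., derived from known invariances such as small translations or rotations of an image) are already available as the columns of T1 and T2; the function names are illustrative, not from the original source:

```python
import numpy as np

def tangent_distance(x1, T1, x2, T2):
    """Two-sided tangent distance between points x1 and x2.

    T1, T2 are (d, k) matrices whose columns span the tangent planes
    of the class manifolds at x1 and x2. We minimize
        ||(x1 + T1 @ a) - (x2 + T2 @ b)||
    over the coefficients (a, b), a 2k-dimensional linear
    least-squares problem -- cheap when k << d.
    """
    T = np.hstack([T1, -T2])        # (d, 2k) combined tangent basis
    rhs = x2 - x1                   # displacement to explain
    coeffs, *_ = np.linalg.lstsq(T, rhs, rcond=None)
    return np.linalg.norm(T @ coeffs - rhs)  # unexplained residual

def nn_classify(x, Tx, train_X, train_T, train_y):
    """1-nearest-neighbor prediction under tangent distance (sketch)."""
    dists = [tangent_distance(x, Tx, xi, Ti)
             for xi, Ti in zip(train_X, train_T)]
    return train_y[int(np.argmin(dists))]
```

Passing a (d, 0) matrix for T2 recovers the one-sided point-to-tangent-plane distance mentioned in the last bullet, since only x1's tangent plane is then allowed to move.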



Tags

Data Science

Related
  • Data Augmentation in Deep Learning
  • Early Stopping in Deep Learning
  • Dropout Regularization in Deep Learning
  • L2 Regularization (Weight Decay) in Deep Learning
  • Which of these techniques are useful for reducing variance (reducing overfitting)?
  • L1 Regularization in Deep Learning
  • ElasticNet Regression
  • If your Neural Network model seems to have high variance, which of the following would be promising things to try?
  • Regularization in ML and DL
  • Bagging in Deep Learning
  • Dropout in Deep Learning
  • Normalization of Data
  • Tangent Distance Algorithm
  • Tangent Propagation Algorithm
  • Manifold Tangent Classifier
  • Boosting in Deep Learning
  • Appropriate Regularization/Representation