Concept

L1 Regularization

Unlike $\ell_2$ regularization, which distributes weight across many features, $\ell_1$ regularization penalizes the absolute values of the weights. This leads to models that concentrate weight on a small set of features by driving the other weights exactly to zero, making it an effective method for feature selection. If a model relies on only a few features, it may eliminate the need to collect, store, or transmit data for the dropped features. Linear models with $\ell_1$ regularization are popularly known as lasso regression.
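The sparsity effect can be seen in a minimal NumPy sketch. This fits lasso regression via proximal gradient descent (ISTA) with soft-thresholding; the function names, hyperparameters, and synthetic data below are illustrative assumptions, not from the source:

```python
import numpy as np

# Synthetic regression problem: only the first 2 of 10 features matter.
rng = np.random.default_rng(0)
n, d = 100, 10
X = rng.normal(size=(n, d))
true_w = np.zeros(d)
true_w[:2] = [3.0, -2.0]
y = X @ true_w + 0.01 * rng.normal(size=n)

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1: shrinks toward zero, clipping at zero.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def lasso_ista(X, y, lam=0.5, iters=500):
    # Illustrative ISTA loop for (1/2n)||Xw - y||^2 + lam * ||w||_1.
    n, d = X.shape
    lr = n / np.linalg.norm(X, 2) ** 2   # step size from the Lipschitz constant
    w = np.zeros(d)
    for _ in range(iters):
        grad = X.T @ (X @ w - y) / n
        w = soft_threshold(w - lr * grad, lr * lam)
    return w

w = lasso_ista(X, y)
# Most weights are driven exactly to zero; the informative ones survive.
print("nonzero feature indices:", np.nonzero(np.abs(w) > 1e-6)[0])
```

Note that the soft-thresholding step sets small weights to exactly zero, which is what distinguishes the $\ell_1$ penalty from the $\ell_2$ penalty's uniform shrinkage.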


Updated 2026-05-03

Tags

Data Science

D2L

Dive into Deep Learning @ D2L

Learn After