Learn Before
Arbitrary Distribution Shift
When data distributions shift between training and testing in arbitrary, unconstrained ways, learning a robust classifier is fundamentally impossible. For instance, in a binary classification task like distinguishing cats from dogs, if the input distribution remains exactly the same but every label is deterministically flipped (y becomes 1 − y), an algorithm that observes only the unlabeled test inputs cannot distinguish this pathological scenario from one where the distribution never changed at all.
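A minimal sketch of this scenario, using an assumed toy task (a simple threshold classifier on one-dimensional Gaussian inputs, not from the source): the test inputs are drawn from exactly the same distribution as the training inputs, yet flipping every label turns a perfect classifier into a perfectly wrong one, and nothing in the unlabeled test data reveals that anything changed.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary task (hypothetical): x ~ N(0, 1), true label y = 1 if x > 0.
x_train = rng.normal(size=1000)
y_train = (x_train > 0).astype(int)

# A classifier "fit" on the training data (here, simply the true decision rule).
def predict(x):
    return (x > 0).astype(int)

# Test inputs drawn from the SAME distribution, but every label flipped: y -> 1 - y.
x_test = rng.normal(size=1000)
y_test_flipped = 1 - (x_test > 0).astype(int)

train_acc = (predict(x_train) == y_train).mean()
test_acc = (predict(x_test) == y_test_flipped).mean()
print(train_acc)  # 1.0: perfect on the original labels
print(test_acc)   # 0.0: perfectly wrong under the flipped labels
```

Since `x_train` and `x_test` come from the same distribution, no amount of inspecting the unlabeled test inputs can detect the flip; only labeled test data would reveal it.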
Tags
D2L
Dive into Deep Learning @ D2L
Related
Arbitrary Distribution Shift
Covariate Shift
Label Shift
Concept Shift
Nonstationary Distribution
Self-Driving Cars Example of Distribution Shift
Tank Detection Example of Distribution Shift
Face Detection Example of Distribution Shift
Web Search Example of Distribution Shift
Class Imbalance Example of Distribution Shift