Kernel Based Embeddings
Kernel embeddings give learning machines strong representational power, and with it strong empirical performance. To leverage this, Lopez-Paz et al. introduced kernel-based embeddings for feature construction in pairwise causal discovery.
Starting from a dataset of empirical distributions, a kernel mean embedding projects all of those distributions into the same Reproducing Kernel Hilbert Space (RKHS). To obtain a homogeneous, low-dimensional embedding, Lopez-Paz et al. use random cosine features that approximate the empirical kernel mean embedding in low dimension:

$$\mu_{k,m}(\hat{P}) = \frac{1}{n}\sum_{i=1}^{n}\left(\sqrt{\tfrac{2}{m}}\,\cos(w_j^\top x_i + b_j)\right)_{j=1}^{m},$$

where $(w_j, b_j)$ are the kernel parameters, sampled i.i.d. in $p(w) \times \mathcal{U}[0, 2\pi]$, their number $m$ defines the number of dimensions of the output space, $\hat{P} = \frac{1}{n}\sum_{i=1}^{n}\delta_{x_i}$ is the empirical distribution, and $p$ is the positive and integrable Fourier transform of the chosen shift-invariant kernel $k$, with integral equal to 1 in this case, so that $p$ is a probability density.
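As a rough sketch of the idea, the following Python snippet embeds an empirical distribution with random cosine features approximating the Gaussian kernel mean embedding. The function name, the `gamma` bandwidth parameter, and the choice of the Gaussian kernel are illustrative assumptions, not the exact setup of Lopez-Paz et al.

```python
import numpy as np

def rff_mean_embedding(sample, num_features=100, gamma=1.0, seed=None):
    """Illustrative low-dimensional embedding of an empirical distribution
    via random cosine (Fourier) features for the Gaussian kernel
    k(x, y) = exp(-gamma * ||x - y||^2). Names and defaults are assumptions."""
    rng = np.random.default_rng(seed)
    n, d = sample.shape
    # w_j sampled i.i.d. from p(w), the (normalized) Fourier transform of k;
    # for the Gaussian kernel this is a Gaussian density
    w = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, num_features))
    # b_j sampled i.i.d. uniformly in [0, 2*pi]
    b = rng.uniform(0.0, 2.0 * np.pi, size=num_features)
    # Feature map z(x) = sqrt(2/m) * cos(w^T x + b); averaging z over the
    # sample approximates the kernel mean embedding of the distribution
    features = np.sqrt(2.0 / num_features) * np.cos(sample @ w + b)
    return features.mean(axis=0)

# Usage: embed a 2-D sample into a fixed 200-dimensional feature space
x = np.random.default_rng(0).normal(size=(500, 2))
mu = rff_mean_embedding(x, num_features=200, seed=0)
print(mu.shape)  # (200,)
```

Because every distribution, whatever its sample size, maps to the same fixed-length vector, these embeddings can be fed directly to a standard classifier for the causal discovery step.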
Tags
Data Science