Learn Before
Concept
Word Embeddings
In a word, word embeddings are numerical representations of text. When a computer processes different words, it initially treats them as atomic symbols: "cat" might become symbol a, "dog" symbol b. But how can the computer find the relationships between them? That is what word embeddings provide: they map each word to a vector of numbers, so that related words end up with similar vectors.
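The idea can be sketched with a few toy vectors. This is a minimal illustration, not a trained model: the embedding values below are made up for the example, and cosine similarity is used as the standard way to compare two word vectors.

```python
import numpy as np

# Hypothetical 4-dimensional embeddings, hand-picked for illustration only.
# "cat" and "dog" are given similar vectors; "car" is given a different one.
embeddings = {
    "cat": np.array([0.9, 0.8, 0.1, 0.0]),
    "dog": np.array([0.8, 0.9, 0.2, 0.0]),
    "car": np.array([0.0, 0.1, 0.9, 0.8]),
}

def cosine_similarity(a, b):
    # Cosine of the angle between two vectors: close to 1.0 means the
    # vectors point in nearly the same direction (i.e. related words).
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine_similarity(embeddings["cat"], embeddings["dog"]))  # high
print(cosine_similarity(embeddings["cat"], embeddings["car"]))  # low
```

In a real system these vectors are learned from large text corpora (e.g. by Word2Vec or GloVe), so the geometry of the vector space reflects how words are actually used together.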
Updated 2021-10-24
Tags
Data Science