Generation of Word Embeddings
- Each word gets a unique vector representation in a d-dimensional space, learned from a large text corpus.
- The advantage of word embeddings is that words that differ in surface form but are semantically similar end up with nearby vector representations. They serve as features for text classification tasks, replacing binary features such as the presence or frequency of a word.
- To perform hate speech detection on a passage or sentence, a single vector representation is formed from the word vectors of each word in the sentence or passage, for example by averaging them.
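The last step can be sketched as follows. This is a minimal illustration assuming a toy embedding table with made-up d=3 vectors and simple averaging; the function name `sentence_vector` is hypothetical, and a real system would load pretrained embeddings (e.g. word2vec or GloVe) instead:

```python
import numpy as np

# Hypothetical d=3 embedding table; real embeddings are learned from a corpus.
embeddings = {
    "this":   np.array([0.1, 0.3, 0.2]),
    "is":     np.array([0.0, 0.2, 0.1]),
    "hate":   np.array([0.9, -0.4, 0.7]),
    "speech": np.array([0.5, 0.1, 0.6]),
}

def sentence_vector(tokens, embeddings, d=3):
    """Average the word vectors of in-vocabulary tokens into one sentence vector."""
    vecs = [embeddings[t] for t in tokens if t in embeddings]
    if not vecs:
        return np.zeros(d)  # fall back when every token is out of vocabulary
    return np.mean(vecs, axis=0)

vec = sentence_vector(["this", "is", "hate", "speech"], embeddings)
print(vec.shape)  # (3,)
```

The resulting fixed-length vector can then be fed to any standard classifier, regardless of how many words the original sentence contained.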
Updated 2022-08-14
Tags
Data Science