Improved Concept Embeddings for Learning Prerequisite Chains: Generating Poincaré embeddings from natural-language text corpora
To generate Poincaré embeddings from a natural-language text corpus, the author converted the text to a list of directed edges of the form (A, B), where B is a hypernym of A, as input to the Poincaré embeddings algorithm, following the method described by Dhingra et al. (2018). They constructed a co-occurrence graph that includes all pairs of words occurring within a fixed window of each other. Each edge in the graph carries a weight that is a function of f, the frequency of the co-occurrence pair in the corpus, and t, a downsampling constant.
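The edge-construction step above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the exact weighting function is not given in the note, so the sub-sampling-style weight below (downweighting very frequent pairs using the constant t) is an assumption, as are the function and parameter names.

```python
from collections import Counter

def cooccurrence_edges(tokens, window=5, t=1e-3):
    """Build weighted directed co-occurrence edges from a token list.

    Every ordered pair (u, v) with v appearing within `window` tokens
    after u becomes an edge. The weight formula is a hypothetical
    sub-sampling-style choice: the source only states that the weight
    is a function of the pair frequency f and a downsampling constant t.
    """
    counts = Counter()
    for i, u in enumerate(tokens):
        for v in tokens[i + 1 : i + 1 + window]:
            if u != v:
                counts[(u, v)] += 1
    total = sum(counts.values())
    # Downweight pairs whose relative frequency f/total exceeds t (assumption).
    return {
        pair: min(1.0, (t / (f / total)) ** 0.5)
        for pair, f in counts.items()
    }

edges = cooccurrence_edges("deep learning needs linear algebra".split(), window=2)
```

The resulting edge list (as (node, node) tuples) could then be fed to a Poincaré embedding implementation such as gensim's `PoincareModel`, which accepts an iterable of relation pairs.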
Tags
Data Science
Related
Improved Concept Embeddings for Learning Prerequisite Chains: Datasets
Improved Concept Embeddings for Learning Prerequisite Chains: Poincaré embeddings
Improved Concept Embeddings for Learning Prerequisite Chains: Model parameters
Improved Concept Embeddings for Learning Prerequisite Chains: Evaluation methods
Learn After
Reference for generating Poincaré embeddings from natural-language text corpora
Improved Concept Embeddings for Learning Prerequisite Chains: Co-occurrence
Improved Concept Embeddings for Learning Prerequisite Chains: Window sizes
Improved Concept Embeddings for Learning Prerequisite Chains: Downsampling constants