Learn Before
Core Training Principle of XLM
Describe the key characteristic of the training data used in the Cross-Lingual Language Model (XLM) approach, and explain the main advantage this provides for building a multilingual model.
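The characteristic the prompt points to is XLM's use of parallel (bilingual) sentence pairs: in the Translation Language Modeling (TLM) objective, a sentence and its translation are packed into one input sequence, so a word masked in one language can be predicted from context in the other. The sketch below illustrates that packing; it is a minimal toy example, not XLM's actual code, and the `pack_translation_pair` helper, separator token, and sample sentences are all hypothetical:

```python
import random

def pack_translation_pair(src_tokens, tgt_tokens,
                          mask_token="[MASK]", mask_prob=0.15, seed=0):
    """Pack a translation pair into one TLM-style input sequence."""
    rng = random.Random(seed)
    # Concatenate source and target with separator tokens, as in TLM.
    sequence = ["[/s]"] + src_tokens + ["[/s]"] + tgt_tokens + ["[/s]"]
    # Language embeddings distinguish the two halves of the packed input.
    lang_ids = [0] * (len(src_tokens) + 2) + [1] * (len(tgt_tokens) + 1)
    # Position indices restart at 0 for the target sentence (per XLM's TLM).
    positions = list(range(len(src_tokens) + 2)) + list(range(len(tgt_tokens) + 1))
    # Randomly mask tokens in BOTH languages; labels keep the originals,
    # so the model can recover a masked word from either language's context.
    masked, labels = [], []
    for tok in sequence:
        if tok != "[/s]" and rng.random() < mask_prob:
            masked.append(mask_token)
            labels.append(tok)
        else:
            masked.append(tok)
            labels.append(None)
    return masked, labels, lang_ids, positions

masked, labels, lang_ids, positions = pack_translation_pair(
    ["the", "cat", "sat"], ["le", "chat", "était", "assis"])
```

Because both languages share one sequence and one set of model parameters, the cross-attention between the pair is what aligns representations across languages, which is the main advantage of training on parallel data.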
Tags
Ch.1 Pre-training - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Comprehension in Revised Bloom's Taxonomy
Cognitive Psychology
Psychology
Social Science
Empirical Science
Science
Related
Bilingual Sentence Packing for Pre-training
Pre-training Strategy for a Multilingual Model
A researcher is pre-training a multilingual model using a masked language modeling (MLM) objective. To align the pre-training process with the specific methodology of Cross-Lingual Language Models (XLMs), what is the most crucial characteristic of the input data?
Core Training Principle of XLM
Translation Language Modeling
Input Embedding in Cross-Lingual Language Models