Learn Before
Cross-Lingual Learning
Cross-lingual learning is a technique in which a model is trained for a specific task in one language and then applied to the same task in a different language. Built on multilingual pre-trained models, this approach transfers knowledge from a data-rich (high-resource) language to languages with fewer resources.
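The mechanism behind this transfer can be sketched with a toy example. Everything below is invented for illustration: the tiny hand-made embedding table stands in for the shared representation space of a real multilingual pre-trained model (e.g. mBERT or XLM-R), in which translation pairs land near each other. A classifier is trained only on English labels, then applied to French zero-shot.

```python
# Toy sketch of zero-shot cross-lingual transfer.
# ASSUMPTION: this hand-made "multilingual" embedding table is a stand-in
# for a real pre-trained model; translations map to nearby vectors.
EMB = {
    # English                         # French
    "good":     (1.0, 0.2),           "bon":       (0.9, 0.25),
    "great":    (1.0, 0.4),           "excellent": (0.95, 0.35),
    "bad":      (-1.0, 0.1),          "mauvais":   (-0.9, 0.15),
    "terrible": (-1.0, 0.3),          "horrible":  (-0.95, 0.25),
}

def sentence_vec(sentence):
    """Average the embeddings of known words (a crude sentence encoder)."""
    vecs = [EMB[w] for w in sentence.lower().split() if w in EMB]
    return tuple(sum(v[i] for v in vecs) / len(vecs) for i in range(2))

def train_centroids(labeled):
    """Fit a nearest-centroid classifier on (sentence, label) pairs."""
    sums = {}
    for sentence, label in labeled:
        v = sentence_vec(sentence)
        (sx, sy), c = sums.get(label, ((0.0, 0.0), 0))
        sums[label] = ((sx + v[0], sy + v[1]), c + 1)
    return {lbl: (sx / c, sy / c) for lbl, ((sx, sy), c) in sums.items()}

def predict(centroids, sentence):
    """Assign the label whose centroid is closest in the shared space."""
    v = sentence_vec(sentence)
    return min(centroids,
               key=lambda l: (v[0] - centroids[l][0]) ** 2
                           + (v[1] - centroids[l][1]) ** 2)

# Train ONLY on the high-resource language (English)...
model = train_centroids([("good great", "pos"), ("bad terrible", "neg")])

# ...then classify the low-resource language (French) with no French labels:
print(predict(model, "bon excellent"))    # pos
print(predict(model, "mauvais horrible")) # neg
```

Because the French words embed next to their English translations, the decision boundary learned from English data carries over; this is the same principle that lets a real multilingual model fine-tuned on English sentiment labels classify French text it was never fine-tuned on.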
Tags
Ch.1 Pre-training - Foundations of Large Language Models
Foundations of Large Language Models
Foundations of Large Language Models Course
Computing Sciences
Related
Cross-Lingual Learning
Bilingual Pre-training for Multilingual Models
Benefit of Multilingual Pre-trained Models: Handling Code-Switching
Shared Vocabulary in Multilingual Models
Factors Influencing Multilingual Pre-training
A company is developing a sentiment analysis tool. Their primary market is in France, for which they have a massive, high-quality dataset. They also need to provide functional support for Spanish and German, but have very limited data for these languages. The highest priority is achieving state-of-the-art performance for the French market, while still being able to handle the other languages. Given these requirements, which strategy for choosing a foundational model is most appropriate?
Model Selection for a Monolingual Task
Match each pre-trained model with the description that best characterizes its training methodology and primary use case.
Learn After
Cross-Lingual Text Classification Example
Cross-Lingual Transfer from High-Resource to Low-Resource Languages
A development team has a large, high-quality dataset for sentiment analysis in English. They need to create a similar sentiment analysis tool for Swahili, a language for which they have very little labeled data. The team has access to a powerful multilingual model pre-trained on a corpus including both English and Swahili. Based on the principles of leveraging knowledge from a data-rich language for a data-poor one, what is the most direct and effective strategy for the team to pursue?
Analyzing a Cross-Lingual Model Implementation Failure
Explaining Zero-Shot Cross-Lingual Transfer