
Cross-Lingual Text Classification Example

A practical application of cross-lingual learning is text classification. In this scenario, a multilingual pre-trained model is first fine-tuned on a set of annotated documents in a source language, such as English. The fine-tuned model can then be applied directly, without any target-language labels, to classify documents in a target language such as Chinese. This zero-shot setup demonstrates the model's ability to transfer learned knowledge across languages.
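The transfer works because the multilingual model maps source- and target-language text into a shared representation space, so a classifier trained on one language remains meaningful in another. The following is a minimal sketch of that idea; the hand-built embedding table, the example words, and the nearest-centroid classifier are all illustrative stand-ins (a real system would use a multilingual pre-trained encoder such as mBERT or XLM-R and a learned classification head):

```python
# Toy sketch of zero-shot cross-lingual classification (hypothetical data).
# The hand-built table below stands in for a multilingual encoder: each
# translation pair maps to the same vector, i.e. a shared embedding space,
# which is precisely what makes cross-lingual transfer possible.

EMB = {
    # English                 # Chinese (same vector = shared space)
    "good":  (1.0, 0.2),      "好":   (1.0, 0.2),
    "great": (0.9, 0.1),      "棒":   (0.9, 0.1),
    "bad":   (-1.0, 0.3),     "差":   (-1.0, 0.3),
    "awful": (-0.9, 0.4),     "糟糕": (-0.9, 0.4),
}

def embed(doc):
    """Average the word vectors of a whitespace-tokenised document."""
    vecs = [EMB[w] for w in doc.split() if w in EMB]
    return tuple(sum(c) / len(vecs) for c in zip(*vecs))

def train_centroids(labelled_docs):
    """'Fine-tune' on source-language data: one centroid per class."""
    by_label = {}
    for doc, label in labelled_docs:
        by_label.setdefault(label, []).append(embed(doc))
    return {label: tuple(sum(c) / len(vs) for c in zip(*vs))
            for label, vs in by_label.items()}

def classify(doc, centroids):
    """Assign the class whose centroid is nearest to the document vector."""
    v = embed(doc)
    return min(centroids,
               key=lambda lb: sum((a - b) ** 2
                                  for a, b in zip(v, centroids[lb])))

# Step 1: fine-tune on annotated English documents only.
english_train = [("good great", "positive"), ("bad awful", "negative")]
centroids = train_centroids(english_train)

# Step 2: apply the same classifier directly to Chinese documents
# (zero-shot: no Chinese labels were ever seen).
print(classify("好 棒", centroids))    # positive
print(classify("差 糟糕", centroids))  # negative
```

Because "好" and "good" share a vector, the Chinese document lands next to the English positive centroid, so the English-trained classifier labels it correctly without ever having seen a Chinese training example.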

Updated 2026-04-18


Tags

Ch.1 Pre-training - Foundations of Large Language Models

Computing Sciences