Concept
Reducing Idiom Context
Reducing idiom context results in worse translation scores, and full-context pretraining yields the best results. Reducing idiom context barely affects randomly initialized models but is catastrophic for pretrained models, implying that pretrained models are more global, encoding information about the surrounding idiom context, while randomly initialized models are myopic, encoding information mainly about the idiom tokens themselves.
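A minimal sketch of what a context-reduction ablation might look like: given a tokenized source sentence and the span of the idiom, every token beyond a fixed window around the idiom is replaced with a mask token before the input reaches the model. The function name, mask token, and window parameter below are illustrative assumptions, not the study's actual setup.

```python
# Hypothetical sketch of a context-reduction ablation (names are illustrative).
from typing import List, Tuple

MASK = "<mask>"  # assumed placeholder token; the actual study may differ


def reduce_idiom_context(tokens: List[str],
                         idiom_span: Tuple[int, int],
                         context_window: int) -> List[str]:
    """Keep the idiom tokens plus `context_window` tokens on each side;
    mask everything else. context_window=0 leaves only the idiom itself."""
    start, end = idiom_span  # half-open span [start, end) of the idiom
    lo = max(0, start - context_window)
    hi = min(len(tokens), end + context_window)
    return [tok if lo <= i < hi else MASK for i, tok in enumerate(tokens)]


# Example: the idiom "kicked the bucket" with a shrinking context window.
sentence = "the old farmer finally kicked the bucket last winter".split()
span = (4, 7)  # token indices covering "kicked the bucket"
for window in (3, 1, 0):
    print(window, reduce_idiom_context(sentence, span, window))
```

Feeding inputs masked this way to a pretrained versus a randomly initialized model is the kind of comparison that would surface the global-versus-myopic contrast described above.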
Updated 2023-02-17
Tags
Data Science