Concept

Decoder Uncertainty

Token-level uncertainty differs between idiom and non-idiom word translation. The authors first translate each sentence pair with teacher forcing and measure the entropy of the decoder's output distribution at each target token. They then use word alignments to average the entropy values separately over target words aligned to idiom source words and those aligned to non-idiom source words.
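The procedure can be sketched as follows. This is a minimal illustration with a toy probability matrix, not the authors' implementation: `probs` stands in for the decoder's teacher-forced distributions, and `idiom_mask` stands in for the alignment-derived labels.

```python
import numpy as np

def token_entropies(probs):
    """Entropy of each decoder distribution; probs has shape (T, V)."""
    p = np.clip(probs, 1e-12, 1.0)  # avoid log(0)
    return -(p * np.log(p)).sum(axis=1)

def grouped_entropy(probs, idiom_mask):
    """Average entropy over idiom-aligned vs non-idiom-aligned target tokens."""
    h = token_entropies(probs)
    return h[idiom_mask].mean(), h[~idiom_mask].mean()

# Toy example: 4 target tokens over a 3-word vocabulary.
probs = np.array([
    [0.80, 0.10, 0.10],  # confident prediction -> low entropy
    [0.34, 0.33, 0.33],  # near-uniform -> high entropy
    [0.90, 0.05, 0.05],
    [0.40, 0.30, 0.30],
])
# True where the aligned source word is part of an idiom (hypothetical labels).
idiom_mask = np.array([False, True, False, True])

idiom_h, non_idiom_h = grouped_entropy(probs, idiom_mask)
# In this toy example the idiom-aligned tokens have higher average entropy.
```

In practice `probs` would come from running the decoder with teacher forcing over a test set and `idiom_mask` from an automatic word aligner, with averages taken over all sentence pairs rather than a single one.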


Updated 2023-02-17

Tags

Data Science

Learn After