Example

BERT-based Encoder-Decoder for Neural Machine Translation

A neural machine translation (NMT) system is a concrete application of the BERT-based encoder-decoder architecture. In this setup, a pre-trained BERT model serves as the encoder and processes the source-language text. The resulting contextual representation is passed, optionally through an adapter, to a decoder, which generates the translated text in the target language.
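
As a rough illustration of this setup, the sketch below wires a pre-trained BERT encoder to a standard Transformer decoder through a small linear adapter. It is a minimal sketch, not a reference implementation from the text, and it assumes PyTorch and the Hugging Face Transformers library; the checkpoint name, vocabulary size, and layer sizes are illustrative choices, not values given in the text.

```python
import torch
import torch.nn as nn
from transformers import BertModel


class BertNMT(nn.Module):
    """BERT encoder -> linear adapter -> Transformer decoder for translation."""

    def __init__(self, bert_name="bert-base-multilingual-cased",  # illustrative checkpoint
                 tgt_vocab_size=32000, d_model=512, num_layers=6,
                 num_heads=8, max_tgt_len=256):
        super().__init__()
        # Pre-trained BERT encodes the source-language sentence.
        self.encoder = BertModel.from_pretrained(bert_name)
        # Optional adapter: project BERT's hidden size to the decoder width.
        self.adapter = nn.Linear(self.encoder.config.hidden_size, d_model)
        # Target-side token and learned position embeddings.
        self.tgt_embed = nn.Embedding(tgt_vocab_size, d_model)
        self.pos_embed = nn.Embedding(max_tgt_len, d_model)
        # Standard Transformer decoder with cross-attention over BERT's output.
        layer = nn.TransformerDecoderLayer(d_model=d_model, nhead=num_heads,
                                           batch_first=True)
        self.decoder = nn.TransformerDecoder(layer, num_layers=num_layers)
        self.out_proj = nn.Linear(d_model, tgt_vocab_size)

    def forward(self, src_ids, src_mask, tgt_ids):
        # Contextual representation of the source text from BERT.
        memory = self.encoder(input_ids=src_ids,
                              attention_mask=src_mask).last_hidden_state
        memory = self.adapter(memory)
        # Target embeddings with positions.
        positions = torch.arange(tgt_ids.size(1), device=tgt_ids.device)
        tgt = self.tgt_embed(tgt_ids) + self.pos_embed(positions)
        # Causal mask: each target position attends only to earlier positions.
        T = tgt_ids.size(1)
        causal = torch.triu(torch.full((T, T), float("-inf"),
                                       device=tgt_ids.device), diagonal=1)
        hidden = self.decoder(tgt, memory, tgt_mask=causal,
                              memory_key_padding_mask=(src_mask == 0))
        return self.out_proj(hidden)  # logits over the target vocabulary
```

During training, the decoder input would be the right-shifted target sentence and the logits scored with cross-entropy against the reference translation; at inference, the decoder generates target tokens autoregressively. The adapter here is just a linear projection and could be dropped when BERT's hidden size already matches the decoder width.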
