Role of the Adapter in BERT-based NMT

In a neural machine translation system that uses a pre-trained, frozen BERT model as the encoder and a randomly initialized transformer model as the decoder, an 'adapter' layer is often placed between these two components. Explain the primary technical reason for including this adapter layer and describe one potential negative consequence of omitting it.
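The setup described in the question can be sketched as a small bottleneck module between the frozen encoder and the decoder. This is a hypothetical minimal PyTorch example, not an implementation from any particular system; the dimensions, class names, and the bottleneck design are illustrative assumptions:

```python
import torch
import torch.nn as nn

class Adapter(nn.Module):
    """Bottleneck MLP that maps frozen BERT encoder states into the
    representation space expected by the randomly initialized decoder.
    All sizes below are illustrative, not prescribed by the question."""

    def __init__(self, bert_dim=768, decoder_dim=512, bottleneck=256):
        super().__init__()
        self.down = nn.Linear(bert_dim, bottleneck)   # project down
        self.act = nn.GELU()
        self.up = nn.Linear(bottleneck, decoder_dim)  # project into decoder space
        self.norm = nn.LayerNorm(decoder_dim)

    def forward(self, h):
        return self.norm(self.up(self.act(self.down(h))))

# Stand-in for frozen BERT output: (batch, seq_len, hidden) = (2, 10, 768)
bert_states = torch.randn(2, 10, 768)
adapter = Adapter()
memory = adapter(bert_states)  # would feed the decoder's cross-attention
print(memory.shape)            # torch.Size([2, 10, 512])
```

Note that only the adapter (and decoder) would receive gradient updates during training; the BERT parameters stay frozen, which is what makes a trainable bridge between the two representation spaces useful.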

Updated 2025-10-10

Tags

Ch.2 Generative Models - Foundations of Large Language Models

Foundations of Large Language Models

Foundations of Large Language Models Course

Computing Sciences

Analysis in Bloom's Taxonomy

Cognitive Psychology

Psychology

Social Science

Empirical Science

Science