Learn Before
Concept
Model-Based DA Techniques in NLP
Sequence-to-sequence (seq2seq) models and language models have also been used for data augmentation (DA). Techniques include applying selective gates, generating new labeled samples, and augmenting word representations with a context-sensitive, attention-based mixture.
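The "generating new labeled samples" idea (as in LAMBADA-style DA, listed under Learn After) can be sketched with a toy, label-conditioned generator. The bigram model below is a deliberately simple stand-in for a fine-tuned seq2seq or language model; all function names and the tiny corpus are illustrative assumptions, not part of any cited method.

```python
import random
from collections import defaultdict

# Toy label-conditioned "language model" standing in for a fine-tuned
# seq2seq/LM generator. Real model-based DA would fine-tune a pretrained
# model per label; here a bigram table illustrates the same workflow.

def train_bigram_lm(sentences):
    """Count bigram successors, with start/end markers."""
    model = defaultdict(list)
    for sent in sentences:
        tokens = ["<s>"] + sent.split() + ["</s>"]
        for prev, nxt in zip(tokens, tokens[1:]):
            model[prev].append(nxt)
    return model

def generate(model, rng, max_len=12):
    """Sample one synthetic sentence from the bigram model."""
    token, out = "<s>", []
    while len(out) < max_len:
        token = rng.choice(model[token])
        if token == "</s>":
            break
        out.append(token)
    return " ".join(out)

def augment(labeled_sentences, n_new, seed=0):
    """Generate n_new synthetic (text, label) pairs per label."""
    rng = random.Random(seed)
    by_label = defaultdict(list)
    for text, label in labeled_sentences:
        by_label[label].append(text)
    synthetic = []
    for label, texts in by_label.items():
        lm = train_bigram_lm(texts)  # one generator per label
        for _ in range(n_new):
            synthetic.append((generate(lm, rng), label))
    return synthetic

if __name__ == "__main__":
    data = [
        ("the movie was great", "pos"),
        ("the plot was great fun", "pos"),
        ("the movie was dull", "neg"),
        ("the plot was dull and slow", "neg"),
    ]
    for text, label in augment(data, n_new=2):
        print(label, "->", text)
```

The design mirrors the LM-based recipe: train (here, count) a generator per class, sample fresh sentences, and attach the class label, so the downstream classifier sees more varied training data per label.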
Updated 2026-05-02
Tags
Data Science
Learn After
BackTranslation
Improving Neural Machine Translation Models with Monolingual Data
Generative Data Augmentation for Commonsense Reasoning
G-DAUG
Keep Calm and Switch On! Preserving Sentiment and Fluency in Semantic Text Exchange
SEMANTIC TEXT EXCHANGE (STE)
LAMBADA
Not Enough Data? Deep Learning to the Rescue!
Trade-off of Model-Based Data Augmentation Techniques in NLP