
Transformers in contextual generation and summarization

The transformer model can also be used for contextual generation and text summarization tasks.

During contextual generation, the model is given a prefix text and outputs a possible completion of it. At each step, the transformer has direct access to the entire prefix as well as all of the output it has generated so far.
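This autoregressive loop can be sketched as follows. The `next_token` function below is a hypothetical stand-in for a trained transformer decoder; a toy bigram rule replaces the actual attention-based prediction for illustration.

```python
def next_token(context):
    # Hypothetical stand-in for a trained transformer decoder: a real model
    # would attend over the whole context (prefix plus previously generated
    # tokens) and return the most likely next token.
    bigrams = {"the": "cat", "cat": "sat", "sat": "down"}
    return bigrams.get(context[-1], "<eos>")

def generate(prefix, max_new_tokens=10):
    tokens = list(prefix)           # the model sees the full prefix text...
    for _ in range(max_new_tokens):
        tok = next_token(tokens)    # ...and all of its own prior output
        if tok == "<eos>":          # stop at the end-of-sequence token
            break
        tokens.append(tok)
    return tokens

print(generate(["the"]))  # → ['the', 'cat', 'sat', 'down']
```

The key point is that each new token is appended to the context before the next prediction, so earlier generated tokens condition later ones.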

As for the text summarization task, the training set contains full-length articles paired with their summaries, with a unique marker δ separating the two parts, so one training unit looks like (x_1, ..., x_m, δ, y_1, ..., y_n). Teacher forcing also applies during training: at each position the model is fed the true previous tokens and trained to predict the next one.
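A minimal sketch of how such a training unit and its teacher-forcing input/target pairs can be built, assuming a hypothetical separator token `"<sep>"` standing in for δ:

```python
SEP = "<sep>"  # assumed stand-in for the unique marker δ

def make_training_unit(article_tokens, summary_tokens):
    # One training unit: (x_1, ..., x_m, δ, y_1, ..., y_n)
    return article_tokens + [SEP] + summary_tokens

def teacher_forcing_pairs(unit):
    # With teacher forcing, the input at each position is the true token
    # sequence so far, and the target is the next token in the unit.
    return list(zip(unit[:-1], unit[1:]))

unit = make_training_unit(["x1", "x2", "x3"], ["y1", "y2"])
print(unit)                        # the article, marker, and summary in order
print(teacher_forcing_pairs(unit)) # (input token, target token) pairs
```

Note that the model also learns to predict the summary's first token y_1 from the separator, which is how it knows to switch from reading to summarizing at inference time.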


Updated 2021-11-21

Tags

Data Science
