RNN's Approach to Text Generation in NLP (Autoregressive Generation)

This technique (Shannon, 1951) generates text one word at a time, with each word conditioned on the previously generated words. The model is trained with cross-entropy as the loss function and evaluated with perplexity. At each step, a softmax layer over the vocabulary predicts the next word from the RNN's current hidden state, and generation stops once the sequence reaches a predefined length (in practice, often when an end-of-sequence token is produced). The initial input token(s) depend on the nature of the NLP task (e.g. machine translation, question answering, summarization).
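A minimal sketch of this generation loop, using a plain NumPy RNN with randomly initialized (untrained) weights purely for illustration — the vocabulary size, hidden size, and greedy argmax decoding are assumptions, not part of the original card:

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size, hidden_size = 8, 16

# Untrained, randomly initialized weights -- illustrative only.
W_xh = rng.normal(0, 0.1, (hidden_size, vocab_size))   # input -> hidden
W_hh = rng.normal(0, 0.1, (hidden_size, hidden_size))  # hidden -> hidden
W_hy = rng.normal(0, 0.1, (vocab_size, hidden_size))   # hidden -> vocabulary logits

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def generate(start_token, max_len):
    """Autoregressive generation: each step feeds the previous output back in."""
    h = np.zeros(hidden_size)
    token = start_token
    out = [token]
    for _ in range(max_len - 1):
        x = np.zeros(vocab_size)
        x[token] = 1.0                      # one-hot encode the previous word
        h = np.tanh(W_xh @ x + W_hh @ h)    # update the hidden state
        probs = softmax(W_hy @ h)           # softmax over the vocabulary
        token = int(np.argmax(probs))       # greedy decoding: pick the top word
        out.append(token)
    return out

seq = generate(start_token=0, max_len=5)    # stop at a predefined length
print(seq)
```

During training, the cross-entropy loss at each step is the negative log-probability the softmax assigns to the true next word, and perplexity is the exponential of the average cross-entropy over the sequence.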


Updated 2021-11-14

Tags

Data Science