Attention Is All You Need (Presentation)

This paper introduces the Transformer model. A Transformer maps an input sequence to an output sequence depending on the task: for example, translating text from one language into another, or producing an answer to a question. It does this with stacked encoder and decoder blocks.
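At the heart of both the encoder and decoder stacks is scaled dot-product attention, which the paper defines as softmax(QK^T / sqrt(d_k)) V. A minimal NumPy sketch of that equation (the function and variable names here are illustrative, not from the paper):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)       # similarity of each query to each key
    weights = softmax(scores, axis=-1)    # rows sum to 1: a weighting over values
    return weights @ V, weights

# Toy example: 2 query positions attending over 3 key/value positions, d_k = 4.
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
output, attn_weights = scaled_dot_product_attention(Q, K, V)
```

The output has one row per query position, each a weighted average of the value rows; the full model applies this in multi-head form inside every encoder and decoder layer.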

Updated 2021-08-19

Tags

Data Science
