Concept

Gated recurrent unit (GRU)

Gated recurrent units (GRUs) address the vanishing-gradient problem of simple RNNs while avoiding the large number of extra parameters that LSTMs introduce. They do this by dispensing with a separate context (cell) vector and by reducing the number of gates to two: a reset (relevance) gate and an update gate. The reset gate decides which aspects of the previous hidden state are relevant to the current context and which can be ignored; it is used when computing an intermediate candidate hidden state for the current time step. The update gate then determines how much of that candidate state flows into the new hidden state and how much of the previous state is preserved for future use.
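
A minimal NumPy sketch of a single GRU step may help make the two gates concrete. The parameter names (Wr, Ur, etc.) are illustrative rather than from any particular library, and the final blend h_t = (1 − z_t) ⊙ h_{t−1} + z_t ⊙ ĥ_t is one common convention; some formulations swap the roles of z_t and 1 − z_t.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x_t, h_prev, p):
    """One GRU time step. p is a dict of weights/biases with
    illustrative names (Wr/Ur/br for the reset gate, etc.)."""
    # Reset gate: which parts of the previous state matter now
    r = sigmoid(p["Wr"] @ x_t + p["Ur"] @ h_prev + p["br"])
    # Update gate: how much candidate vs. previous state to keep
    z = sigmoid(p["Wz"] @ x_t + p["Uz"] @ h_prev + p["bz"])
    # Candidate (intermediate) hidden state, with the previous
    # state masked elementwise by the reset gate
    h_tilde = np.tanh(p["Wh"] @ x_t + p["Uh"] @ (r * h_prev) + p["bh"])
    # Blend previous state and candidate via the update gate
    return (1 - z) * h_prev + z * h_tilde

# Tiny usage example with random parameters
rng = np.random.default_rng(0)
d_in, d_h = 4, 3
params = {}
for g in ("r", "z", "h"):
    params[f"W{g}"] = rng.normal(size=(d_h, d_in)) * 0.1
    params[f"U{g}"] = rng.normal(size=(d_h, d_h)) * 0.1
    params[f"b{g}"] = np.zeros(d_h)

h = np.zeros(d_h)
for x in rng.normal(size=(5, d_in)):  # a sequence of 5 inputs
    h = gru_cell(x, h, params)
print(h)
```

Because the reset gate multiplies h_prev before the candidate state is computed, the network can learn to discard stale context, while the update gate's convex blend gives gradients a more direct path through time than a plain RNN.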

Updated 2020-10-03

Tags

Data Science
