Learn Before
Modeling Sequences Conditioned on Context with RNNs
"Conditioning," in the context of sequence-to-sequence learning with RNNs, refers to modeling the probability of the output sequence given the input sequence, p(y|x). The network is trained to represent this conditional distribution.
We can extend an RNN to represent a conditional distribution p(y|x) by providing the context x in one of three common ways:
- As an extra input at each time step, or
- As the initial hidden state h(0), or
- Both
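The three conditioning options above can be sketched in a minimal NumPy RNN cell. All dimensions, weight names, and the flags `use_extra_input` / `use_init_state` below are illustrative assumptions, not from the source; the point is only to show where the context vector c enters the recurrence in each method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions): input, context, and hidden dimensions
d_x, d_c, d_h = 4, 3, 5
T = 6  # sequence length

# Parameters of a simple tanh RNN cell that can also receive the context c
W_xh = rng.normal(scale=0.1, size=(d_h, d_x))
W_ch = rng.normal(scale=0.1, size=(d_h, d_c))   # used when c is an extra input
W_hh = rng.normal(scale=0.1, size=(d_h, d_h))
b_h = np.zeros(d_h)
W_init = rng.normal(scale=0.1, size=(d_h, d_c))  # used when c sets h(0)

def run_rnn(xs, c, use_extra_input=True, use_init_state=True):
    """Run the RNN over inputs xs, conditioning on context vector c."""
    # Method 2 (and 3): derive the initial hidden state h(0) from c
    h = np.tanh(W_init @ c) if use_init_state else np.zeros(d_h)
    hs = []
    for x in xs:
        pre = W_xh @ x + W_hh @ h + b_h
        if use_extra_input:
            # Method 1 (and 3): feed c as an extra input at every time step
            pre = pre + W_ch @ c
        h = np.tanh(pre)
        hs.append(h)
    return np.stack(hs)

xs = rng.normal(size=(T, d_x))
c = rng.normal(size=d_c)
H = run_rnn(xs, c)
print(H.shape)  # (6, 5): one hidden state per time step
```

Running the cell with both flags disabled ignores c entirely, which makes it easy to check that the conditioning pathways are what couple the output to the context.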
Tags
Data Science
Related
Applications of RNN
RNN Basic Structure
RNN Extensions and Types
Loss Function for RNN
RNNs (Recurrent Neural Networks) vs HMMs (Hidden Markov Models)
RNNs vs Feedforward Neural Networks
Hybrid of Convolutional and Recurrent Neural Network
Why is an RNN (Recurrent Neural Network) used for machine translation, say translating English to French? (Check all that apply.)
RNN Problem
Different types of RNN (in terms of input/output)
Long Term Dependencies Problem
Leaky Units and Other Strategies for Multiple Time Scales
Convolutional Recurrent Neural Network (CRNN)
Pooling Layer in RNN
Inability of RNNs to Carry Forward Critical Information
Stacked RNNs
Bidirectional RNNs