Learn Before
  • Stages of Feed Forward Neural Network Learning

Forward Propagation

When we use a feedforward neural network to accept an input $x$ from the training set and produce an output $\hat{y}$, information flows forward through the network. The input $x$ provides the initial information that then propagates up through the hidden units at each layer and finally produces $\hat{y}$. This is called forward propagation.
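The flow described above can be sketched in a few lines of NumPy. The layer sizes, random weights, and activation choice (ReLU in the hidden layer, identity at the output) are illustrative assumptions, not details fixed by the text:

```python
import numpy as np

def relu(z):
    return np.maximum(0, z)

def forward_propagation(x, params):
    """Propagate input x forward through each layer to produce y_hat."""
    a = x
    for W, b in params[:-1]:
        a = relu(W @ a + b)   # hidden layers: affine transform, then ReLU
    W, b = params[-1]
    y_hat = W @ a + b         # output layer (identity activation here)
    return y_hat

# Example: a 3 -> 4 -> 2 network with random weights
rng = np.random.default_rng(0)
params = [(rng.standard_normal((4, 3)), np.zeros(4)),
          (rng.standard_normal((2, 4)), np.zeros(2))]
x = rng.standard_normal(3)
y_hat = forward_propagation(x, params)
print(y_hat.shape)  # (2,)
```

Each loop iteration carries the information one layer forward: the previous layer's activations are the only input the next layer sees.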

Tags

Data Science

Related
  • Forward Propagation

  • Objective Function

  • Update Weight Iteratively Until Convergence

  • Deep Learning Weight Initialization

  • What is the "cache" used for in our implementation of forward propagation and backward propagation?

  • Consider the following 1 hidden layer neural network:

  • Which of the following are true regarding activation outputs and vectors? (Check all that apply.)

  • Backward Propagation

Learn After
  • Connection between the Layers of Neural Network

  • Forward Propagation Formulation

  • True/False: During forward propagation, in the forward function for a layer $l$ you need to know what the activation function in that layer is (Sigmoid, tanh, ReLU, etc.). During back propagation, the corresponding backward function also needs to know what the activation function for layer $l$ is, since the gradient depends on it.

  • Which of these is a correct vectorized implementation of forward propagation for layer $\ell$, where $1 \le \ell \le L$?

    • $Z^{[\ell]} = W^{[\ell]} A^{[\ell-1]} + b^{[\ell]}$
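The vectorized layer computation above can be sketched in NumPy, where `A_prev` stacks the previous layer's activations column-wise over a batch of examples; the shapes and names here are illustrative assumptions:

```python
import numpy as np

def linear_forward(A_prev, W, b):
    """Vectorized pre-activation for one layer: Z = W A_prev + b.

    b has shape (n_l, 1) so it broadcasts across the batch dimension.
    """
    return W @ A_prev + b

# Example: layer with 4 units, previous layer with 3 units, batch of 5 examples
rng = np.random.default_rng(1)
W = rng.standard_normal((4, 3))       # W^{[l]}: (n_l, n_{l-1})
b = np.zeros((4, 1))                  # b^{[l]}: (n_l, 1)
A_prev = rng.standard_normal((3, 5))  # A^{[l-1]}: (n_{l-1}, m)
Z = linear_forward(A_prev, W, b)
print(Z.shape)  # (4, 5)
```

Stacking examples as columns is what makes a single matrix product handle the whole batch at once, rather than looping over examples.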