Learn Before
  • Forward Propagation

Multiple Choice

Which of these is a correct vectorized implementation of forward propagation for layer ℓ, where 1 ≤ ℓ ≤ L?

  • Z^{[ℓ]} = W^{[ℓ]} A^{[ℓ−1]} + b^{[ℓ]}

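As a sketch of how this vectorized step might look in NumPy (the layer sizes, batch size, and tanh activation below are illustrative assumptions, not part of the card):

```python
import numpy as np

def forward_layer(A_prev, W, b, activation=np.tanh):
    # One vectorized forward step:
    #   Z^{[l]} = W^{[l]} A^{[l-1]} + b^{[l]},   A^{[l]} = g^{[l]}(Z^{[l]})
    Z = W @ A_prev + b
    A = activation(Z)
    return Z, A

# Illustrative shapes: 3 units in layer l-1, 4 units in layer l, batch of 5
rng = np.random.default_rng(0)
A_prev = rng.standard_normal((3, 5))
W = rng.standard_normal((4, 3))
b = rng.standard_normal((4, 1))   # broadcasts across the batch dimension

Z, A = forward_layer(A_prev, W, b)
print(Z.shape, A.shape)  # (4, 5) (4, 5)
```

Note that b has shape (4, 1), so NumPy broadcasting adds the same bias to every column (example) in the batch.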

Updated 2021-11-18

Contributors are:

Grace Dwyer

Who are from:

University of Michigan - Ann Arbor

References


  • Machine Learning Yearning (Deeplearning.ai)

Tags

Data Science

Related
  • Connection between the Layers of Neural Network

  • Forward Propagation Formulation

  • True/False: During forward propagation, the forward function for layer ℓ needs to know which activation function the layer uses (sigmoid, tanh, ReLU, etc.). During back propagation, the corresponding backward function also needs to know the activation function for layer ℓ, since the gradient depends on it.

  • Which of these is a correct vectorized implementation of forward propagation for layer ℓ, where 1 ≤ ℓ ≤ L?

    • Z^{[ℓ]} = W^{[ℓ]} A^{[ℓ−1]} + b^{[ℓ]}
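A minimal NumPy sketch of the point raised in the True/False item above: the backward function must know the layer's activation because dZ^{[ℓ]} = dA^{[ℓ]} ⊙ g′^{[ℓ]}(Z^{[ℓ]}). ReLU is an illustrative choice here, not specified by the card:

```python
import numpy as np

def relu(Z):
    # Forward: g(Z) = max(0, Z); Z is cached for the backward pass
    return np.maximum(0, Z)

def relu_backward(dA, Z):
    # Backward: dZ = dA * g'(Z); for ReLU, g'(Z) is 1 where Z > 0, else 0
    return dA * (Z > 0)

Z = np.array([[-1.0, 2.0], [3.0, -4.0]])
dA = np.ones_like(Z)
dZ = relu_backward(dA, Z)
print(dZ)  # [[0. 1.] [1. 0.]] — the gradient flows only where Z > 0
```

Swapping in sigmoid or tanh would change only g′(Z), which is why each layer's backward function is tied to its activation.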
© 1Cademy 2026

We're committed to OpenSource on

Github