Learn Before
Concept

Vectorizing logistic regression on m examples

To compute logistic regression and its gradient descent on $m$ examples, we could use a for loop to accumulate the errors and derivatives and then average them, but that takes a long time to run on a big data set. Vectorization is a good way to get rid of the explicit for loop in your code. First, stack the $m$ examples horizontally into the matrices $X$ and $Y$, so the shape of $X$ is $(n_x, m)$, where $n_x$ is the number of features, and the shape of $Y$ is $(1, m)$. Then compute $Z$ and $A$:

$$Z = [z^{(1)}\ z^{(2)}\ \dots\ z^{(m)}] = w^T X + b = [(w^T x^{(1)} + b)\ \ (w^T x^{(2)} + b)\ \dots\ (w^T x^{(m)} + b)]$$

$$A = [a^{(1)}\ a^{(2)}\ \dots\ a^{(m)}] = \sigma(Z)$$

The shapes of $Z$ and $A$ are both $(1, m)$. The derivatives of $\mathcal{L}$ with respect to $Z$, $w$, and $b$ are:

$$\frac{d\mathcal{L}}{dZ} = A - Y$$

$$\frac{d\mathcal{L}}{dw} = \frac{1}{m}\, X \left(\frac{d\mathcal{L}}{dZ}\right)^T$$

$$\frac{d\mathcal{L}}{db} = \frac{1}{m} \sum \frac{d\mathcal{L}}{dZ}$$
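The computation above can be sketched in NumPy, where the matrix products replace the per-example loop. The function name `logistic_gradients` and the toy shapes are illustrative, not from the original:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logistic_gradients(X, Y, w, b):
    """One vectorized pass over all m examples.
    X: (n_x, m) features, Y: (1, m) labels,
    w: (n_x, 1) weights, b: scalar bias."""
    m = X.shape[1]
    Z = w.T @ X + b               # shape (1, m)
    A = sigmoid(Z)                # shape (1, m)
    dZ = A - Y                    # shape (1, m)
    dw = (1.0 / m) * (X @ dZ.T)   # shape (n_x, 1)
    db = (1.0 / m) * np.sum(dZ)   # scalar
    return A, dw, db
```

A gradient-descent step would then be `w -= alpha * dw; b -= alpha * db`, with no loop over the $m$ examples.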


Updated 2021-11-16

Tags

Data Science