Code

Python Vectorization Logistic Regression Gradient Descent

import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

for iter in range(10000):               # iterations of gradient descent
    Z = np.dot(w.T, X) + b              # broadcasting adds b to every column
    A = sigmoid(Z)
    dLdZ = A - Y
    dLdW = (1/m) * np.dot(X, dLdZ.T)    # matrix product, not elementwise multiply
    dLdb = (1/m) * np.sum(dLdZ)
    w = w - a * dLdW                    # a is the learning rate
    b = b - a * dLdb
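A minimal runnable sketch of the loop above on synthetic data. The shapes (n features, m examples, columns as examples), the learning rate, and the label-generating weights are all illustrative assumptions, not part of the original note:

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

rng = np.random.default_rng(0)
n, m = 3, 200                  # features, examples (chosen for illustration)
X = rng.normal(size=(n, m))    # each column is one training example
true_w = np.array([[1.0], [-2.0], [0.5]])   # hypothetical generating weights
Y = (np.dot(true_w.T, X) > 0).astype(float) # (1, m) separable labels

w = np.zeros((n, 1))
b = 0.0
a = 0.1                        # learning rate (assumed value)

for _ in range(10000):
    Z = np.dot(w.T, X) + b     # (1, m) via broadcasting
    A = sigmoid(Z)
    dLdZ = A - Y
    dLdW = (1/m) * np.dot(X, dLdZ.T)   # (n, 1) gradient
    dLdb = (1/m) * np.sum(dLdZ)
    w = w - a * dLdW
    b = b - a * dLdb

preds = (sigmoid(np.dot(w.T, X) + b) > 0.5).astype(float)
accuracy = np.mean(preds == Y)
```

Because the labels here are linearly separable by construction, training accuracy should approach 1.0 after the 10000 updates.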


Updated 2020-10-26

Tags

Data Science