Backward Propagation - Gradient error [Python]

I am working through week 2 of Andrew Ng's new deep learning Coursera course.

We are supposed to implement a logistic regression algorithm.

I am stuck at the gradient code (dw), which is giving me a syntax error.

The algorithm is as follows:

import numpy as np

def propagate(w, b, X, Y):
    m = X.shape[1]

    A = sigmoid(np.dot(w.T,X) + b)  # compute activation
    cost = -(1/m)*(np.sum(np.multiply(Y,np.log(A)) + np.multiply((1-Y),np.log(1-A)), axis=1)

    dw = (1/m)*np.dot(X,(A-Y).T)
    db = (1/m)*(np.sum(A-Y))
    assert(dw.shape == w.shape)
    assert(db.dtype == float)
    cost = np.squeeze(cost)
    assert(cost.shape == ())

    grads = {"dw": dw,
             "db": db}

    return grads, cost


Any ideas why I keep getting this syntax error?

File "<ipython-input-1-d104f7763626>", line 32
dw =(1/m)*np.dot(X,(A-Y).T)
^
SyntaxError: invalid syntax

Answer

In the line cost = ..., you are missing a closing parenthesis at the end; alternatively, just remove the extra opening parenthesis after the *. Note that Python reports the SyntaxError on the following line (dw = ...) because that is where the parser first notices the unbalanced parentheses, so the real problem is one line above:

# ...
cost = -(1/m)*np.sum(np.multiply(Y,np.log(A)) + np.multiply((1-Y),np.log(1-A)), axis=1)
# ...
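For completeness, here is a minimal runnable sketch of the corrected function. The sigmoid helper is not shown in the question, so the definition below is an assumption (the standard logistic function used in the course):

import numpy as np

def sigmoid(z):
    # Assumed helper (not shown in the question): standard logistic function
    return 1 / (1 + np.exp(-z))

def propagate(w, b, X, Y):
    m = X.shape[1]

    # Forward propagation: activations and cross-entropy cost
    A = sigmoid(np.dot(w.T, X) + b)
    cost = -(1/m)*np.sum(np.multiply(Y, np.log(A)) + np.multiply(1-Y, np.log(1-A)), axis=1)

    # Backward propagation: gradients of the cost with respect to w and b
    dw = (1/m)*np.dot(X, (A-Y).T)
    db = (1/m)*np.sum(A-Y)

    assert(dw.shape == w.shape)
    assert(db.dtype == float)
    cost = np.squeeze(cost)
    assert(cost.shape == ())

    grads = {"dw": dw, "db": db}
    return grads, cost

A quick smoke test with arbitrary toy values (w is (2,1), X is (2,2), Y is (1,2)):

w = np.array([[1.0], [2.0]])
b = 2.0
X = np.array([[1.0, 2.0], [3.0, 4.0]])
Y = np.array([[1.0, 0.0]])
grads, cost = propagate(w, b, X, Y)
print(grads["dw"], grads["db"], cost)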