import numpy as np

def update(self, w, grad_func):
    # Lazily initialize the update vector on the first call
    if self.w_updt is None:
        self.w_updt = np.zeros(np.shape(w))
    # Evaluate the gradient at the look-ahead position w - momentum * w_updt,
    # clipping each component to [-1, 1] for stability
    approx_future_grad = np.clip(grad_func(w - self.momentum * self.w_updt), -1, 1)
    self.w_updt = self.momentum * self.w_updt + self.learning_rate * approx_future_grad
    # Move against the gradient to minimize the loss
    return w - self.w_updt
Here grad_func is not implemented! update() only receives it as a callback, so the gradient function has to be supplied by whatever training code calls the optimizer.
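For anyone hitting the same question, here is a minimal sketch of what a caller-supplied grad_func could look like, assuming a plain mean-squared-error loss for a linear model. The helper name make_mse_grad_func and the X, y arrays are my own illustrations, not from this repo:

import numpy as np

def make_mse_grad_func(X, y):
    # Hypothetical helper: builds a gradient callback for the MSE loss
    # L(w) = (1/n) * ||X @ w - y||^2 of a linear model y ~ X @ w
    n_samples = X.shape[0]
    def grad_func(w):
        # Gradient of MSE with respect to w: (2/n) * X^T (X w - y)
        return (2.0 / n_samples) * X.T.dot(X.dot(w) - y)
    return grad_func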
The rest of the class, for reference:

class NesterovAcceleratedGradient():
    def __init__(self, learning_rate=0.001, momentum=0.4):
        self.learning_rate = learning_rate
        self.momentum = momentum
        # Update vector; lazily initialized to zeros on the first update() call
        self.w_updt = None
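Putting the two pieces together, a hedged end-to-end sketch: the synthetic data and the make_mse_grad_func helper from above are illustrative assumptions, not part of the repo.

import numpy as np

# Synthetic regression problem: recover w_true from noisy observations
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
w_true = np.array([1.5, -2.0, 0.5])
y = X.dot(w_true) + 0.01 * rng.normal(size=100)

optimizer = NesterovAcceleratedGradient(learning_rate=0.01, momentum=0.4)
grad_func = make_mse_grad_func(X, y)

w = np.zeros(3)
for _ in range(2000):
    # One NAG step: look ahead by the momentum term, then step against the gradient
    w = optimizer.update(w, grad_func)
print(w)  # should approach w_true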