
Importing Zero_gradients From Torch.autograd.gradcheck

I want to replicate the code here, but I get the following error when running it in Google Colab: ImportError: cannot import name 'zero_gradients' from 'torch.autograd.gradcheck' (/

Solution 1:

That code seems to be written for a very old version of PyTorch; the function is no longer available. However, if you look at this commit, you will see the implementation of zero_gradients. All it does is zero out the gradients of the input:

def zero_gradients(i):
    # iter_gradients was a helper in the same old module that yielded
    # the populated .grad tensors of i (a tensor or iterable of tensors)
    for t in iter_gradients(i):
        t.zero_()

So zero_gradients(x) should be the same as x.zero_grad(), which is the current API, assuming x is an nn.Module.
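A minimal sketch of the nn.Module route, assuming a recent PyTorch where zero_grad() takes a set_to_none argument (passing set_to_none=False keeps zeroed tensors instead of resetting .grad to None, matching the old helper's behavior):

```python
import torch
import torch.nn as nn

model = nn.Linear(3, 1)
loss = model(torch.randn(4, 3)).sum()
loss.backward()                       # populates model.weight.grad

model.zero_grad(set_to_none=False)    # zero (not clear) all parameter grads
print(model.weight.grad.abs().sum().item())  # 0.0
```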

Or, if x is a plain tensor, it's just:

if x.grad is not None:
    x.grad.zero_()
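Putting the pieces together, here is a self-contained drop-in replacement you can paste in place of the removed import. This is a sketch of the old helper's behavior, handling a single tensor or a list/tuple of tensors (the recursion over iterables is an assumption about how the caller uses it):

```python
import torch

def zero_gradients(x):
    """Zero the .grad of a tensor, or of each tensor in a list/tuple."""
    if isinstance(x, torch.Tensor):
        if x.grad is not None:
            x.grad.detach_()   # drop any graph history on the grad tensor
            x.grad.zero_()     # zero it in place
    elif isinstance(x, (list, tuple)):
        for elem in x:
            zero_gradients(elem)

# Usage:
x = torch.randn(5, requires_grad=True)
(x * x).sum().backward()   # x.grad is now 2 * x
zero_gradients(x)
print(x.grad)              # tensor of zeros
```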
