Copyright notice: this is an original article by the blogger; reproduction without permission is prohibited. https://blog.csdn.net/jacke121/article/details/82733197
element 0 of tensors does not require grad and does not have a grad_fn
This error occurs because requires_grad=False; it must be True for backward() to work.
import torch
from torch.autograd import Variable

# requires_grad must be True, otherwise out.backward() raises:
# "element 0 of tensors does not require grad and does not have a grad_fn"
x = Variable(torch.ones(2, 2), requires_grad=True)
y = x + 2
# print(x.grad_fn)  # None: a tensor created directly by the user has no grad_fn
# print(y.grad_fn)  # set, because y was produced by an operation on x
z = y * y * 3
out = z.mean()
out.backward()
print(x, y, z)
print(x.grad)  # gradient of out with respect to x
print(y.grad)  # None: y is not a leaf tensor, so its gradient is not retained
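Since PyTorch 0.4, Variable is deprecated and plain tensors carry requires_grad directly. A minimal sketch of the same computation in the modern API, with the expected gradient worked out:

```python
import torch

# requires_grad=True makes x a leaf tensor tracked by autograd;
# with False, out.backward() below would raise the error from the title
x = torch.ones(2, 2, requires_grad=True)
y = x + 2            # y = 3 everywhere; y.grad_fn is set by the add op
z = y * y * 3        # z = 27 everywhere
out = z.mean()       # out = mean(3 * (x + 2)^2) = 27

out.backward()
# d(out)/dx = 6 * (x + 2) / 4 = 4.5 at every element
print(x.grad)
```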