pytorch api torch.optim.SGD

Copyright notice: this article was created by the author; please credit the source when reposting: http://blog.csdn.net/claroja. For commercial use, contact QQ: 63183535. https://blog.csdn.net/claroja/article/details/85327731

CLASS torch.optim.SGD(params, lr=<required parameter>, momentum=0, dampening=0, weight_decay=0, nesterov=False)
Implements stochastic gradient descent (optionally with momentum).

Parameter      Description
params         iterable of parameters to optimize, or dicts defining parameter groups
lr             (float) learning rate
momentum       (float) momentum factor (default: 0)
weight_decay   (float) weight decay, i.e. L2 penalty (default: 0)
dampening      (float) dampening for momentum (default: 0)
nesterov       (bool) enables Nesterov momentum (default: False)
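A minimal usage sketch of the parameters above. The toy model, input data, and hyperparameter values are illustrative choices, not from the original post:

```python
import torch

# Toy model: a single linear layer (illustrative only)
model = torch.nn.Linear(10, 1)

# SGD with momentum and weight decay; lr is the only required hyperparameter
optimizer = torch.optim.SGD(
    model.parameters(), lr=0.01, momentum=0.9, weight_decay=1e-4
)

# One training step on random data
x = torch.randn(4, 10)
target = torch.randn(4, 1)
loss = torch.nn.functional.mse_loss(model(x), target)

optimizer.zero_grad()  # clear gradients accumulated from previous steps
loss.backward()        # compute gradients of the loss w.r.t. parameters
optimizer.step()       # apply the SGD update to the parameters
```

Note that `nesterov=True` requires a nonzero `momentum` and zero `dampening`; the optimizer raises a `ValueError` otherwise.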

References:
https://pytorch.org/docs/stable/optim.html#torch.optim.SGD
