I implemented the following Jacobian function in PyTorch. Unless I've made a mistake, it computes the Jacobian of any tensor w.r.t. any-dimensional input:

import torch
import torch.autograd as ag

def nd_range(stop, dims=None):
    # Yield every index tuple of an n-dimensional shape.
    if dims is None:
        dims = len(stop)
    if not dims:
        yield ()
        return
    for outer in nd_range(stop, dims - 1):
        for inner in range(stop[dims - 1]):
            yield outer + (inner,)

def full_jacobian(f, wrt):
    # Differentiate each scalar entry of f w.r.t. wrt, then reshape the
    # stacked gradients to f.shape + wrt.shape.
    f_shape = list(f.size())
    wrt_shape = list(wrt.size())
    fs = []
    for f_ind in nd_range(f_shape):
        grad = ag.grad(f[tuple(f_ind)], wrt, retain_graph=True, create_graph=True)[0]
        for i in range(len(f_shape)):
            grad = grad.unsqueeze(0)
        fs.append(grad)
    fj = torch.cat(fs, dim=0)
    fj = fj.view(f_shape + wrt_shape)
    return fj
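As a cross-check of full_jacobian above, recent PyTorch versions ship torch.autograd.functional.jacobian (available since 1.5 — an assumption about the reader's version), which should produce the same result for a simple outer-product test:

```python
import torch
from torch.autograd.functional import jacobian

# Jacobian of the outer product op[i, j] = s[i] * s[j]:
# d op[i, j] / d s[k] = delta_ik * s[j] + delta_jk * s[i]
s = torch.tensor([1.0, 2.0, 3.0])
jac = jacobian(lambda x: torch.ger(x, x), s)  # shape (3, 3, 3)
```

Unlike the hand-rolled loop, functional.jacobian does not require the input to have requires_grad set; it handles that internally.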
Beyond that, I also tried to implement a recursive function to compute nth-order derivatives.
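The original snippet was lost in extraction; here is a sketch of what such a recursion might look like. The name nth_derivative, its signature, and the compact full_jacobian stand-in inside are assumptions, not the author's code:

```python
import torch
import torch.autograd as ag

def full_jacobian(f, wrt):
    # Compact stand-in (assumption) for the Jacobian routine above:
    # gradient of every scalar entry of f, reshaped to f.shape + wrt.shape.
    grads = [ag.grad(fi, wrt, retain_graph=True, create_graph=True)[0]
             for fi in f.reshape(-1)]
    return torch.stack(grads).view(list(f.size()) + list(wrt.size()))

def nth_derivative(f, wrt, n):
    # Hypothetical recursion: the nth derivative is the Jacobian of the
    # (n-1)th derivative.
    if n == 0:
        return f
    return full_jacobian(nth_derivative(f, wrt, n - 1), wrt)

s = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
op = torch.ger(s, s)                # outer product, op[i, j] = s[i] * s[j]
hessian = nth_derivative(op, s, 2)  # shape (3, 3, 3, 3)
```

For n = 2 this reproduces the Hessian of each entry: hessian[i, j, k, l] = delta_ik * delta_jl + delta_jk * delta_il.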
I ran a simple test:

op = torch.ger(s, s)
deep_deriv = nth_derivative(op, s, 5)

Unfortunately, this successfully gets me the Hessian... but none of the higher-order derivatives. I'm aware that many of the higher-order derivatives should be 0, but I'd prefer if PyTorch could compute that analytically.
One fix is to change the gradient computation to:

try:
    grad = ag.grad(f[tuple(f_ind)], wrt, retain_graph=True, create_graph=True)[0]
except RuntimeError:
    # The graph ends here: the entry is constant w.r.t. wrt,
    # so its derivative is identically zero.
    grad = torch.zeros_like(wrt)
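Rather than catching the exception, one alternative (a sketch; grad_or_zero is a hypothetical helper, not a PyTorch API) is to test requires_grad up front. An output that no longer requires grad is exactly the "derivative is identically zero" case, and checking it explicitly avoids masking unrelated RuntimeErrors:

```python
import torch
import torch.autograd as ag

def grad_or_zero(out, wrt):
    # Hypothetical helper: if `out` is constant w.r.t. the graph
    # (requires_grad is False), its derivative is identically zero,
    # so return zeros instead of letting ag.grad raise.
    if not out.requires_grad:
        return torch.zeros_like(wrt)
    g = ag.grad(out, wrt, retain_graph=True, create_graph=True,
                allow_unused=True)[0]
    return torch.zeros_like(wrt) if g is None else g

s = torch.tensor([1.0, 2.0], requires_grad=True)
f = (s ** 3).sum()
d1 = grad_or_zero(f, s)           # 3*s^2
d2 = grad_or_zero(d1.sum(), s)    # 6*s
d3 = grad_or_zero(d2.sum(), s)    # constant 6
d4 = grad_or_zero(d3.sum(), s)    # graph has ended: zeros
```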
Is this the accepted, correct way to handle it? Or is there a better option? Or is my approach to the problem entirely wrong to begin with?