torch.autograd is PyTorch's automatic differentiation engine that powers neural network training. Let's try to understand it with an example.
x = torch.rand(1, requires_grad=True)
y = torch.rand(1)
v = x * y
w = torch.log(v)
w.backward()       # populates x.grad; backward() itself returns None
xGrad = x.grad     # dw/dx
x = tensor([0.0559], requires_grad=True)
y = tensor([0.5163])
w = tensor([-3.5450], grad_fn=<LogBackward0>)
xGrad = tensor([17.8828])
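We can sanity-check that gradient without autograd. Since w = ln(x * y), the analytic derivative is dw/dx = y / (x * y) = 1/x, and 1/0.0559 ≈ 17.89, matching the xGrad printed above (the exact value 17.8828 comes from the unrounded x). A minimal pure-Python sketch, using the rounded values from the example and a central finite difference as an independent check:

```python
import math

x, y = 0.0559, 0.5163          # rounded values from the run above

def w(x):
    return math.log(x * y)

# analytic gradient: dw/dx = d/dx ln(x*y) = 1/x
analytic = 1.0 / x

# numerical check via central difference
eps = 1e-7
numeric = (w(x + eps) - w(x - eps)) / (2 * eps)
```

Both `analytic` and `numeric` come out near 17.89, agreeing with autograd up to the rounding of x.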
1) Let's reconstruct the computation by tracing the generated graph backwards
x = tensor([0.0559], requires_grad=True)
y = tensor([0.5163])
v = 0.0559 * 0.5163 = 0.02886117
w = ln(0.02886117) = -3.5452581859176
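The backward pass walks this graph in reverse, multiplying local derivatives via the chain rule: dw/dv = 1/v for the log node, dv/dx = y for the multiply node, so dw/dx = (1/v) * y = 1/x. A hand-rolled sketch of that traversal, using the rounded values above (the small mismatch with the printed -3.5450 is just rounding of x and y):

```python
import math

x, y = 0.0559, 0.5163

# forward pass, mirroring the graph
v = x * y              # MulBackward node
w = math.log(v)        # LogBackward node

# backward pass by hand, one local derivative per node
dw_dv = 1.0 / v        # derivative of ln(v)
dv_dx = y              # derivative of x*y w.r.t. x
dw_dx = dw_dv * dv_dx  # chain rule: (1/v)*y = 1/x
```

`dw_dx` reproduces the 1/x ≈ 17.89 gradient that `x.grad` reported.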