
cm222

How does PyTorch's requires_grad work?

torch.autograd is PyTorch's automatic differentiation engine that powers neural network training. Let's try to understand it through an example.

Here is a sample where PyTorch computes the backward pass for us:

import torch

x = torch.rand(1, requires_grad=True)
y = torch.rand(1)
v = x * y
w = torch.log(v)
w.backward()    # backward() returns None; it populates .grad on leaf tensors
xGrad = x.grad  # dw/dx

Let's assume the run produced the following output and work with these numbers:

x = tensor([0.0559], requires_grad=True)
y = tensor([0.5163])
w = tensor([-3.5450], grad_fn=<LogBackward0>)
xGrad = tensor([17.8828])

This is the PyTorch-generated computational graph:

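The same graph can also be walked programmatically through each tensor's grad_fn attribute (a minimal sketch; the exact node class names, such as LogBackward0, can vary slightly between PyTorch versions):

```python
import torch

x = torch.rand(1, requires_grad=True)
y = torch.rand(1)
v = x * y
w = torch.log(v)

# Walk the autograd graph backwards from the output node
print(type(w.grad_fn).__name__)                       # LogBackward0
print(type(w.grad_fn.next_functions[0][0]).__name__)  # MulBackward0
```

Each grad_fn node knows its inputs' nodes via next_functions, which is exactly the structure backward() traverses.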

W calculation

1) Let's create the functions from the generated graph by tracing it backwards.


x = tensor([0.0559], requires_grad=True)
y = tensor([0.5163])
v = 0.0559 * 0.5163
  = 0.02886117
w = ln(0.02886117)
  =  -3.5452581859176
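These forward-pass numbers can be reproduced step by step (a sketch using the rounded tensor values assumed above):

```python
import math
import torch

x = torch.tensor([0.0559], requires_grad=True)
y = torch.tensor([0.5163])

v = x * y         # 0.0559 * 0.5163 = 0.02886117
w = torch.log(v)  # ln(0.02886117) ≈ -3.54526

# Confirm against plain-Python arithmetic
assert math.isclose(v.item(), 0.02886117, rel_tol=1e-5)
assert math.isclose(w.item(), math.log(0.02886117), rel_tol=1e-5)
```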

xGrad Calculation

This part requires some background on basic derivative rules and the chain rule.

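Since w = ln(x·y), the chain rule gives dw/dx = (dw/dv)·(dv/dx) = (1/(x·y))·y = 1/x. With x ≈ 0.0559 that is 1/0.0559 ≈ 17.889, matching the printed xGrad = 17.8828 (the small difference comes from x being displayed with only four decimal places). A minimal sketch verifying this with autograd:

```python
import math
import torch

x = torch.tensor([0.0559], requires_grad=True)
y = torch.tensor([0.5163])

w = torch.log(x * y)
w.backward()  # populates x.grad with dw/dx

# Chain rule: dw/dx = (1 / (x*y)) * y = 1/x
print(x.grad)  # ≈ 1/0.0559 ≈ 17.889
assert math.isclose(x.grad.item(), 1 / 0.0559, rel_tol=1e-4)
```

Note that y.grad stays None here because y was created without requires_grad=True, so autograd does not track it.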

References

1) https://pytorch.org/blog/computational-graphs-constructed-in-pytorch/
2) https://www.youtube.com/watch?v=c36lUUr864M
3) https://pytorch.org/blog/overview-of-pytorch-autograd-engine/
