Jupyter notebook warning when changing .grad using `with torch.no_grad():`

```
C:\Anaconda3\lib\site-packages\ipykernel_launcher.py:3: UserWarning: The .grad attribute of a Tensor that is not a leaf Tensor is being accessed. Its .grad attribute won't be populated during autograd.backward(). If you indeed want the gradient for a non-leaf Tensor, use .retain_grad() on the non-leaf Tensor. If you access the non-leaf Tensor by mistake, make sure you access the leaf Tensor instead. See github.com/pytorch/pytorch/pull/30531 for more informations.
  This is separate from the ipykernel package so we can avoid doing imports until
```

If you're using no_grad(), you're explicitly telling PyTorch that you won't need the gradients. Seeing this, PyTorch doesn't compute them, saving memory and computation time. This is most often used during evaluation, where only the result matters.
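A minimal sketch of that behavior: any tensor produced inside the `no_grad()` block is detached from the autograd graph, so no gradients can flow through it.

```python
import torch

x = torch.ones(3, requires_grad=True)  # leaf tensor tracked by autograd

with torch.no_grad():
    y = x * 2  # no graph is recorded for this operation

# y is not connected to autograd at all
print(y.requires_grad)  # False
```

Calling `y.backward()` here would raise an error, since `y` has no gradient history.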

If you really wish to use the .grad attribute (and honestly I have no idea why you would need to modify it in the first place, since that's the optimizer's job), you need to remove the no_grad() block.
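As the warning itself suggests, .grad is only populated on leaf tensors by default; for an intermediate (non-leaf) tensor you must call .retain_grad() before backward(). A short sketch outside any no_grad() block:

```python
import torch

x = torch.ones(3, requires_grad=True)  # leaf tensor: .grad populated automatically
y = x * 2                              # non-leaf tensor: .grad is None by default
y.retain_grad()                        # ask autograd to also keep y's gradient

z = y.sum()
z.backward()

print(x.grad)  # tensor([2., 2., 2.])  -- dz/dx = 2
print(y.grad)  # tensor([1., 1., 1.])  -- dz/dy = 1
```

Without the .retain_grad() call, accessing `y.grad` triggers exactly the UserWarning quoted above.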