If I change `requires_grad` to `True` or `False` midway through training, after a few epochs, do I need to reinitialize the optimizer or the loss function?
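To make the scenario concrete, here is a minimal sketch of what I mean. The model, data, and SGD setup are just illustrative; the point is that the same optimizer and loss objects are kept after flipping `requires_grad` on part of the model:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy model and training setup (names and sizes are arbitrary)
model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 1))
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()

x, y = torch.randn(16, 4), torch.randn(16, 1)

# A few normal training steps
for _ in range(3):
    opt.zero_grad(set_to_none=True)
    loss_fn(model(x), y).backward()
    opt.step()

# Midway: freeze the first layer, keeping the SAME optimizer
# and loss function without reinitializing either
for p in model[0].parameters():
    p.requires_grad = False

opt.zero_grad(set_to_none=True)
loss_fn(model(x), y).backward()
opt.step()

# The frozen layer receives no new gradients after backward(),
# while the still-trainable layer does
print(model[0].weight.grad)  # stays None
print(model[2].weight.grad is not None)
```

This runs without errors as written; my question is whether silently reusing the optimizer and loss like this is actually correct, or whether they should be rebuilt after the `requires_grad` change.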