Add work-in-progress for visualizing gradients tutorial (issue #3186) #3389
Conversation
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/tutorials/3389
Note: Links to docs will display an error until the docs builds have been completed.
✅ No Failures as of commit 0b9f56a with merge base 06f9c4b.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
Generally seems to be headed in the right direction in terms of tone and organization from my perspective.
Thanks for working on this tutorial. Overall, though, I'd say this section (prior to the actual visualizing gradients part) can be much shorter.
By the end of this tutorial, you will be able to:
- Differentiate between leaf and non-leaf tensors (have a diagram from https://github.com/szagoruyko/pytorchviz, point to the leaves)
- Know when to use `retain_grad` vs. `requires_grad` ("use requires_grad for leaf, use retain_grad for non-leaf")
Still a work in progress, but I significantly reduced the first section and added some helpful images for the computational graph. I also added links for most terms. I still have to debug the WIP section with ResNet; I'm not sure my method for retaining the intermediate gradients is valid. See the discussion on the pull request.
Thank you for the comments, they were really helpful. Let me know if you think the first section is still too long.

Concerning the "visualizing gradients" section with an actual example, I'm not sure if I'm going about retaining the gradients for intermediate tensors correctly. My thought process was to use a forward hook and call `retain_grad()` on each intermediate output. Initially I tried using a backward pass hook instead, but it failed with:

RuntimeError: Output 0 of BackwardHookFunctionBackward is a view and is being modified inplace. This view was created inside a custom Function (or because an input was returned as-is) and the autograd logic to handle view+inplace would override the custom backward associated with the custom Function, leading to incorrect gradients. This behavior is forbidden. You can fix this by cloning the output of the custom Function.

I know that I can plot the gradients for the parameters by just looping through the model's parameters after the backward pass. If anyone sees a problem with my method, let me know. The current state of the code isn't doing what I expected, so I still have to debug it.
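
For reference, here is a minimal sketch of the forward-hook approach described above; the hook helper, the choice to filter on `Conv2d` modules, and the use of `resnet18` are illustrative assumptions, not code from the PR:

```python
import torch
import torchvision.models as models

model = models.resnet18(weights=None)
x = torch.randn(1, 3, 224, 224)

# Hypothetical helper: a forward hook that retains the gradient of each
# intermediate output so .grad is populated for these non-leaf tensors.
intermediates = {}

def save_output_hook(name):
    def hook(module, args, output):
        output.retain_grad()          # keep .grad on this non-leaf tensor
        intermediates[name] = output  # stash the tensor to read .grad later
    return hook

handles = [
    module.register_forward_hook(save_output_hook(name))
    for name, module in model.named_modules()
    if isinstance(module, torch.nn.Conv2d)  # e.g. only conv outputs
]

loss = model(x).sum()
loss.backward()

# Intermediate (non-leaf) gradients, available via retain_grad():
for name, out in intermediates.items():
    print(name, out.grad.norm().item())

# Parameter (leaf) gradients, available from a plain loop:
for name, p in model.named_parameters():
    if p.grad is not None:
        print(name, p.grad.norm().item())

for h in handles:
    h.remove()
```

The contrast at the end of the sketch mirrors the point above: parameter (leaf) gradients come from a plain loop over the parameters, while intermediate (non-leaf) gradients need `retain_grad()`.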
Fixes #3186
Description
Add an initial draft for the visualizing gradients tutorial.
This write-up starts by discussing the difference between leaf and non-leaf tensors and the associated `requires_grad` and `retains_grad` class attributes. It will then go through a real-world example of visualizing gradients by using `retains_grad` in a more complicated neural network like ResNet (this part is a work-in-progress). I put the tutorial in the advanced_source directory, but perhaps it would be better sorted as an intermediate tutorial or a recipe. Open to suggestions.
What's written so far is how I imagined structuring the tutorial. If you have any comments about the overall flow or material, let me know. Feel free to comment on the wording and tone as well; just know that I plan on revising the tutorial, as this is just the first go at it.
Checklist