move out zero grad logic into separate function (#969)
Summary:
Pull Request resolved: #969
# Context
Currently it isn't possible to log gradients from AutoUnit as they are zeroed out before `on_train_step_end()` is reached.
# This Diff
Moves the gradient-zeroing logic out of `_update_weights` and into its own function, which can be overridden, e.g.
```
class MyAutoUnit(AutoUnit):
    ...

    def zero_grad(self) -> None:
        self.logger.log(self.module.grad)
        super().zero_grad()
```
to log the gradients before they are zeroed out.
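For illustration, here is a minimal sketch of that override pattern, assuming the `zero_grad` hook this diff introduces. The subclass name and the norm-printing body are hypothetical, and AutoUnit's required abstract methods (e.g. `compute_loss`) are elided:
```
from torchtnt.framework import AutoUnit


class GradLoggingAutoUnit(AutoUnit):
    ...  # compute_loss / configure_optimizers_and_lr_scheduler elided

    def zero_grad(self) -> None:
        # Gradients are still populated at this point; inspect them
        # before the parent implementation clears them.
        for name, param in self.module.named_parameters():
            if param.grad is not None:
                print(f"{name}: grad norm = {param.grad.norm().item():.4f}")
        super().zero_grad()
```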
Reviewed By: galrotem, diego-urgell
Differential Revision: D68983117
fbshipit-source-id: 744b72c5634d8b6979ef1145fc3254ddde93d743