
Allow pipelining of constant values for preproc + handle nested preprocs #2342


Closed

Conversation

@sarckk (Member) commented Aug 28, 2024

Summary:
Ran into two issues while enabling pipelining for a model:

  1. The current pipeline logic for finding and swapping a preproc module only works if the preproc module exists at the model's top level. If the preproc is nested inside a child module, e.g. `model._sparse_arch._preproc_module`, this logic breaks down: finding the module fails because it uses `getattr` on the model, and swapping the module fails because it uses `setattr` on the model. Solution:

    • Replaced `getattr` and `setattr` with `_find_preproc_module_recursive` and `_swap_preproc_module_recursive` respectively (see the first sketch after this list).
  2. The logic doesn't support an arg to a preproc module being a constant (e.g. `self.model.constant_value`), as we skip args that aren't `torch.fx.Node` values. However, we should be able to pipeline these cases. Solution:

    • Add a new field to `ArgInfo` called `objects` of type `List[Optional[object]]`. After fx tracing, constant args may come back as fx immutable collections, such as `torch.fx.immutable_dict` for an immutable `Dict`; creating a copy converts the value back to its original mutable type, so we capture that copy in `ArgInfo` (see the second sketch after this list). A potential downside is the extra memory overhead, but for this model in particular it was just a small string value.
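
Below is a minimal sketch of what the recursive find/swap could look like, assuming dotted-path lookups. The helper names come from this PR, but the signatures and bodies are illustrative, not the actual torchrec implementation:

```python
import torch


def _find_preproc_module_recursive(model: torch.nn.Module, path: str) -> torch.nn.Module:
    # Walk a dotted path such as "_sparse_arch._preproc_module" one attribute
    # at a time instead of a single getattr on the top-level model.
    module = model
    for name in path.split("."):
        module = getattr(module, name)
    return module


def _swap_preproc_module_recursive(
    model: torch.nn.Module, new_module: torch.nn.Module, path: str
) -> None:
    # setattr must target the direct parent of the preproc module,
    # not the model root, for nested paths to work.
    parent_path, _, attr_name = path.rpartition(".")
    parent = _find_preproc_module_recursive(model, parent_path) if parent_path else model
    setattr(parent, attr_name, new_module)
```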
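
And a rough sketch of capturing a constant arg in the new `objects` field, copying fx immutable collections back to their mutable equivalents. `ArgInfo` here is heavily simplified relative to the real dataclass, and `capture_constant_arg` is a made-up helper name:

```python
from dataclasses import dataclass, field
from typing import List, Optional

import torch.fx


@dataclass
class ArgInfo:
    # Existing fields elided; `objects` holds constant (non torch.fx.Node) args,
    # e.g. a small string read from `self.model.constant_value`.
    objects: List[Optional[object]] = field(default_factory=list)


def capture_constant_arg(arg: object, arg_info: ArgInfo) -> None:
    if isinstance(arg, torch.fx.Node):
        return  # traced nodes keep going through the existing pipeline path
    # fx tracing can hand back immutable collections (e.g. an immutable dict);
    # copying converts the value back to the ordinary mutable type before storing it.
    if isinstance(arg, dict):
        arg = dict(arg)
    elif isinstance(arg, (list, tuple)):
        arg = list(arg)
    arg_info.objects.append(arg)
```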

Reviewed By: xing-liu

Differential Revision: D61891459

@facebook-github-bot added the CLA Signed label on Aug 28, 2024
@facebook-github-bot (Contributor) commented

This pull request was exported from Phabricator. Differential Revision: D61891459

sarckk added a commit to sarckk/torchrec that referenced this pull request Aug 30, 2024
Summary:
Previously, if an `arg` to an embedding module was `None`, we would ignore it. However, we now also use `_get_node_args_helper` to generate arg list info for preproc modules, and sometimes `None` is passed in as an arg/kwarg.

With the changes in pytorch#2342, we can now handle constants. For backward compatibility, this adds an optional flag that tells `_get_node_args_helper` we are handling preproc modules (a rough sketch follows after this summary).

Reviewed By: xing-liu

Differential Revision: D61938346
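
A minimal sketch of the backward-compatibility flag described above; `for_preproc_module` is a hypothetical parameter name, and the real `_get_node_args_helper` has a different signature and builds `ArgInfo` objects rather than a plain list:

```python
import torch.fx


def _get_node_args_helper(args, for_preproc_module: bool = False):
    kept = []
    for arg in args:
        if isinstance(arg, torch.fx.Node):
            kept.append(arg)  # traced inputs: behavior unchanged
        elif for_preproc_module:
            # For preproc modules, keep constants (including None) so they can
            # be replayed when the module is pipelined.
            kept.append(arg)
        # else: the embedding-module path keeps the old behavior and skips non-Node args
    return kept
```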
facebook-github-bot pushed a commit that referenced this pull request Sep 5, 2024
Summary:
Pull Request resolved: #2352

Reviewed By: xing-liu

Differential Revision: D61938346

fbshipit-source-id: 1514fb0432be10b7fa46f90050ecbe370cf07a4c
Labels: CLA Signed, fb-exported