Protect get_default_device for torch<2.3 #38376


Merged: 3 commits merged into main on May 26, 2025

Conversation

@Cyrilvallez (Member) commented on May 26, 2025

What does this PR do?

As per the title! This was reported in #38329.
Interestingly, set_default_device has been around for a long time, but get_default_device has not!
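For context, the kind of guard this implies can be sketched as follows. This is a minimal illustration under the assumption (which the PR title implies) that torch.get_default_device only exists from torch 2.3 onward; the helper names parse_torch_version and supports_get_default_device are hypothetical, not the PR's actual code.

```python
# Illustrative sketch (NOT the PR's actual diff): gate use of
# torch.get_default_device, assumed to exist only from torch 2.3,
# behind a lightweight version check.

def parse_torch_version(v: str) -> tuple:
    """Parse a torch version string like '2.2.2+cu121' into (2, 2, 2),
    dropping local build suffixes and non-numeric segments."""
    base = v.split("+")[0]
    return tuple(int(p) for p in base.split(".") if p.isdigit())

def supports_get_default_device(torch_version: str) -> bool:
    """True if this torch version is assumed to ship torch.get_default_device."""
    return parse_torch_version(torch_version) >= (2, 3)

# In modeling code, one would then branch roughly like:
#
#     if supports_get_default_device(torch.__version__):
#         device = torch.get_default_device()
#     else:
#         device = torch.empty(0).device  # fallback: infer from a dummy tensor
```

On torch 2.2.2 such a guard takes the fallback branch, so older installs keep working instead of raising an AttributeError.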


@ydshieh (Collaborator) commented on May 26, 2025

Need to push an empty commit. CircleCI has an issue that I have to fix.

@Cyrilvallez Cyrilvallez merged commit b5b76b5 into main May 26, 2025
19 of 21 checks passed
@Cyrilvallez Cyrilvallez deleted the fix-device-version branch May 26, 2025 13:00
@Cyrilvallez Cyrilvallez added the for patch label on May 27, 2025
ArthurZucker pushed a commit that referenced this pull request May 27, 2025
* Update modeling_utils.py

* CIs
@peter-crist commented
I'm on torch 2.2.2. To update transformers from 4.42.4 to a release that includes this fix, I would also have to update to torch 2.6.0, due to this PR addressing the weights_only CVE:
#37785

It seems I can't upgrade transformers unless I also upgrade torch (and since 2.6 is a breaking change, it would take more time to validate than a simple upgrade). And even once I do upgrade, I won't actually need this fix. Forgive me if I'm off the mark here.

@ydshieh (Collaborator) commented on Jun 6, 2025

I'm sorry, but if you don't need this fix (i.e. this PR), what's the motivation to update transformers? I must be misunderstanding something.

@Cyrilvallez (Member, Author) commented
Hey @peter-crist! There is no hard requirement on torch 2.6 unless you try to use torch.load, so the fix is still valid, as most model checkpoints are safetensors (which, as the name indicates, are intrinsically safe). You can still use torch 2.2 with safetensors checkpoints thanks to this fix.

redmoe-moutain pushed a commit to redmoe-moutain/transformers that referenced this pull request Jun 10, 2025