Describe the bug
When I use the FluxTransformer2DModel.from_single_file method, the provided token appears not to be used, so I am unable to download files that require authentication, such as the official Flux model files.
However, I can confirm that the token is valid because when using FluxPipeline.from_single_file to load the official Flux model, all files can be downloaded successfully without any issues.
Reproduction
from diffusers import FluxTransformer2DModel
FluxTransformer2DModel.from_single_file("A Local File Path", token=...)
This does not work and raises the error shown in the logs below.
from diffusers import FluxTransformer2DModel
FluxTransformer2DModel.from_single_file("A HF URL", token=...)
This works.
So I think the problem only occurs when a local file path is passed to this method; when a HF URL is passed, it works.
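(A possible interim workaround, which I have not verified against the loader internals: since the error message below suggests `huggingface-cli login`, logging in globally so that huggingface_hub's cached token is used for any Hub request the loader makes internally might avoid the failure.)
# Minimal sketch of the assumed workaround; token=... is a placeholder as above.
from huggingface_hub import login
from diffusers import FluxTransformer2DModel

login(token=...)
FluxTransformer2DModel.from_single_file("A Local File Path")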
from diffusers import FluxPipeline
FluxPipeline.from_single_file("black-forest-labs/FLUX.1-dev", token=...)
This also works.
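Another workaround sketch (an assumption on my part, relying on the documented config and subfolder arguments of from_single_file; I have not verified that it sidesteps this bug): download the transformer config with an authenticated call first and point from_single_file at the local copy, so the loader does not need to resolve transformer/config.json from the Hub itself.
# Sketch only: pre-fetch the config with the token, then load from local files.
from huggingface_hub import snapshot_download
from diffusers import FluxTransformer2DModel

local_repo = snapshot_download(
    "black-forest-labs/FLUX.1-dev",
    allow_patterns=["transformer/config.json"],
    token=...,  # placeholder, same token as above
)
model = FluxTransformer2DModel.from_single_file(
    "A Local File Path",   # placeholder checkpoint path from above
    config=local_repo,     # local snapshot containing transformer/config.json
    subfolder="transformer",
)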
Logs
...
HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/black-forest-labs/FLUX.1-dev/resolve/main/transformer/config.json
...
OSError: black-forest-labs/FLUX.1-dev is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models'
If this is a private repository, make sure to pass a token having permission to this repo with `token` or log in with `huggingface-cli login`.
System Info
- 🤗 Diffusers version: 0.31.0
- Platform: Linux-6.6.56+-x86_64-with-glibc2.35
- Running on Google Colab?: No
- Python version: 3.10.14
- PyTorch version (GPU?): 2.4.0 (True)
- Flax version (CPU?/GPU?/TPU?): 0.8.4 (gpu)
- Jax version: 0.4.26
- JaxLib version: 0.4.26.dev20240620
- Huggingface_hub version: 0.25.1
- Transformers version: 4.45.1
- Accelerate version: 0.34.2
- PEFT version: 0.13.2
- Bitsandbytes version: not installed
- Safetensors version: 0.4.5
- xFormers version: not installed
- Accelerator: Tesla T4, 15360 MiB
- Tesla T4, 15360 MiB
- Using GPU in script?: No
- Using distributed or parallel set-up in script?: No