Add LoRA dropout #2046


Draft: wants to merge 2 commits into base sd3

Conversation

rockerBOO
Contributor

LoRA Dropout as a Sparsity Regularizer for Overfitting Control
https://arxiv.org/abs/2404.09610

[Screenshots: excerpts from "LoRA Dropout as a Sparsity Regularizer for Overfitting Control" (2404.09610v1)]

Note: currently only implemented for Flux

network_args = [
    "lora_dropout=0.5"
]
--network_args lora_dropout=0.5

Drops out whole rows and columns of the LoRA down/up weight matrices
LoRA Dropout as a Sparsity Regularizer for Overfitting Control
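For reference, a minimal sketch of the idea described above (not this PR's actual implementation; the module name, shapes, and the exact masking/rescaling scheme are assumptions): entire rows/columns of the down/up factors are dropped with probability `lora_dropout`, rather than dropping individual activations.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LoRADropoutLinear(nn.Module):
    """Hypothetical sketch of row/column dropout on the LoRA factors (not the PR code)."""

    def __init__(self, in_dim, out_dim, rank=4, lora_dropout=0.5, scale=1.0):
        super().__init__()
        self.lora_down = nn.Parameter(torch.randn(rank, in_dim) * 0.01)  # (rank, in_dim)
        self.lora_up = nn.Parameter(torch.zeros(out_dim, rank))          # (out_dim, rank)
        self.p = lora_dropout
        self.scale = scale

    def _mask(self, n, device):
        # Keep each row/column with probability (1 - p); rescale to keep the expectation unchanged.
        keep = (torch.rand(n, device=device) > self.p).float()
        return keep / (1.0 - self.p)

    def forward(self, x):
        down, up = self.lora_down, self.lora_up
        if self.training and self.p > 0:
            # Zero out whole columns of lora_down (input features)
            # and whole rows of lora_up (output features).
            down = down * self._mask(down.shape[1], down.device).unsqueeze(0)
            up = up * self._mask(up.shape[0], up.device).unsqueeze(1)
        # Returns only the LoRA delta; the frozen base weight is added by the wrapping module.
        return F.linear(F.linear(x, down), up) * self.scale
```

At inference (`module.eval()`) the masks are skipped, so the trained/merged LoRA behaves as usual.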
67372a commented Jun 9, 2025

@rockerBOO For awareness: for reasons that are not clear, the authors have retracted/withdrawn their publication, per https://openreview.net/forum?id=c4498OydLP. Reading through some of the reviews, it seems the authors made flawed assumptions and significant errors.

@rockerBOO
Contributor Author

@67372a Thank you. I'll have to think about what that might mean for this PR. For now it'll probably just remain up in the draft state.
