
[WIP] Add Latent Consistency Distillation Script #5517

Closed

Conversation

@dg845 (Contributor) commented Oct 25, 2023

What does this PR do?

This PR adds an example script to perform latent consistency distillation following Algorithm 1 in the latent consistency models paper. Latent consistency distillation (LCD) distills a latent diffusion model (LDM) such as Stable Diffusion into a latent consistency model (LCM), allowing fast one-step or few-step inference with the LCM while preserving the performance of the LDM.
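For context, here is a minimal, self-contained sketch of what one training step of Algorithm 1 looks like. Everything in it is a placeholder assumption made so the snippet runs standalone: the tiny MLPs standing in for the teacher/student UNets, the toy alpha schedule, the c_skip boundary scaling, and all hyperparameters. The actual script operates on Stable Diffusion's UNet over VAE latents; this only illustrates the structure of the loss (the teacher takes a CFG-augmented DDIM step from t_{n+k} to t_n, and the online student at t_{n+k} is matched to an EMA target network at t_n).

```python
# Illustrative single step of latent consistency distillation (Algorithm 1).
# NOT the PR's script: the tiny MLP "UNets", alpha schedule, boundary
# scaling, and hyperparameters are all placeholder assumptions.
import copy

import torch
import torch.nn.functional as F

T = 1000   # number of discretized timesteps
k = 20     # skipping interval between t_{n+k} and t_n
w = 7.5    # guidance scale baked into the distillation target

# Placeholder epsilon-prediction nets; real LCD uses the LDM's UNet.
student = torch.nn.Sequential(
    torch.nn.Linear(5, 64), torch.nn.SiLU(), torch.nn.Linear(64, 4)
)
teacher = copy.deepcopy(student).requires_grad_(False)  # frozen pretrained LDM
target = copy.deepcopy(student).requires_grad_(False)   # EMA copy of the student

# Toy cosine alpha-bar schedule, clamped for numerical sanity.
alpha_bar = (torch.cos(torch.linspace(0, 1, T) * torch.pi / 2) ** 2).clamp(min=1e-4)

def eps_pred(net, z, t):
    # Condition on t via a normalized timestep feature (stand-in for time embeddings).
    return net(torch.cat([z, t.float()[:, None] / T], dim=1))

def consistency_f(net, z, t):
    # f(z, t): predicted clean latent with a simple c_skip boundary scaling.
    a = alpha_bar[t][:, None]
    z0 = (z - (1 - a).sqrt() * eps_pred(net, z, t)) / a.sqrt()
    c_skip = 1.0 / (1.0 + t.float()[:, None])  # placeholder boundary condition
    return c_skip * z + (1.0 - c_skip) * z0

def ddim_step(eps, z, t_from, t_to):
    # One deterministic DDIM (PF-ODE solver) step from t_from down to t_to.
    a_from, a_to = alpha_bar[t_from][:, None], alpha_bar[t_to][:, None]
    z0 = (z - (1 - a_from).sqrt() * eps) / a_from.sqrt()
    return a_to.sqrt() * z0 + (1 - a_to).sqrt() * eps

opt = torch.optim.AdamW(student.parameters(), lr=1e-4)
z = torch.randn(8, 4)  # stand-in for a batch of VAE latents
n = torch.randint(0, T - k, (8,))
a_nk = alpha_bar[n + k][:, None]
z_noisy = a_nk.sqrt() * z + (1 - a_nk).sqrt() * torch.randn_like(z)  # diffuse to t_{n+k}

with torch.no_grad():
    # Teacher takes a CFG-augmented solver step; a real run would evaluate the
    # UNet on the prompt and on a null prompt (here one toy net plays both roles).
    eps_c = eps_pred(teacher, z_noisy, n + k)
    eps_u = eps_pred(teacher, z_noisy, n + k)
    z_prev = ddim_step(eps_u + w * (eps_c - eps_u), z_noisy, n + k, n)
    target_out = consistency_f(target, z_prev, n)

# Enforce self-consistency: student at t_{n+k} matches the EMA target at t_n.
loss = F.huber_loss(consistency_f(student, z_noisy, n + k), target_out)
loss.backward()
opt.step()
opt.zero_grad()

with torch.no_grad():  # EMA update of the target network toward the student
    for p_t, p_s in zip(target.parameters(), student.parameters()):
        p_t.lerp_(p_s, 0.05)
```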

Follow-up to PR #5448.

Before submitting

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

@patrickvonplaten
@sayakpaul
@luosiallen

@Amin456789 commented Oct 25, 2023

Nice! I hope it supports safetensors, so we can convert the models to fp16 like regular models for a smaller size and better RAM and CPU usage.
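For reference, this is the kind of conversion meant here: a minimal sketch that reloads a pipeline in fp16 and re-saves it with safetensors. Both model paths are hypothetical placeholders, not outputs of this PR.

```python
# Minimal sketch: reload a pipeline in fp16 and re-save it as safetensors.
# Both paths below are hypothetical placeholders.
import torch
from diffusers import DiffusionPipeline

pipe = DiffusionPipeline.from_pretrained(
    "path/to/distilled-lcm",    # hypothetical output directory of the script
    torch_dtype=torch.float16,  # cast weights to fp16, roughly halving disk size
)
pipe.save_pretrained("path/to/distilled-lcm-fp16", safe_serialization=True)
```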

@DN6 (Collaborator) commented Oct 26, 2023

@dg845 Could you please run make style && make quality on the PR so the code QC checks pass?
Is it possible to run this script in a Colab notebook to verify that training is working as expected? If so, could you please include a link to the Colab as well?

@dg845 (Contributor, Author) commented Oct 26, 2023

Hi @DN6, the script is not finished yet. I am currently working with @luosiallen and @patrickvonplaten on this.

@dg845 (Contributor, Author) commented Nov 14, 2023

Closing because PR #5727 has been merged.

@dg845 closed this Nov 14, 2023