Motion Model / Adapter versatility #8301
Conversation
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
- allow a different number of layers per block
- allow a different number of transformer layers per block
- allow a different number of motion attention heads per block
- use the dropout argument in get_down/up_block in 3D blocks
c3525f9 to 4678f9c
The suggested changes have been made, thanks.
Could we add a fast test for the model here? I think we should test creating an asymmetric UNetMotionModel. I'd just like to confirm that the updated parameters aren't breaking anything.
Of course, a forward test for the most asymmetric UNetMotionModel possible has been added 😃
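For reference, a rough sketch of what such a forward test could look like (this is not the exact test added in the PR; the tiny configuration values below are illustrative and assume the per-block tuple arguments described in this PR):

```python
import torch
from diffusers import UNetMotionModel


def test_asymmetric_unet_motion_model_forward():
    # Intentionally asymmetric configuration: per-block arguments use a
    # different value for each block.
    model = UNetMotionModel(
        sample_size=16,
        in_channels=4,
        out_channels=4,
        block_out_channels=(32, 64),
        down_block_types=("CrossAttnDownBlockMotion", "DownBlockMotion"),
        up_block_types=("UpBlockMotion", "CrossAttnUpBlockMotion"),
        cross_attention_dim=32,
        layers_per_block=(1, 2),        # different number of layers per block
        num_attention_heads=(4, 8),     # different number of attention heads per block
    )
    model.eval()

    batch, frames = 1, 4
    sample = torch.randn(batch, 4, frames, 16, 16)      # (batch, channels, frames, height, width)
    timestep = torch.tensor([10])
    encoder_hidden_states = torch.randn(batch, 4, 32)   # (batch, seq_len, cross_attention_dim)

    with torch.no_grad():
        output = model(sample, timestep, encoder_hidden_states).sample

    # The UNet should return a sample with the same shape as its input.
    assert output.shape == sample.shape
```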
Are some of the tests failing because of my commits?
Failing tests are unrelated @Arlaz. Merging this. |
* Motion Model / Adapter versatility
  - allow a different number of layers per block
  - allow a different number of transformer layers per block
  - allow a different number of motion attention heads per block
  - use the dropout argument in get_down/up_block in 3D blocks
* Motion Model added arguments renamed & refactoring
* Add test for asymmetric UNetMotionModel
In the MotionModel and MotionAdapter for the AnimateDiff pipeline:
These modifications are needed for some custom-trained models that follow the SDXL architecture more closely (fewer down blocks) and use more transformer blocks per layer.
New arguments / tuples instead of ints can now be used, as in the following example:
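A minimal usage sketch (the argument names and values below are illustrative and assume the tuple-accepting signatures this PR describes; the merged API may differ slightly):

```python
from diffusers import MotionAdapter, UNetMotionModel

# Per-block tuples instead of a single int, e.g. for an SDXL-like layout
# with fewer down blocks and varying per-block settings.
adapter = MotionAdapter(
    block_out_channels=(320, 640, 1280),
    motion_layers_per_block=(1, 2, 2),       # different number of motion layers per block
    motion_num_attention_heads=(4, 8, 8),    # different number of motion attention heads per block
)

unet = UNetMotionModel(
    block_out_channels=(320, 640, 1280),
    down_block_types=("CrossAttnDownBlockMotion", "CrossAttnDownBlockMotion", "DownBlockMotion"),
    up_block_types=("UpBlockMotion", "CrossAttnUpBlockMotion", "CrossAttnUpBlockMotion"),
    layers_per_block=(1, 2, 2),              # different number of layers per block
    transformer_layers_per_block=(1, 2, 4),  # different number of transformer layers per block
    num_attention_heads=(5, 10, 20),
    cross_attention_dim=1280,
)
```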
Important: this PR does not break any existing code.
Who can review?
@sayakpaul @yiyixuxu @DN6