This repository was archived by the owner on Feb 7, 2025. It is now read-only.
In the latent inferers, if the latent tensor's shape is already equal to the target padding shape (because the user populated the ldm_latent_shape and vae_latent_shape fields), the code fails: the latent tensor carries a track_meta flag, so calling the resizer triggers an attempt to compute the inverse transform, which raises an error.
A check before padding that verifies padding is actually necessary (ldm_latent_shape != latent.shape) would overcome this problem.
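The proposed guard can be sketched as follows. This is a minimal illustration, not the inferer's actual code: `FakeLatent` is a stand-in for a MetaTensor-like latent that only carries a shape, and `resizer` is a hypothetical callable standing in for the SpatialPad/Resize step; the point is simply that the resizer is skipped when the shapes already match, so the failing inverse-transform bookkeeping is never reached.

```python
class FakeLatent:
    """Stand-in for a MetaTensor-like latent; only carries a shape."""

    def __init__(self, shape):
        self.shape = tuple(shape)


def resize_latent(latent, target_shape, resizer):
    """Apply `resizer` only when padding is actually needed.

    Mirrors the fix proposed above: if the target shape equals the
    latent's shape, return the latent unchanged instead of invoking
    the resizer (which would otherwise try, and fail, to compute an
    inverse transform on a shape-preserving pad).
    """
    if tuple(target_shape) == tuple(latent.shape):
        return latent  # shapes match: nothing to pad, skip the resizer
    return resizer(latent, target_shape)
```

With this guard in place, a resizer is only ever invoked for shapes that genuinely differ, so the equal-shape case can no longer trigger the error.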