srxdev0619 changed the title from "Compile PyTorch with float32 precision." to "Compile PyTorch Bindings with float32 precision." on Feb 27, 2022
Hi,

First up, thanks for this great work! I was just wondering if there's a way to compile tiny-cuda-nn and its PyTorch bindings to use float32 by default? Thanks!
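For anyone landing here with the same question, below is a minimal sketch of the runtime workaround, assuming the standard `tinycudann` PyTorch bindings: the network's output dtype is whatever precision the extension was built with (often `torch.half` on GPUs that support the fully fused kernels), and you can cast back to float32 in Python without recompiling. The config values are illustrative only, and whether a compile-time switch such as `TCNN_HALF_PRECISION` forces float32 throughout the build is an assumption, not something confirmed in this issue.

```python
import torch
import tinycudann as tcnn

# Illustrative config; these values are not from the original issue.
encoding_config = {
    "otype": "HashGrid",
    "n_levels": 16,
    "n_features_per_level": 2,
    "log2_hashmap_size": 19,
    "base_resolution": 16,
    "per_level_scale": 1.5,
}
network_config = {
    "otype": "FullyFusedMLP",
    "activation": "ReLU",
    "output_activation": "None",
    "n_neurons": 64,
    "n_hidden_layers": 2,
}

model = tcnn.NetworkWithInputEncoding(
    n_input_dims=3,
    n_output_dims=4,
    encoding_config=encoding_config,
    network_config=network_config,
)

x = torch.rand(1024, 3, device="cuda")
y = model(x)          # dtype depends on how the extension was compiled (often torch.half)
y_fp32 = y.float()    # runtime cast to float32; no recompilation needed
print(y.dtype, y_fp32.dtype)
```

Casting at the Python level only changes the tensor you get back; the internal computation still runs in whatever precision the extension was compiled with.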