
add automatic mixed precision #20

Open · wants to merge 1 commit into master
Conversation

mengdong

Automatic Mixed Precision for TensorFlow was recently introduced:
https://medium.com/tensorflow/automatic-mixed-precision-in-tensorflow-for-faster-ai-training-on-nvidia-gpus-6033234b2540

This PR adds automatic mixed precision to self-attention GAN training. We tested the speed and convergence impact on V100 and T4 GPUs: training is about 45% faster, with no degradation of d_loss or g_loss. The generated images also look similar to those from the original FP32-trained model.
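The commit diff is not shown here, but as a minimal sketch of how AMP is typically enabled in TF 1.14+ (an assumption about the TF version; the toy model below is a placeholder, not the SAGAN code), the optimizer is wrapped with the mixed-precision graph rewrite, which casts eligible ops to float16 and adds automatic loss scaling:

```python
# Minimal sketch (assumed TF 1.14+); the two-layer model here is a stand-in
# for the actual generator/discriminator graphs in this repo.
import numpy as np
import tensorflow as tf

x = tf.placeholder(tf.float32, [None, 64])
y = tf.placeholder(tf.float32, [None, 1])

hidden = tf.layers.dense(x, 128, activation=tf.nn.relu)
pred = tf.layers.dense(hidden, 1)
loss = tf.reduce_mean(tf.square(pred - y))

opt = tf.train.AdamOptimizer(1e-3)
# The one-line AMP change: wrap the optimizer so the graph rewrite runs
# eligible ops in float16 with automatic loss scaling. Alternatively, set
# the environment variable TF_ENABLE_AUTO_MIXED_PRECISION=1 before training.
opt = tf.train.experimental.enable_mixed_precision_graph_rewrite(opt)
train_op = opt.minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    batch_x = np.random.randn(32, 64).astype(np.float32)
    batch_y = np.random.randn(32, 1).astype(np.float32)
    sess.run(train_op, feed_dict={x: batch_x, y: batch_y})
```

Tensor Core speedups from AMP require a GPU with compute capability 7.0 or higher, which is why the numbers above were gathered on V100 and T4.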

mengdong (Author) commented Oct 1, 2019

@DoctorTeeth could you please help review this? Thanks
