
Add NoGradGuard to libtorch execution #263


Merged · 1 commit into master · Jan 20, 2020

Conversation

@lantiga (Contributor) commented Jan 20, 2020

Since we cannot be sure that users have set requires_grad to False on all of their model parameters, we instantiate a torch::NoGradGuard to ensure that no gradients are tracked during execution, for efficiency.

@lantiga merged commit 9e09de7 into master on Jan 20, 2020
lantiga added a commit that referenced this pull request on May 6, 2020:

Add NoGradGuard to libtorch execution (#263)

@gkorland deleted the nograd-guard branch on October 6, 2020