Reduce unnecessary reliance and usage of numpy #746
Comments
Excellent find, @aobo-y! Thank you for bringing up this good proposal!
Thank you for the heads-up, @bilalsal!
@aobo-y can we close this issue since it has been addressed ? |
@NarineK sure, I think we have cleared almost all of the unnecessary cases. However, there are still some tricky ones left. For example,
captum/captum/attr/_core/gradient_shap.py Lines 359 to 363 in 4faf1ea
is a perfect scenario for directly applying torch.rand(...). But the change will require quite some extra effort to fix many tests, because even with a fixed seed, torch's random generator gives different values from numpy's. As you can imagine, it needs updates to the expected values and relaxation of many "almost-equal" asserts. I will create separate issues dedicated to them.
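For context, here is a minimal sketch of the kind of replacement being discussed. The referenced gradient_shap.py lines are not reproduced here; the function names below are hypothetical, and the sketch only assumes the original code draws uniform random values with numpy and wraps them in a tensor.

```python
import numpy as np
import torch

# Hypothetical numpy-based version: draw uniform random coefficients with
# numpy and convert them to a torch tensor.
def rand_coefficient_np(n_samples: int) -> torch.Tensor:
    return torch.tensor(np.random.uniform(0.0, 1.0, n_samples), dtype=torch.float32)

# The same logic expressed directly in torch, removing the numpy dependency.
def rand_coefficient_torch(n_samples: int) -> torch.Tensor:
    return torch.rand(n_samples)

# Even with the same seed, numpy and torch use different RNG streams,
# so the generated values differ and exact-match test expectations break.
np.random.seed(0)
torch.manual_seed(0)
print(rand_coefficient_np(3))     # e.g. tensor([0.5488, 0.7152, 0.6028])
print(rand_coefficient_torch(3))  # e.g. tensor([0.4963, 0.7682, 0.0885])

# Tests comparing against stored expected values would then need looser
# tolerances, e.g. torch.testing.assert_close(actual, expected, rtol=1e-3, atol=1e-3)
```

This is why swapping the RNG is cheap in the library code but costly in the test suite: every hard-coded expectation tied to numpy's random stream has to be regenerated or its assertion relaxed.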
This search gives us all the usages of numpy in Captum: https://github.com/pytorch/captum/search?p=1&q=numpy
Many of them are not necessary. In order to reduce this unneeded dependency, we can investigate the current usages and remove or replace them where possible (see the sketch below).
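To make "remove or replace them where possible" concrete, here is a short illustrative sketch. These snippets are hypothetical and not taken from Captum's code; they only show common numpy patterns that have direct torch counterparts and therefore avoid tensor-to-ndarray round trips.

```python
import numpy as np
import torch

x = torch.arange(6, dtype=torch.float32).reshape(2, 3)

# Reduction via a numpy round trip vs. staying in torch.
total_np = np.sum(x.numpy())      # tensor -> ndarray -> numpy scalar
total_torch = x.sum()             # stays a torch tensor

# Random numbers drawn with numpy and wrapped in a tensor vs. torch-native.
noise_np = torch.tensor(np.random.randn(2, 3), dtype=torch.float32)
noise_torch = torch.randn(2, 3)   # same distribution, no numpy involved

# Elementwise helpers with direct torch equivalents.
clipped_np = np.clip(x.numpy(), 0.0, 1.0)
clipped_torch = torch.clamp(x, 0.0, 1.0)

print(total_torch, noise_torch.shape, clipped_torch)
```

Usages like these can usually be replaced mechanically; the trickier cases are the random-number ones, where expected test values depend on the numpy RNG stream (see the discussion above).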