ReLU missing in implementation of LayerGradCam.attribute() #179
I was looking at the code in grad_cam.py for LayerGradCam.attribute() and noticed that there is no ReLU operation applied after computing summed_grads * layer_eval here: https://github.com/pytorch/captum/blob/master/captum/attr/_core/layer/grad_cam.py#L177

Comments
@vipinpillai, we did this on purpose in order to be more flexible and show negative attributions as well. See the doc: https://github.com/pytorch/captum/blob/master/captum/attr/_core/layer/grad_cam.py#L47

Thanks @NarineK. I got confused because the return variable is named non_neg_scaled_act. I think it might be confusing for anyone expecting a GradCAM visualization from this API without explicitly applying ReLU. Wouldn't it be better to have an argument specifying whether the caller wants the ReLU or non-ReLU version of the attribution?

Yeah, we could add the flag. cc: @vivekmig

Hi @vipinpillai, thanks for the feedback, adding the argument here: #181.

Thanks @vivekmig

Closing this since it was fixed by #181.
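
For readers landing here later, below is a minimal sketch of the two behaviours discussed in this thread. It assumes the relu_attributions argument added in #181 (check your Captum version for the exact name); on versions without it, applying torch's ReLU to the returned attributions yields the classic Grad-CAM heatmap. The model, layer, and target index are placeholders, not anything from the issue itself.

```python
import torch
import torch.nn.functional as F
from torchvision.models import resnet18
from captum.attr import LayerGradCam

# Placeholder model/layer: any CNN and conv layer work the same way.
model = resnet18().eval()
layer_gc = LayerGradCam(model, model.layer4)

inp = torch.randn(1, 3, 224, 224)   # dummy input batch
target = 281                        # arbitrary class index

# Default behaviour discussed in this issue: signed attributions,
# negative values are kept so they can be inspected/visualized.
attr_signed = layer_gc.attribute(inp, target=target)

# Classic Grad-CAM behaviour: clamp negatives with ReLU, either via
# the flag added in #181 (if present in your Captum version) ...
attr_relu = layer_gc.attribute(inp, target=target, relu_attributions=True)

# ... or manually, which is what prompted this issue.
attr_manual = F.relu(attr_signed)
```

Note that LayerGradCam attributions have the spatial size of the chosen layer's activations, not the input; upsampling to input size (for example with Captum's LayerAttribution.interpolate) is a separate step.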