Fix interventional TreeSHAP vs KernelSHAP xgboost example #703
Conversation
Codecov Report
@@ Coverage Diff @@
## master #703 +/- ##
=======================================
Coverage 80.55% 80.55%
=======================================
Files 105 105
Lines 11790 11790
=======================================
Hits 9497 9497
Misses 2293 2293
jklaise commented on 2022-06-21T12:37:09Z: I would rephrase to remove "Unfortunately" and add "the upstream implementation..." so it's clear the issue primarily lies with the upstream implementation.
jklaise commented on 2022-06-21T12:37:10Z: Can you remind me what the reason for this not holding originally was?
RobertSamoilescu commented on 2022-06-21T15:37:25Z: I believe it was a previous issue that has since been fixed.
jklaise commented on 2022-06-21T12:37:11Z: It would be good to add the reason to this comment, i.e. something about the maximum number of enumerable subsets.
This PR fixes issue #485, the inconsistency between interventional TreeSHAP and KernelSHAP in the xgboost example. It also restores the local accuracy property, which now holds.
The issue seems to have been caused by a limitation/bug in interventional TreeSHAP, which works correctly only with up to 100 instances in the background dataset. This issue has been reported here and here.
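Given the 100-instance limit described above, the natural workaround is to subsample the background dataset before building the interventional explainer. Below is a minimal sketch; the helper name `subsample_background` and the NumPy-based approach are illustrative assumptions, not part of this PR:

```python
import numpy as np

def subsample_background(X, max_size=100, seed=0):
    """Subsample a background dataset to at most `max_size` rows.

    Interventional TreeSHAP has been reported to behave correctly only
    with up to ~100 background instances, so we draw a random subset
    without replacement when the dataset is larger. (Hypothetical
    helper illustrating the workaround, not upstream API.)
    """
    if len(X) <= max_size:
        return X
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=max_size, replace=False)
    return X[idx]

# The subsampled array can then be passed as background data to an
# interventional tree explainer, e.g. (assumed usage of the shap API):
#   explainer = shap.TreeExplainer(
#       model, data=background, feature_perturbation="interventional")
```

Keeping the background set at or below this limit makes the interventional TreeSHAP values comparable to KernelSHAP on the same background distribution.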