fix: deprecation warning in sklearn.linear_model.LassoLarsIC #2528 #22

Merged 1 commit on May 12, 2023
5 changes: 4 additions & 1 deletion shap/explainers/_kernel.py
@@ -12,6 +12,8 @@
 import warnings
 import gc
 from sklearn.linear_model import LassoLarsIC, Lasso, lars_path
+from sklearn.pipeline import make_pipeline
+from sklearn.preprocessing import StandardScaler
 from tqdm.auto import tqdm
 from ._explainer import Explainer

@@ -562,7 +564,8 @@ def solve(self, fraction_evaluated, dim):
         # use an adaptive regularization method
         elif self.l1_reg == "auto" or self.l1_reg == "bic" or self.l1_reg == "aic":
             c = "aic" if self.l1_reg == "auto" else self.l1_reg
-            nonzero_inds = np.nonzero(LassoLarsIC(criterion=c).fit(mask_aug, eyAdj_aug).coef_)[0]
+            model = make_pipeline(StandardScaler(with_mean=False), LassoLarsIC(criterion=c, normalize=False))
+            nonzero_inds = np.nonzero(model.fit(mask_aug, eyAdj_aug)[1].coef_)[0]

         # use a fixed regularization coefficient
         else:
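For context, below is a minimal standalone sketch of the pattern this PR introduces: feature scaling moves out of `LassoLarsIC`'s deprecated `normalize` parameter and into an explicit `StandardScaler(with_mean=False)` pipeline step, with the fitted estimator retrieved by indexing the pipeline. The toy data `X` and `y` are illustrative only, and the `normalize=False` argument from the diff is omitted here because that parameter has since been removed from newer scikit-learn releases.

```python
# Minimal sketch of the pattern used in this PR (toy data, not from the diff).
import numpy as np
from sklearn.linear_model import LassoLarsIC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.RandomState(0)
X = rng.rand(100, 10)
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.1, size=100)

# Scaling is handled by an explicit StandardScaler step instead of the
# deprecated `normalize` parameter of LassoLarsIC.
model = make_pipeline(StandardScaler(with_mean=False), LassoLarsIC(criterion="aic"))
model.fit(X, y)

# Indexing the fitted pipeline returns the LassoLarsIC step, which is how
# the diff above reads the selected coefficients.
nonzero_inds = np.nonzero(model[1].coef_)[0]
print(nonzero_inds)
```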