model.feature_importances_ don't change ... #32

Open
psaks opened this issue Nov 23, 2019 · 0 comments

psaks commented Nov 23, 2019

The feature importances don't seem to change between repeated fits. Using "lightgbm==2.3.0" I get the following:

```python
from sklearn.datasets import make_classification
import lightgbm as lgb

xval, yval = make_classification(n_samples=1000, n_features=10)
model = lgb.LGBMClassifier(n_estimators=100, learning_rate=0.05, verbose=-1)

# Refit on the same data ten times and print the importances each time.
for i in range(10):
    model.fit(xval, yval)
    print(model.feature_importances_)
```

Output:

```
[244 537 213 214 183 222 282 264 175 648]
[244 537 213 214 183 222 282 264 175 648]
[244 537 213 214 183 222 282 264 175 648]
[244 537 213 214 183 222 282 264 175 648]
[244 537 213 214 183 222 282 264 175 648]
[244 537 213 214 183 222 282 264 175 648]
[244 537 213 214 183 222 282 264 175 648]
[244 537 213 214 183 222 282 264 175 648]
[244 537 213 214 183 222 282 264 175 648]
[244 537 213 214 183 222 282 264 175 648]
```

If this is "correct" LightGBM behaviour, then there is obviously no need to average the feature_importances_ over multiple iterations.
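For context: with the same data, the same parameters, and a fixed seed, LightGBM training is deterministic, so identical split-count importances on every refit are the expected result here. Below is a minimal sketch of one way the importances would start to differ between runs, assuming you introduce randomness via row/feature subsampling and vary `random_state` per fit; the parameter values are illustrative, not taken from the original report:

```python
from sklearn.datasets import make_classification
import lightgbm as lgb

xval, yval = make_classification(n_samples=1000, n_features=10, random_state=0)

# Row/feature subsampling plus a different seed per run makes each fit
# stochastic, so the split-based importances now vary between iterations.
for i in range(10):
    model = lgb.LGBMClassifier(
        n_estimators=100,
        learning_rate=0.05,
        subsample=0.8,         # row subsampling (bagging fraction)
        subsample_freq=1,      # perform bagging at every boosting iteration
        colsample_bytree=0.8,  # feature subsampling per tree
        random_state=i,        # different seed each run
        verbose=-1,
    )
    model.fit(xval, yval)
    print(model.feature_importances_)
```

Without any subsampling (as in the snippet above this one), changing the seed alone should not alter the trees, and therefore the importances stay identical.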
