NaNs cause hanging issue while training #67

Closed
hexgnu opened this issue Sep 27, 2019 · 2 comments
Labels
bug Something isn't working

Comments


hexgnu commented Sep 27, 2019

Hey there, thank you for this awesome project!

At first I thought this was an issue with joblib but found out it had to do with NaNs in my dataframe.

Problem: when I try to train with NaNs in the data, the job just hangs and pegs only one core. It would be better to return an error, because I had no idea it was hanging since #7 isn't implemented yet.

[Screenshot: the hung training process pegging a single core]

I am running Fedora 30 with python 3.7.3.
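For reference, roughly the kind of call that hangs looks like this (a minimal sketch with made-up data; the EBM classifier and column names are just illustrative, to show the NaN-in-a-DataFrame pattern):

```python
import numpy as np
import pandas as pd
from interpret.glassbox import ExplainableBoostingClassifier

# Toy data with a NaN in one feature column (made-up values, just to show the pattern).
X = pd.DataFrame({
    "feature_a": [1.0, 2.0, np.nan, 4.0],
    "feature_b": [0.5, 0.1, 0.2, 0.9],
})
y = pd.Series([0, 1, 0, 1])

ebm = ExplainableBoostingClassifier()
# With the NaN present, this call never returns and keeps a single core busy.
ebm.fit(X, y)
```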

Perhaps a simple check for NaNs in the array that raises an error would be useful to prevent this from happening? Lemme know how I can help, and thank you! Great speech at Strata, btw.

Thank you

-Matt

@interpret-ml (Collaborator)

Hi @hexgnu - glad you enjoyed the talk at Strata, we had a blast!

Thanks for reporting this. This is a bug on our part; we had some NA handling before release but decided to pull it.

You're welcome to submit a pull request throwing an exception on NaNs. Otherwise, we'll produce a fix for this around the end of the week.
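For anyone who wants to pick this up in the meantime, the guard would look something like the sketch below (names and placement are illustrative only, not our actual internal API):

```python
import numpy as np
import pandas as pd

def _assert_no_missing_values(X):
    """Raise a clear error instead of hanging when training data contains NaNs.

    Illustrative sketch only; the real fix will live inside interpret's
    data-preprocessing path.
    """
    has_nan = (
        X.isnull().values.any()
        if isinstance(X, pd.DataFrame)
        else np.isnan(np.asarray(X, dtype=float)).any()
    )
    if has_nan:
        raise ValueError(
            "Training data contains NaN/missing values; "
            "please impute or drop them before calling fit()."
        )
```

Calling something like this at the top of fit() would turn the silent hang into an immediate ValueError.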

interpret-ml added the bug label Oct 3, 2019
@interpret-ml (Collaborator)

Hi @hexgnu - This issue has been fixed in our latest 0.1.19 release. Thanks again for reporting this to us.

- InterpretML Team
