
small difference between gdbt-rs and rust-xgboost(native) #11

Open
jondot opened this issue Mar 1, 2021 · 0 comments
jondot commented Mar 1, 2021

Hi,

I'm seeing a small delta between the predictions of gdbt-rs and rust-xgboost (https://github.com/davechallis/rust-xgboost, which is based on the C++ implementation), given the same model and the same inputs, using a gbtree booster and logistic regression.
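For reference, here's roughly how I'm running the comparison (a minimal sketch; the file names and feature values are placeholders, and the calls follow each crate's README as I understand it):

```rust
use gbdt::decision_tree::{Data, DataVec};
use gbdt::gradient_boost::GBDT;
use xgboost::{Booster, DMatrix};

fn main() {
    // Placeholder feature row; the real inputs come from my dataset.
    let features: Vec<f32> = vec![0.1, 0.2, 0.3, 0.4];

    // gbdt-rs: load the dump produced by the convert script, passing the
    // same objective that was used at training time.
    let gbdt_model = GBDT::from_xgboost_dump("converted.model", "binary:logistic")
        .expect("failed to load converted model");
    let dv: DataVec = vec![Data::new_test_data(features.clone(), None)];
    let gbdt_pred = gbdt_model.predict(&dv);

    // rust-xgboost: load the binary model saved by the Python trainer.
    let booster = Booster::load("xgb.model").expect("failed to load model");
    let dmat = DMatrix::from_dense(&features, 1).expect("failed to build DMatrix");
    let xgb_pred = booster.predict(&dmat).expect("predict failed");

    println!("gbdt-rs:      {:?}", gbdt_pred);
    println!("rust-xgboost: {:?}", xgb_pred);
}
```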

I'm researching this at the moment and suspect a few possible causes:

  1. floating point precision differences between native C++ and Rust (see the tolerance check sketched after this list)
  2. a different XGBoost implementation
  3. I'm training in Python and loading into Rust via the convert script -- so maybe there's a problem reading the dump on the Rust side (I assume the save side is OK, because it uses the C++ lib)
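To separate cause 1 from causes 2 and 3, I'm checking whether the delta stays within plain f32 rounding noise or is clearly beyond it; the tolerance values here are my own rough assumption:

```rust
/// True if every pair of predictions agrees within `tol`.
fn within_tolerance(a: &[f32], b: &[f32], tol: f32) -> bool {
    a.len() == b.len() && a.iter().zip(b).all(|(x, y)| (x - y).abs() <= tol)
}

fn main() {
    // Placeholder outputs standing in for the two libraries' predictions.
    let gbdt_pred = vec![0.731_06_f32];
    let xgb_pred = vec![0.731_05_f32];

    // Deltas around 1e-6..1e-5 are plausible f32 rounding noise (e.g. a
    // different summation order over leaf values); deltas around 1e-2 or
    // more point at a real implementation or parameter difference.
    assert!(within_tolerance(&gbdt_pred, &xgb_pred, 1e-4));
}
```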

From your experience, is this a known issue? Or can you point me in a more specific direction to research, out of the causes I listed above?

Thanks

UPDATE:
I have now narrowed it down to how parameters are initialized on the Python side vs. the Rust side. It looks like some of the parameters are either not loaded or are taken into account differently. When the models on both the Python and Rust sides are loaded with no parameters, the results are equal.
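As one concrete way a parameter mismatch could produce exactly this behavior (this is my working theory, not confirmed against gbdt-rs internals): a training-time parameter such as base_score is not part of the tree dump itself, so if the Python side trains with a non-default value while the Rust side falls back to the default, every logistic prediction shifts by a constant bias in margin space:

```rust
// Sketch of the suspected mechanism; the handling of base_score in
// gbdt-rs is an assumption on my part, not confirmed.
fn logit(p: f32) -> f32 {
    (p / (1.0 - p)).ln()
}

fn sigmoid(x: f32) -> f32 {
    1.0 / (1.0 + (-x).exp())
}

/// binary:logistic prediction: sum of leaf values plus the margin-space
/// bias derived from base_score, pushed through the sigmoid.
fn predict_logistic(tree_margin_sum: f32, base_score: f32) -> f32 {
    sigmoid(tree_margin_sum + logit(base_score))
}

fn main() {
    let margin = 0.8_f32; // placeholder sum of leaf values
    // Same trees, different assumed base_score -> different probabilities.
    println!("base_score = 0.5: {}", predict_logistic(margin, 0.5)); // ~0.690
    println!("base_score = 0.3: {}", predict_logistic(margin, 0.3)); // ~0.488
}
```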
