This repository was archived by the owner on Sep 13, 2023. It is now read-only.

Incorrect model serving with --standardize False #593

Closed
aguschin opened this issue Jan 25, 2023 · 2 comments
Labels
bug Something isn't working serve Serving models

Comments

@aguschin (Contributor) commented:

After #588, it appears that models are sometimes served incorrectly:

from mlem.api import save


def hash_text(text, salt="SALT"):
    return hash(text + salt)


save(hash_text, "hash_model", sample_data="sometext")
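For context, the saved object is a plain Python function that takes a text string and returns an integer, so the served endpoint should expect a single string input. A quick local sanity check (illustration only, no mlem involved):

```python
def hash_text(text, salt="SALT"):
    # Python's built-in hash() returns an int. Note it is randomized per
    # interpreter run (PYTHONHASHSEED), so the value itself is not stable.
    return hash(text + salt)

# The function accepts a string, matching sample_data="sometext" above.
print(isinstance(hash_text("sometext"), int))  # True
```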

then

$ mlem serve streamlit -m hash_model --standardize False
⏳️ Loading model from hash_model.mlem
Starting streamlit server...
🖇️  Adding route for /__call__
Checkout openapi docs at <http://0.0.0.0:8080/docs>
...

and I see:

[screenshot: Streamlit form]

The problem: the served UI expects the wrong input type for the model; it should be text. If you enter something and click "Submit", it fails.

In FastAPI:

[screenshot: FastAPI docs]

Interestingly, it works with --standardize True:

[screenshot: serving works correctly]

@aguschin added the bug and serve labels on Jan 25, 2023
@mike0sv (Contributor) commented Feb 2, 2023:

This should be fixed by #595; please check.

@aguschin (Contributor, Author) commented Feb 8, 2023:

yep! thanks!

Status: Done