After #588 it appears that models are sometimes served incorrectly:

$ mlem serve streamlit -m hash_model --standardize False

then

⏳️ Loading model from hash_model.mlem
Starting streamlit server...
🖇️ Adding route for /__call__
Checkout openapi docs at <http://0.0.0.0:8080/docs>
...
and I see the problem: the Streamlit app expects the wrong input for the model. It should be text. If you enter something and click "Submit", it fails.

In FastAPI:

Interestingly, it works with --standardize True:
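
For context, a minimal sketch of the kind of model this seems to involve. It assumes hash_model is a plain Python callable that takes a text string and was saved with mlem.api.save and a string sample; the function body and names below are illustrative guesses, not the actual model from this report:

```python
# Illustrative sketch only -- a stand-in for whatever hash_model actually is.
# Assumes MLEM can persist a plain Python callable together with sample data.
import hashlib

from mlem.api import save


def hash_text(text: str) -> str:
    """Return the SHA-256 hex digest of the input text."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest()


# Saving with a string sample is what should make MLEM infer a text input,
# which is what the FastAPI docs show and what the Streamlit form should render.
save(hash_text, "hash_model", sample_data="hello world")
```

A model saved this way would then be served with the command above; the expectation is that both --standardize False and --standardize True render a single text field in Streamlit, matching the FastAPI /docs page.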