Conversation
To test you can run:

```python
from mlem.api import serve
from mlem.contrib.streamlit.server import StreamlitServer
from mlem.core.metadata import save


def main():
    save(lambda x: x + 1, "mdl2", sample_data=0)
    serve("mdl2", StreamlitServer())


if __name__ == '__main__':
    main()
```
It was quite a journey, but I finally ran Streamlit + PyTorch in Docker with images. Some stuff I found out: Mac M1 is aarch64, and torchvision binaries on PyPI do not have. I tried building an amd64 image, but
Codecov Report — Base: 87.24% // Head: 86.70% // Decreases project coverage by -0.55%.

Additional details and impacted files:

```diff
@@            Coverage Diff             @@
##             main     #512      +/-   ##
==========================================
- Coverage   87.24%   86.70%   -0.55%
==========================================
  Files          97       99       +2
  Lines        8821     8948     +127
==========================================
+ Hits         7696     7758      +62
- Misses       1125     1190      +65
```
Trying your example:

```
In [1]: from mlem.api import serve
   ...: from mlem.contrib.streamlit.server import StreamlitServer
   ...: from mlem.core.metadata import save
   ...:
   ...:
   ...: def main():
   ...:     save(lambda x: x + 1, "mdl2", sample_data=0)
   ...:     serve("mdl2", StreamlitServer())
   ...:
   ...:
   ...: if __name__ == '__main__':
   ...:     main()
   ...:
INFO:     Started server process [42653]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://0.0.0.0:8080 (Press CTRL+C to quit)
INFO:     127.0.0.1:58074 - "GET / HTTP/1.1" 307 Temporary Redirect
INFO:     127.0.0.1:58074 - "GET /docs HTTP/1.1" 200 OK
INFO:     127.0.0.1:58074 - "GET /openapi.json HTTP/1.1" 200 OK

You can now view your Streamlit app in your browser.

  URL: http://0.0.0.0:80

For better performance, install the Watchdog module:

  $ xcode-select --install
  $ pip install watchdog

INFO:     127.0.0.1:58089 - "GET /interface.json HTTP/1.1" 200 OK
INFO:     127.0.0.1:58090 - "GET /interface.json HTTP/1.1" 200 OK
2022-12-09 20:48:06.914 Uncaught app exception
Traceback (most recent call last):
  File "/Users/aguschin/.local/share/virtualenvs/mlem-Utz6DvOn/lib/python3.9/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 564, in _run_script
    exec(code, module.__dict__)
  File "/private/var/folders/tv/l60j0x050p536g3bh8g2w1n80000gn/T/mlem_streamlit_script_psg1c3au/script.py", line 40, in <module>
    augment, arg_model_aug = augment_model(arg_model)
  File "/Users/aguschin/Git/iterative/mlem/mlem/contrib/streamlit/server.py", line 35, in augment_model
    for name, f in model.__fields__.items()
AttributeError: type object 'int' has no attribute '__fields__'
```
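The traceback suggests `augment_model` in `mlem/contrib/streamlit/server.py` assumes every argument type is a pydantic model exposing `__fields__`; a plain type like `int` (coming from `sample_data=0`) has no such attribute. A minimal sketch of a possible guard (hypothetical code, not the actual MLEM implementation):

```python
def augment_model(model):
    # Hypothetical guard: plain types like `int` have no `__fields__`,
    # so pass them through unchanged instead of raising AttributeError.
    if not hasattr(model, "__fields__"):
        return None, model
    # Only pydantic-like models reach this point.
    fields = {name: f for name, f in model.__fields__.items()}
    return fields, model
```

The same check could equally be written as `issubclass(model, BaseModel)` if pydantic is guaranteed to be importable at that point.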
Got a few issues that look the same with different models; I'll re-check them once you fix this :)
When a request via Streamlit fails, it prints something like
but underneath, FastAPI fails with a different error. This is strange; maybe we will eventually need to either redirect failures, or use the MLEM model directly instead of forwarding to FastAPI, which seems more reasonable to me. It should work for now though :)
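If we do end up redirecting failures, one hedged sketch is a small helper that extracts the `detail` field from the backend's JSON error body so the Streamlit UI can show the real cause instead of a generic message (hypothetical helper name, assuming the backend uses FastAPI's standard `{"detail": ...}` error shape):

```python
import json


def format_backend_error(status: int, body: str) -> str:
    # Hypothetical helper: turn a FastAPI error response into a message
    # the Streamlit UI can display, instead of a generic failure string.
    try:
        detail = json.loads(body).get("detail", body)
    except ValueError:
        # Body was not JSON; show it as-is.
        detail = body
    return f"Model server returned {status}: {detail}"
```
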
TODOs: