Optional --output_path param & Async/await support #502
I'm assuming you're using the fetch pattern from papermill? Otherwise why not use nbclient / nbconvert directly if you're converting? Are you executing and converting to html or just converting to html? If you're just converting to html I'm not sure why you need papermill or nbclient -- sorry for the confusion here on my part.
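For reference, if the notebook only needs converting (no execution), nbconvert alone covers it. Below is a minimal sketch, not from this thread; the file names are placeholders and it assumes an already-executed notebook on disk.

```python
# Minimal sketch: convert an existing notebook to HTML with nbconvert only.
import nbformat
from nbconvert import HTMLExporter

nb = nbformat.read("notebook.ipynb", as_version=4)  # placeholder input path
body, resources = HTMLExporter().from_notebook_node(nb)

with open("notebook.html", "w") as f:  # placeholder output path
    f.write(body)
```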
Set the output to /dev/null, but generally papermill is a highly opinionated tool, so it always requires an input and an output.
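In code that looks roughly like the sketch below (the input path is a placeholder); the executed notebook is simply written to /dev/null and discarded.

```python
# Sketch: papermill always writes an output notebook, so point it at /dev/null.
import papermill as pm

pm.execute_notebook("input.ipynb", "/dev/null")  # placeholder input path
```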
Yes, we just haven't gotten around to adding async to papermill now that the stack above is async. One thing to note is that our IO fetch methods rely on external libraries that are mostly NOT async, so even if execution is made async, fetching / saving notebooks may not always be.
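One common workaround for the IO side is the general asyncio pattern of pushing the blocking read/write calls onto a thread pool; this is not part of papermill's API, just a sketch of the idea.

```python
# General asyncio pattern (not papermill API): run blocking notebook IO in a
# thread so an async caller is not blocked while reading from disk.
import asyncio
import nbformat

async def load_notebook(path):
    loop = asyncio.get_running_loop()
    # nbformat.read is synchronous; offload it to the default thread pool.
    return await loop.run_in_executor(None, lambda: nbformat.read(path, as_version=4))
```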
I receive
Got it, makes sense. In that case I'd just use a tmp path based on session id (or randomly) as the output and clean the output on successful request termination. On the async front, happy to review PRs if you wanted to help add async support :)
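A sketch of that approach, using the standard tempfile module together with papermill and nbconvert (the function name and parameters are hypothetical):

```python
# Sketch: execute into a throwaway output path, convert to HTML, then clean up.
import os
import tempfile

import papermill as pm
from nbconvert import HTMLExporter

def render_notebook(input_path, parameters=None):
    fd, tmp_out = tempfile.mkstemp(suffix=".ipynb")
    os.close(fd)
    try:
        pm.execute_notebook(input_path, tmp_out, parameters=parameters or {})
        body, _ = HTMLExporter().from_filename(tmp_out)
        return body
    finally:
        os.remove(tmp_out)
```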
Hi @MSeal, I'm having issues executing the sparkmagic PySpark kernel while testing the async changes on this PR. The execution hangs on the first cell (see logs below). Do you know if this is related to adding async support to papermill? If so, I'm happy to try to make a contribution.
Using selector: EpollSelector
Starting kernel (async): ['/usr/bin/python3.7', '-m', 'sparkmagic.kernels.pysparkkernel.pysparkkernel', '-f', '/tmp/tmpk5lukfi7.json']
Connecting to: tcp://127.0.0.1:34727
connecting iopub channel to tcp://127.0.0.1:57569
Connecting to: tcp://127.0.0.1:57569
connecting shell channel to tcp://127.0.0.1:55435
Connecting to: tcp://127.0.0.1:55435
connecting stdin channel to tcp://127.0.0.1:47223
Connecting to: tcp://127.0.0.1:47223
connecting heartbeat channel to tcp://127.0.0.1:58479
Using selector: EpollSelector
connecting control channel to tcp://127.0.0.1:34727
Connecting to: tcp://127.0.0.1:34727
Executing notebook with kernel: pysparkkernel
Executing Cell 1---------------------------------------
Skipping non-executing cell 0
Ending Cell 1------------------------------------------
Executing Cell 2---------------------------------------
Skipping non-executing cell 1
Ending Cell 2------------------------------------------
Executing Cell 3---------------------------------------
Executing cell:
%%info
msg_type: status
content: {'execution_state': 'busy'}
msg_type: execute_input
content: {'code': '%%info', 'execution_count': 1}
msg_type: status
Hi, I'm using papermill in a high-load microservice to display an HTML-converted notebook in real time.
How can I skip or disable creating an output notebook?
Is there also a plan to add an async_execute_notebook using the new nbclient API (async_setup_kernel, async_wait_for_reply, async_execute_cell)?
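For reference, nbclient already exposes async variants of these calls; a minimal sketch of executing a notebook asynchronously with nbclient directly (paths and kernel name are placeholders):

```python
# Minimal sketch: execute a notebook asynchronously with nbclient directly.
import asyncio

import nbformat
from nbclient import NotebookClient

async def run(path):
    nb = nbformat.read(path, as_version=4)
    client = NotebookClient(nb, kernel_name="python3")  # placeholder kernel name
    await client.async_execute()
    return nb

executed = asyncio.run(run("input.ipynb"))  # placeholder input path
```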