I'm trying to figure out whether it's possible to export the estimator for serving, and I noticed that you use a gen_input_fn function instead of just an input_fn when predicting. Is the logic for gen_input_fn part of the nmt repo, or did you write it yourself? I'm asking because I can't figure out how to implement a serving_input_fn for this model.
Failing that, do you maybe know how I could get the input and output tensor info?
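To make the question concrete, this is roughly what I imagine a serving_input_fn would look like. The single string feature and the key name "source" are guesses on my part, since I don't know which feature keys gen_input_fn actually produces:

```python
import tensorflow as tf

def serving_input_fn():
    # Hypothetical: a batch of raw source sentences as strings.
    # The feature key "source" is a guess -- it would have to match
    # whatever the model_fn expects to receive from gen_input_fn.
    source = tf.placeholder(dtype=tf.string, shape=[None], name="source")
    receiver_tensors = {"source": source}
    # Any tokenization / vocab lookup that gen_input_fn normally does
    # would presumably have to happen here (or inside the model_fn).
    features = {"source": source}
    return tf.estimator.export.ServingInputReceiver(features, receiver_tensors)

# estimator.export_savedmodel("export_dir", serving_input_fn)
```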
On Fri, Aug 17, 2018 at 5:35 PM MarthinusBosman ***@***.***> wrote:
I'm trying to figure out if it's possible to export the estimator for serving inputs, and it's interesting that you have a gen_input_fn function instead of just an input_fn when predicting. Do you maybe know how I could get the input tensor and output tensor info?
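For the input and output tensor info specifically: once export_savedmodel has written a SavedModel directory, its signature can be inspected along these lines (a sketch assuming the default serve tag and the serving_default signature key; the export path is made up):

```python
import tensorflow as tf

export_dir = "export_dir/1534500000"  # hypothetical timestamped export path

with tf.Session(graph=tf.Graph()) as sess:
    meta_graph = tf.saved_model.loader.load(sess, ["serve"], export_dir)
    signature = meta_graph.signature_def["serving_default"]
    print(signature.inputs)   # input tensor names, dtypes and shapes
    print(signature.outputs)  # output tensor names, dtypes and shapes
```

I believe saved_model_cli show --dir <export_dir> --all prints the same information from the command line.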