
Exporting model for serving #5

Open
MarthinusBosman opened this issue Aug 17, 2018 · 1 comment

Comments

@MarthinusBosman
MarthinusBosman commented Aug 17, 2018

I'm trying to figure out whether it's possible to export the estimator for serving, and I noticed that you use a gen_input_fn function instead of a plain input_fn when predicting. Is the logic for gen_input_fn part of the nmt repo, or did you write it yourself? I'm asking because I can't work out how to implement a serving_input_fn for this model.

As another route to the same goal: do you perhaps know how I could get the input tensor and output tensor info?
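For reference, here is a minimal sketch of a serving_input_receiver_fn for a TF Estimator. This is an assumption-laden example, not code from this repo: the feature name "tokens", its dtype, and its shape are placeholders that would need to match whatever this model's model_fn actually consumes, and any preprocessing that gen_input_fn performs at training time would have to be replicated here.

```python
# Hedged sketch: a serving_input_receiver_fn for a generic TF Estimator.
# The feature name "tokens" and its string dtype/shape are assumptions,
# not taken from this repo's model.
import tensorflow as tf

tf.compat.v1.disable_eager_execution()  # TF 1.x graph-mode semantics


def serving_input_receiver_fn():
    # Placeholder the serving client feeds at request time.
    tokens = tf.compat.v1.placeholder(
        tf.string, shape=[None, None], name="tokens")
    receiver_tensors = {"tokens": tokens}
    # Here the raw input is passed through unchanged; any preprocessing
    # done by the training-time input function would go here instead.
    features = {"tokens": tokens}
    return tf.estimator.export.ServingInputReceiver(
        features, receiver_tensors)


# With a trained estimator, the export call would look like:
# estimator.export_savedmodel("export_dir", serving_input_receiver_fn)
```

Once a SavedModel is exported, the input/output tensor info (the second question) can be inspected with the `saved_model_cli show --dir <export_dir>/<timestamp> --all` command that ships with TensorFlow, which prints each SignatureDef's input and output tensor names, dtypes, and shapes.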

@davidhughhenrymack
Contributor

davidhughhenrymack commented Aug 18, 2018 via email
