Could you help me with adding multiple models to keras-serving #2
Hi jerminal, I would like to help you with this. Are you talking about changing the model's feature vectors (input) and thereby breaking the dimensions of the gRPC Node.js request API? I will need a description of your network's input vector to help you.
Or do you just want to add multiple versions of a model to the same serving server?
I have both problems. I need to classify images (so the input vector has to change), and I have multiple Keras models that I want to apply to each image. So maybe you can add me on Skype (jermuk_skype) and I'll describe it in more detail?
Or maybe you can tell me your Skype or email?
For example, to serve models for face recognition.
Hi @jerminal, I will try to set up an example using face_classification soon. Serving multiple versions of a model at once should be no problem; you only have to name the exports differently and send the version as a parameter to the gRPC service during requests (see the sketch below).
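For illustration, here is a minimal sketch of what such a version-pinned request to TensorFlow Serving can look like from Node.js. The model name "gender", the port 8500, the 64x64x1 input shape and the local ./protos path are assumptions made for the example, not values taken from this repo.

```js
const grpc = require("grpc");

// Load the TensorFlow Serving prediction service definition
// (assumes the tensorflow_serving .proto files were copied to ./protos).
const tfServing = grpc.load("./protos/prediction_service.proto").tensorflow.serving;
const client = new tfServing.PredictionService("localhost:8500",
    grpc.credentials.createInsecure());

// Placeholder input: a flat 64x64 grayscale image, all zeros.
const imagePixels = new Array(64 * 64).fill(0);

const request = {
    // "name" is the model_name the serving container was started with,
    // "version" pins one specific export directory, e.g. /models/gender/2.
    model_spec: { name: "gender", version: { value: 2 } },
    inputs: {
        inputs: {
            dtype: "DT_FLOAT",
            tensor_shape: { dim: [{ size: 1 }, { size: 64 }, { size: 64 }, { size: 1 }] },
            float_val: imagePixels
        }
    }
};

client.predict(request, (error, response) => {
    if (error) return console.error(error);
    console.log(response.outputs);
});
```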
Oh, great. But is it possible to check one image against several models in one request to the Docker server? And about support: OK, no problem, you can add me on Twitter or anything else. And I can pay for such improvements if you'd like.
I would probably suggest loading multiple versions of the model into the serving server and then using the Node.js service to make parallel requests to both/multiple models at the same time; it's very easy to do this in Node (see the sketch below). I'll try to set up the face-recog example with such a parallel request endpoint. Haha, cheers! I'll PM you on Twitter when I am working on it ;)
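To make the parallel part concrete, here is a small sketch of querying two models at once from Node.js. The endpoint paths, port and model names are assumptions for the example, not the actual routes of the keras-serving node service; adapt them to your setup.

```js
const axios = require("axios");

async function classifyWithAllModels(imageVector) {
    const models = ["gender", "emotion"]; // hypothetical model names

    // Fire one HTTP request per model; they run concurrently.
    const requests = models.map(model =>
        axios.post(`http://localhost:8080/predict/${model}`, { image: imageVector })
    );

    // Promise.all resolves once every model has answered.
    const responses = await Promise.all(requests);

    // Collect the results into { gender: ..., emotion: ... }.
    return models.reduce((result, model, i) => {
        result[model] = responses[i].data;
        return result;
    }, {});
}

classifyWithAllModels(new Array(64 * 64).fill(0))
    .then(result => console.log(result))
    .catch(error => console.error(error));
```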
Actually, the question here is: are you actually using the Node.js service's simplified API? If yes (that's cool), how many requests are you going to run against it? (You could scale the services to an unlimited number of instances, but it's still a valid question if you are trying to predict in real time using web services.)
I have about 10-15 requests per minute. It's not very much :)
And yes - I use the node.js service simplified API.
Cool, just hang on ;)
👍
So I have already pushed my current state to the remote master, but I am not done yet.
I have created a sub-project as a directory: https://github.com/krystianity/keras-serving/tree/master/face-recog
Thanks. I will be waiting for this very much!!!
@jerminal alright, it is up and running :) Let me know if you need or want to try something else.
Hi. Thanks a lot!!! I'll check it in the evening. And of course I've sent a little something for a beer for you. And I promise, when I come to Germany again, I'll treat you to a beer personally :)
Oh... I have a strange problem. When following the instructions, at point 4, when I run "npm start ./../images/cutouts/4.png", I get a strange error :(
No worries, you're running Node 5.10 (with npm 5.0.3 o.O). Can you upgrade your Node version to at least 7.10?
If it's not possible for you to update Node on the root system, I can wrap the client in a Docker container as well :)
I updated Node. Now a strange thing happens when running it:
After upgrading Node.js, did you delete the node_modules folder in the face-recog/client dir and run "npm install" again? (Just to make sure.) Additionally, I have a feeling that the models your servers are serving aren't correct, e.g. cached. To understand what's going wrong, I need you to send me the error that is logged by the server when your client receives the error.
Oh, I'll check it now on a different computer. A few hours and I will write back.
I've checked. The same problem; I got the same error.
And yes, I execute it via Python 2.7.
Docker version 17.04.0-ce, build 4845c56
Yes, I've done it above. Hope it will help you.
Alright, the versions look fine; the output is strange as it relates to the dimensions defined for the interface. I have added additional logging to the node server and the client; could you please run it again and send me the output?
Here is the log:
And here is the request:
Strange :/ On which OS are you running this? MacOSX?
Okay, so I have spun up a clean Ubuntu 16.04 LTS VM and run through the setup as defined in the READMEs. I found two potential errors regarding the naming of the gender model's input vectors as well as the output vectors; it's possibly due to some strange docker-compose caching behaviour that I wasn't able to see these locally. The bugs are fixed in the current master, so please go ahead and run the setup again :)
Cool. It works now!
Awesome :) Let me know if you want to take this in another direction. I am currently trying to:
Maybe some image preprocessing. For example, I send a big ".jpg" image, it reshapes it and responds with the output of the models (a rough sketch of that idea is below). And also maybe more details on how to add your own models and make them work.
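A rough sketch of that preprocessing idea: accept an arbitrary image, resize it to the network's input size and turn it into a flat, normalised pixel vector before calling the prediction endpoint. The choice of the sharp library and the 64x64 grayscale target shape are assumptions for the example, not part of this repo.

```js
const sharp = require("sharp");

async function imageToInputVector(path) {
    const { data } = await sharp(path)
        .resize(64, 64)  // scale to the model's expected spatial size
        .greyscale()     // single channel
        .raw()           // raw pixel bytes instead of an encoded image
        .toBuffer({ resolveWithObject: true });

    // Normalise 0..255 bytes to 0..1 floats, as most Keras image models expect.
    return Array.from(data).map(value => value / 255);
}

imageToInputVector("./../images/cutouts/4.png")
    .then(vector => console.log(vector.length)) // 64 * 64 = 4096 values
    .catch(error => console.error(error));
```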
Could you help me with adding multiple models to keras-serving? Just reply and we can discuss it via email or Skype.