How can I use TensorFlow Serving for multiple models?


How can I serve multiple TensorFlow models from one server? I am using a Docker container.

model_config_list {
  config {
    name: "model1"
    base_path: "/tmp/model"
    model_platform: "tensorflow"
  }
  config {
    name: "model2"
    base_path: "/tmp/model2"
    model_platform: "tensorflow"
  }
}
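The config above is protobuf text format. As a minimal sketch, it can be written out from a shell script like this (the /tmp/models.conf output path is just for illustration; the answer below reads it from /serving/models.conf):

```shell
#!/bin/sh
# Write the multi-model config shown above to a file.
# The output path here is illustrative only.
cat > /tmp/models.conf <<'EOF'
model_config_list {
  config {
    name: "model1"
    base_path: "/tmp/model"
    model_platform: "tensorflow"
  }
  config {
    name: "model2"
    base_path: "/tmp/model2"
    model_platform: "tensorflow"
  }
}
EOF
```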

Build a Docker image using the official TensorFlow Serving Dockerfile.

Then, inside the Docker container, run:

/usr/local/bin/tensorflow_model_server --port=9000 --model_config_file=/serving/models.conf 

Here /serving/models.conf is a file like the one shown above. Note that the --model_name flag is not needed when you supply --model_config_file, since each model's name comes from the config.
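One detail that often trips people up: TensorFlow Serving expects each base_path to contain at least one numeric version subdirectory holding the exported SavedModel. A sketch of the expected layout, using the paths from the config above:

```shell
# Each base_path must contain a numeric version directory;
# TensorFlow Serving loads the highest version it finds.
mkdir -p /tmp/model/1 /tmp/model2/1
# Copy each exported SavedModel (saved_model.pb + variables/)
# into its version directory, e.g.:
#   cp -r /path/to/export/* /tmp/model/1/
ls -d /tmp/model/1 /tmp/model2/1
```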

