How can I use TensorFlow Serving for multiple models?


How can I serve multiple TensorFlow models from a single Docker container?

model_config_list: {
  config: {
    name: "model1",
    base_path: "/tmp/model",
    model_platform: "tensorflow"
  },
  config: {
    name: "model2",
    base_path: "/tmp/model2",
    model_platform: "tensorflow"
  }
}

Build a Docker image from the official TensorFlow Serving Dockerfile.

Then, inside the container, start the model server:

/usr/local/bin/tensorflow_model_server --port=9000 --model_config_file=/serving/models.conf

Here, /serving/models.conf is a config file like the one above. The --model_name flag is unnecessary when --model_config_file is given, since each model's name comes from its config entry.
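Once the server is up, each model in model_config_list is addressed by its name field. Below is a minimal sketch of building REST predict requests for the two models; it assumes the server is also started with --rest_api_port=8501 (the --port=9000 flag above only opens the gRPC endpoint), and the host, port, and example inputs are placeholders.

```python
import json

# Assumed REST endpoint; --port=9000 above is gRPC only, so the server
# would also need --rest_api_port=8501 for this to work.
REST_HOST = "http://localhost:8501"

def predict_url(model_name, version=None):
    """Build the REST predict URL for a model named in models.conf."""
    base = f"{REST_HOST}/v1/models/{model_name}"
    if version is not None:
        base += f"/versions/{version}"
    return base + ":predict"

def predict_body(instances):
    """Build the JSON body the predict endpoint expects."""
    return json.dumps({"instances": instances})

# "model1" and "model2" match the name fields in the config above;
# the input shape here is a placeholder for your model's real input.
url = predict_url("model1")
body = predict_body([[1.0, 2.0, 3.0]])
# e.g. requests.post(url, data=body)
```

The same server thus serves both models side by side; only the model name in the URL changes between requests.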

