How can I use TensorFlow Serving for multiple models?
How can I serve multiple TensorFlow models at the same time? Use a Docker container running TensorFlow Serving with a model config file, for example:
model_config_list {
  config {
    name: "model1"
    base_path: "/tmp/model"
    model_platform: "tensorflow"
  }
  config {
    name: "model2"
    base_path: "/tmp/model2"
    model_platform: "tensorflow"
  }
}
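Note that TensorFlow Serving expects each base_path to contain numeric version subdirectories holding an exported SavedModel. As a rough sketch (the version number 1 is just an example), the directories from the config above would look like:

/tmp/model/
  1/
    saved_model.pb
    variables/
/tmp/model2/
  1/
    saved_model.pb
    variables/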
Build a Docker image from the official TensorFlow Serving Dockerfile.
Then, inside the Docker container, run:
/usr/local/bin/tensorflow_model_server --port=9000 --model_config_file=/serving/models.conf

Here /serving/models.conf is a config file like the one shown above. (The --model_name flag is not needed when a model config file is supplied, since the names come from the config file.)
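If you prefer the prebuilt tensorflow/serving image instead of building your own, a minimal sketch would be to bind-mount the model directories and the config file and pass --model_config_file as an extra argument (the host paths here just mirror the example config; adjust them to your setup):

docker run -p 8500:8500 -p 8501:8501 \
  --mount type=bind,source=/tmp/model,target=/tmp/model \
  --mount type=bind,source=/tmp/model2,target=/tmp/model2 \
  --mount type=bind,source=/serving/models.conf,target=/serving/models.conf \
  -t tensorflow/serving --model_config_file=/serving/models.conf

With the REST port (8501 by default in that image) exposed, you can then address each model by name. The request body below is only an illustration; the actual input shape depends on your model's signature:

curl -X POST http://localhost:8501/v1/models/model1:predict \
  -d '{"instances": [[1.0, 2.0]]}'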