How to serve multiple versions of model via standard tensorflow serving docker image?

I'm new to TensorFlow Serving.

I just tried TensorFlow Serving via Docker following this tutorial and it worked.

However, when I tried it with multiple versions of a model, it serves only the latest version.

Is it possible to serve several versions at once? Or do I need to try something different?


1 Answer

This requires a ModelServerConfig, which is supported by the tensorflow/serving Docker image as of release 1.11.0 (available since 5 Oct 2018). Until then, you can create your own Docker image, or use tensorflow/serving:nightly or tensorflow/serving:1.11.0-rc0, as stated here. See that thread for how to serve multiple models.
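
For reference, serving several different models side by side uses the same kind of config file. A minimal sketch, where the model names and paths are placeholders rather than anything from the question:

model_config_list: {
    config: {
        name: "model_a",
        base_path: "/models/model_a",
        model_platform: "tensorflow"
    },
    config: {
        name: "model_b",
        base_path: "/models/model_b",
        model_platform: "tensorflow"
    }
}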

If, on the other hand, you want to serve multiple versions of a single model, you can use the following config file, called "models.config":

model_config_list: {
    config: {
        name: "my_model",
        base_path: "/models/my_model",
        model_platform: "tensorflow",
        model_version_policy: {
            all: {}
        }
    }
}

here "model_version_policy: {all:{ } }" make every versions of the model available. Then run the docker:

docker run -p 8500:8500 -p 8501:8501 \
    --mount type=bind,source=/path/to/my_model/,target=/models/my_model \
    --mount type=bind,source=/path/to/my/models.config,target=/models/models.config \
    -t tensorflow/serving:nightly --model_config_file=/models/models.config
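
If you want to expose only selected versions rather than all of them, model_version_policy also supports a "specific" block. A minimal sketch (the version numbers here are just placeholders):

model_config_list: {
    config: {
        name: "my_model",
        base_path: "/models/my_model",
        model_platform: "tensorflow",
        model_version_policy: {
            specific: {
                versions: 1,
                versions: 3
            }
        }
    }
}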

Edit:
Now that version 1.11.0 is available, you can start by pulling the new image:

docker pull tensorflow/serving

Then run the Docker image as above, using tensorflow/serving instead of tensorflow/serving:nightly.
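
Once the container is up, you can address a specific version through the REST API by putting the version number in the URL. A minimal sketch with curl, assuming a version 2 exists and that the model accepts a single list of floats as input (adjust the "instances" payload to your model's signature):

curl -d '{"instances": [[1.0, 2.0, 3.0]]}' \
    -X POST http://localhost:8501/v1/models/my_model/versions/2:predict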

