tensorflow / serving

A flexible, high-performance serving system for machine learning models

Home Page: https://www.tensorflow.org/serving


Add option to install tensorflow-serving-api without Tensorflow

jdholtz opened this issue · comments

Feature Request

Describe the problem the feature is intended to solve

Currently, tensorflow-serving-api requires TensorFlow as a dependency. However, TensorFlow is not needed if you are only using the config module (at least it isn't needed for ModelServerConfig; I haven't tested other configurations). This means users are forced to download TensorFlow unnecessarily just to use the config module, which increases installation time and footprint (roughly 3–8 minutes and over 1 GB of disk space).
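For context, the config-module use case here is building a ModelServerConfig, i.e. the same message the model server reads from a `--model_config_file`. A minimal illustrative example (the model name and path are placeholders):

```
model_config_list {
  config {
    name: "my_model"
    base_path: "/models/my_model"
    model_platform: "tensorflow"
  }
}
```

Constructing or parsing this message only requires the generated protobuf classes, not TensorFlow itself.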

Describe the solution

It would be great to have a way to install only the config part of tensorflow-serving-api (or to not require TensorFlow when installing the base package, though that doesn't seem like a good option). I'm not too familiar with setuptools, but this could probably work similarly to how tensorflow-serving-api-gpu depends on the GPU TensorFlow package, except it would drop TensorFlow altogether. The package could be called tensorflow-serving-api-config.
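A rough sketch of the packaging idea, mirroring the existing api/api-gpu split. All package names, version pins, and the `tensorflow-serving-api-config` variant itself are assumptions, not the project's actual setup.py:

```python
# Hypothetical sketch: one requirements function per published package
# variant, so a TensorFlow-free "config" variant is just the base deps.
BASE_DEPS = ["grpcio>=1.0", "protobuf>=3.6"]  # illustrative pins

def requirements(variant):
    """Return install_requires for a given package variant."""
    if variant == "tensorflow-serving-api":
        return BASE_DEPS + ["tensorflow>=2.0"]
    if variant == "tensorflow-serving-api-gpu":
        return BASE_DEPS + ["tensorflow-gpu>=2.0"]
    if variant == "tensorflow-serving-api-config":  # proposed new variant
        return BASE_DEPS                            # no TensorFlow at all
    raise ValueError(f"unknown variant: {variant}")

print(requirements("tensorflow-serving-api-config"))
# → ['grpcio>=1.0', 'protobuf>=3.6']
```

The point is only that the proposed variant differs from the existing ones by a single dependency entry, so the setuptools changes should be small.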

Describe alternatives you've considered

I've looked into specifying dependencies to exclude from installation using pip, but there doesn't appear to be an option to do so.
Additionally, I could install all the requirements for my application by specifying every dependency explicitly and using pip's --no-deps flag, but that would require me to continuously manage every single dependency my project has (I would rather let pip handle this).
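For reference, the --no-deps route described above would look roughly like this; the dependency list is illustrative and, as noted, would have to be maintained by hand:

```
# Install tensorflow-serving-api itself without any of its dependencies
pip install --no-deps tensorflow-serving-api

# Every transitive dependency except tensorflow must then be installed
# and version-managed manually (names/versions here are illustrative)
pip install grpcio protobuf
```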

Since I am using a Docker image to package my application, I could also install tensorflow-serving-api and then proceed to remove Tensorflow before the build is finished. However, this unnecessarily wastes time and bandwidth.
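The Docker workaround amounts to something like the following sketch (base image and layout are illustrative). TensorFlow is still downloaded and installed before being removed, which is the wasted time and bandwidth referred to above:

```dockerfile
FROM python:3.10-slim

# Install the full package, then strip TensorFlow back out before the
# build finishes -- the download cost is paid either way
RUN pip install tensorflow-serving-api && \
    pip uninstall -y tensorflow
```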

Additional context

@jdholtz,

TensorFlow prediction APIs are defined as protobufs. Instead of pulling in the TensorFlow and TF Serving dependencies, you can generate the necessary tensorflow and tensorflow_serving protobuf Python stubs yourself. This avoids the need to pull the entire (heavy) TensorFlow library into the client itself. You can refer to this article to implement this.
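The stub-generation approach above can be sketched as follows. The repository layout and the exact list of .proto files are illustrative and depend on which APIs the client uses; note that any .proto files the listed ones import must also be passed to protoc for the generated modules to be usable:

```
pip install grpcio-tools

# Fetch the .proto definitions (shallow clones are enough)
git clone --depth 1 https://github.com/tensorflow/tensorflow.git
git clone --depth 1 https://github.com/tensorflow/serving.git

# Generate Python stubs for just the messages/services the client needs
python -m grpc_tools.protoc \
    -I tensorflow -I serving \
    --python_out=. --grpc_python_out=. \
    serving/tensorflow_serving/config/model_server_config.proto \
    serving/tensorflow_serving/apis/prediction_service.proto
```

The resulting generated modules can then be shipped with the client in place of the tensorflow-serving-api and tensorflow packages.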

Also, I see a similar feature request #1450 was raised previously, so I would suggest you +1 that issue and follow it for updates, and close this one. Thank you!

Thanks for the response. I will definitely check out the .proto approach.