tensorflow / serving

A flexible, high-performance serving system for machine learning models

Home Page: https://www.tensorflow.org/serving

Support Python handlers for non-TensorFlow pre- and post-processing, like TorchServe

piEsposito opened this issue

Feature Request

If this is a feature request, please fill out the following form in full:

Describe the problem the feature is intended to solve

I'm always frustrated when I need to perform a non-TensorFlow pre- or post-processing operation on my data and have to create an intermediary service or worker to sit in front of TF Serving. It gets even worse when my software is not pure Python and I have to create a Python service just to do things like tokenizing a sentence with Hugging Face Tokenizers.

Describe the solution

It would be great if, similar to what TorchServe does, I could create a Python file (and declare some dependencies) and attach it to the model bundle so that the TF Serving server performs those operations itself, saving me from writing more code and managing more infrastructure.
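To make the request concrete, here is a purely hypothetical sketch of what such a handler file could look like. TF Serving has no Python handler API today; the function names, the stand-in tokenizer, and the whole contract below are invented for illustration, loosely modeled on TorchServe's preprocess/inference/postprocess split.

```python
# HYPOTHETICAL sketch: TF Serving does not currently support Python handlers.
# This imagines the handler file the feature request describes.

def preprocess(raw_inputs):
    """Turn raw client payloads into model-ready inputs (here: token ids)."""
    # Stand-in tokenizer; a real handler might call Hugging Face Tokenizers.
    vocab = {"<unk>": 0, "hello": 1, "world": 2}
    return [[vocab.get(tok, 0) for tok in text.lower().split()]
            for text in raw_inputs]

def postprocess(model_outputs):
    """Map raw model scores to human-readable labels."""
    labels = ["negative", "positive"]
    return [labels[max(range(len(scores)), key=scores.__getitem__)]
            for scores in model_outputs]
```

The idea would be to ship this file alongside the SavedModel so the server runs `preprocess` before and `postprocess` after the model call, with no separate service in between.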

Describe alternatives you've considered

  • Use NVIDIA Triton
  • Migrate my code to PyTorch and use TorchServe
  • Create intermediary Python services or workers to handle those processing operations
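The third alternative is what I do today. As a minimal sketch: a small Python layer tokenizes the text and builds the JSON body for TF Serving's REST predict endpoint (`/v1/models/{name}:predict`, which expects an `{"instances": [...]}` payload). The model name, URL, and toy tokenizer below are placeholders.

```python
import json

# Assumed local TF Serving endpoint; "my_model" is a placeholder name.
TF_SERVING_URL = "http://localhost:8501/v1/models/my_model:predict"

def tokenize(text):
    # Placeholder for a real tokenizer (e.g. Hugging Face Tokenizers).
    vocab = {"hello": 1, "world": 2}
    return [vocab.get(tok, 0) for tok in text.lower().split()]

def build_predict_request(texts):
    """Build the JSON body for TF Serving's REST predict API."""
    return json.dumps({"instances": [tokenize(t) for t in texts]})
```

The service would then POST this body to `TF_SERVING_URL` (e.g. with `requests.post`) and post-process the `predictions` field of the response. The point of the feature request is that this whole layer is extra infrastructure just to get data in and out of the model.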

Additional context

This is how TorchServe does it: https://pytorch.org/serve/custom_service.html
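For comparison, TorchServe custom handlers follow a preprocess/inference/postprocess contract; in real code the handler subclasses `ts.torch_handler.base_handler.BaseHandler`. The sketch below mirrors that contract but is self-contained (no TorchServe import), with a dummy "model" standing in for an actual forward pass.

```python
# Self-contained illustration of TorchServe's custom-handler contract.
# A real handler would subclass ts.torch_handler.base_handler.BaseHandler
# and run an actual PyTorch model in inference().

class CustomHandler:
    def preprocess(self, data):
        # TorchServe passes a list of request dicts; payloads arrive
        # under "body" or "data" depending on the client.
        return [row.get("body") or row.get("data") for row in data]

    def inference(self, inputs):
        # Stand-in for a model forward pass: score each text by length.
        return [len(text) for text in inputs]

    def postprocess(self, outputs):
        return [{"score": s} for s in outputs]

    def handle(self, data, context=None):
        return self.postprocess(self.inference(self.preprocess(data)))
```

Because the handler lives next to the model archive, TorchServe users never need the kind of intermediary service described above, which is exactly the gap this issue asks TF Serving to close.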

@piEsposito,

A similar feature request, #663, is in progress. Please close this issue and follow and comment on the #663 thread for updates on this feature's implementation. Thank you!

This issue has been marked stale because it has had no recent activity for 14 days. It will be closed if no further activity occurs. Thank you.

This issue was closed due to lack of activity after being marked stale for 7 days.