tensorflow / serving

A flexible, high-performance serving system for machine learning models

Home Page: https://www.tensorflow.org/serving

Support for TensorFlow pluggable devices

pksubbarao opened this issue

Describe the problem the feature is intended to solve

TensorFlow's pluggable device architecture offers a plugin mechanism for registering devices with TensorFlow without needing to change TensorFlow code. It provides a set of C APIs as an ABI-stable way to register a custom device runtime, kernels/ops, a graph optimizer, and a profiler.
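For reference, such a plugin is a shared library exporting a small set of C entry points. The sketch below shows the rough shape, assuming the entry points and headers described in the PluggableDevice/StreamExecutor C API RFCs; the device name "MY_DEVICE" and all elided callback wiring are placeholders, not a working runtime.

```cpp
// Rough shape of a pluggable device plugin. Assumptions: entry points per
// the PluggableDevice/StreamExecutor C API RFCs; "MY_DEVICE" and all
// omitted callbacks are hypothetical placeholders.
#include "tensorflow/c/experimental/stream_executor/stream_executor.h"
#include "tensorflow/c/kernels.h"
#include "tensorflow/c/tf_status.h"

// Entry point for registering the custom device runtime.
void SE_InitPlugin(SE_PlatformRegistrationParams* params, TF_Status* status) {
  params->platform->name = "MY_DEVICE";
  params->platform->type = "MY_DEVICE";
  // ... fill in SP_Platform / SP_Device / SP_StreamExecutor callbacks for
  // device creation, memory allocation, streams, and memcpy here ...
  TF_SetStatus(status, TF_OK, "");
}

// Entry point for registering kernels that run on the custom device,
// e.g. via TF_NewKernelBuilder("Relu", "MY_DEVICE", ...) followed by
// TF_RegisterKernelBuilder(...).
void TF_InitKernel() {}
```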

With this, developing support for third-party custom devices in TensorFlow is greatly simplified. However, it's not clear whether these plugins can work with TF Serving. I can find documentation for serving TensorFlow models with custom ops, which copies the op source into the Serving project and builds a static library for the op (sketched below). However, I couldn't find anything about custom devices or pluggable devices for TF Serving.
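For concreteness, the custom-op path I'm referring to statically links the op into ModelServer, per the "Serving TensorFlow models with custom ops" guide. The op name `MyAddOne` and its CPU kernel below are hypothetical illustrations of that pattern:

```cpp
// Hypothetical custom op compiled into ModelServer as a static library
// (the serving guide wires it in via tensorflow_serving/model_servers/BUILD).
#include "tensorflow/core/framework/common_shape_fns.h"
#include "tensorflow/core/framework/op.h"
#include "tensorflow/core/framework/op_kernel.h"

using namespace tensorflow;

REGISTER_OP("MyAddOne")
    .Input("x: float")
    .Output("y: float")
    .SetShapeFn(shape_inference::UnchangedShape);

class MyAddOneOp : public OpKernel {
 public:
  explicit MyAddOneOp(OpKernelConstruction* ctx) : OpKernel(ctx) {}
  void Compute(OpKernelContext* ctx) override {
    const Tensor& input = ctx->input(0);
    Tensor* output = nullptr;
    OP_REQUIRES_OK(ctx, ctx->allocate_output(0, input.shape(), &output));
    auto in = input.flat<float>();
    auto out = output->flat<float>();
    for (int64_t i = 0; i < in.size(); ++i) out(i) = in(i) + 1.0f;
  }
};

// Note this only registers a CPU kernel; a custom device would need its own
// kernels, which is exactly what the pluggable device mechanism provides.
REGISTER_KERNEL_BUILDER(Name("MyAddOne").Device(DEVICE_CPU), MyAddOneOp);
```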

I would appreciate any documentation or instructions for serving with custom or pluggable third-party devices. If this is not currently supported, any information on plans for future support would be helpful. (A sketch of how plugin loading looks through TensorFlow's C API follows, to frame the question.)
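On the TensorFlow side, a plugin library can be loaded through the experimental C API, as sketched below; the open question is whether ModelServer offers an equivalent hook. The library path here is a placeholder, and `TF_LoadPluggableDeviceLibrary` is the experimental entry point from `tensorflow/c/c_api_experimental.h`.

```cpp
// Sketch of loading a pluggable device library via TensorFlow's experimental
// C API; whether/how TF Serving exposes this step is the question above.
#include <cstdio>

#include "tensorflow/c/c_api_experimental.h"

int main() {
  TF_Status* status = TF_NewStatus();
  // "libmy_device_plugin.so" is a hypothetical plugin path.
  TF_Library* lib =
      TF_LoadPluggableDeviceLibrary("libmy_device_plugin.so", status);
  if (TF_GetCode(status) == TF_OK) {
    // The plugin's devices and kernels are now registered with the runtime.
    TF_DeletePluggableDeviceLibraryHandle(lib);
  } else {
    std::fprintf(stderr, "load failed: %s\n", TF_Message(status));
  }
  TF_DeleteStatus(status);
  return 0;
}
```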

Thanks

Describe the solution

Make pluggable devices compatible with TF Serving.

Describe alternatives you've considered

I considered custom ops, which can define ops/kernels, but that approach lacks the graph optimization and memory management support a device plugin provides.

Bug Report

If this is a bug report, please fill out the following form in full:

System information

  • OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Linux Ubuntu 22.04
  • TensorFlow Serving installed from (source or binary): Source
  • TensorFlow Serving version: 2.7

Hello... Any update on this issue? Thanks