tensorflow / serving

A flexible, high-performance serving system for machine learning models

Home Page: https://www.tensorflow.org/serving

No unified interface for calling when a model's input is represented by a dict: a different number of parameters changes the way of calling.

IgorHoholko opened this issue · comments

commented

Bug Report

System information

  • OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Ubuntu 22 LTS
  • TensorFlow Serving installed from (source or binary): docker pull tensorflow/serving
  • TensorFlow Serving version: 2.11.0

Describe the problem

I have two models. Both take a dict as input:

def __call__(self, inputs: dict):
   ...

# model 1
# input is { "scene__id": ['1', '2']  }

model1.signatures['serving_default']
>>> <ConcreteFunction signature_wrapper(*, scene__id) at 0x7F355059DD60>

# model 2
# input is { "scene__id": ['1', '2'] , "title": ['1', '2'] }

model2.signatures['serving_default']
>>> <ConcreteFunction signature_wrapper(*, scene__id, title) at 0x7F355059DD60>
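
For context, here is a minimal sketch of how a two-input model like model2 could be exported so that serving_default exposes named inputs. This is not the original model code; the class name and the string-join body are illustrative only.

import tensorflow as tf

class TwoInputModel(tf.Module):
    # input_signature is a dict of TensorSpecs, so the exported signature
    # exposes one named input per key (scene__id, title).
    @tf.function(input_signature=[{
        "scene__id": tf.TensorSpec([None], tf.string, name="scene__id"),
        "title": tf.TensorSpec([None], tf.string, name="title"),
    }])
    def __call__(self, inputs: dict):
        # Placeholder logic; the real models do something model-specific here.
        return {"output": tf.strings.join([inputs["scene__id"], inputs["title"]])}

model = TwoInputModel()
tf.saved_model.save(
    model,
    "/tmp/candidate_model/1",
    signatures={"serving_default": model.__call__.get_concrete_function()},
)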

I serve them both in the following way:

docker run -t --rm -p 8502:8501     -v "/home/oem/Projects/project/candidate_model/:/models/candidate_model"     -e MODEL_NAME=candidate_model     tensorflow/serving &

The problem is that I can't call these two models in the same way.

    import requests
    import json

    # The {} placeholder is filled with the model number before the request is sent.
    url = "http://localhost:8501/v1/models/model{}:predict"

    # Define the request headers
    headers = {"content-type": "application/json"}

    # Works for model 1, doesn't work for model 2
    input_data = {
        "instances": ['2', '2']
    }

    # Doesn't work for model 1 ( [ {"scene__id": '2'} ] ). Works for model 2.
    input_data = {
        "instances":
            [ {"scene__id": '2', "title": '2'} ]
    }

    response = requests.post(url, data=json.dumps(input_data), headers=headers)
    

Hi, @IgorHoholko

Apologies for the delayed response. You are specifying input tensors in row format; use that format only if all named input tensors have the same 0-th dimension, otherwise use the columnar format. I would suggest you have a look at both approaches, 1. specifying input tensors in row format and 2. specifying input tensors in column format, and let us know whether that resolves your issue.

I completely understand your point that the way of calling changes with the number of parameters: for a single feature you can pass the values as a plain list, but with more than one feature you have to map them as key:value pairs, which is why the request body for a multi-feature model looks different.
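
To make the two formats concrete, here is a sketch of both request bodies for the models above; the model name, host, and example values are placeholders:

import json
import requests

headers = {"content-type": "application/json"}

# Row format ("instances"): one object per example. For a model with a single
# named input, the documentation also allows omitting the key and passing the
# bare values.
row_body = {
    "instances": [
        {"scene__id": "1", "title": "a"},
        {"scene__id": "2", "title": "b"},
    ]
}

# Columnar format ("inputs"): one list per named input; use this when the
# named inputs do not all share the same 0-th dimension.
columnar_body = {
    "inputs": {
        "scene__id": ["1", "2"],
        "title": ["a", "b"],
    }
}

url = "http://localhost:8501/v1/models/model2:predict"
response = requests.post(url, data=json.dumps(columnar_body), headers=headers)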

If the issue still persists, please let us know. Thank you!

commented

Hello @gaikwadrahul8 ,

Thanks for your answer!
So if I understood correctly, a solution like this is OK?

if len(arguments) == 1:
	# Form input in one format.
else:
	# Form input in another format.

# Make request
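
For concreteness, here is a sketch of what that branch could look like. The predict helper, its signature, and the feature-dict convention are assumptions for illustration, not part of TensorFlow Serving:

import json
import requests

def predict(model_name: str, features: dict, host: str = "http://localhost:8501"):
    """Send a :predict request, picking the body format from the number of features."""
    url = f"{host}/v1/models/{model_name}:predict"
    headers = {"content-type": "application/json"}
    if len(features) == 1:
        # Single named input: pass the bare values as a list of instances.
        values = next(iter(features.values()))
        body = {"instances": list(values)}
    else:
        # Multiple named inputs: one object per example, keyed by input name.
        keys = list(features.keys())
        body = {"instances": [dict(zip(keys, row)) for row in zip(*features.values())]}
    response = requests.post(url, data=json.dumps(body), headers=headers)
    response.raise_for_status()
    return response.json()

# e.g. predict("model1", {"scene__id": ["1", "2"]})
#      predict("model2", {"scene__id": ["1", "2"], "title": ["1", "2"]})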

Hi, @IgorHoholko

It looks good to me, as long as both of your models work as expected. Have you tried that approach? Thank you!

This issue has been marked stale because it has had no recent activity for 7 days. It will be closed if no further activity occurs. Thank you.

This issue was closed due to lack of activity after being marked stale for the past 7 days.