awslabs / multi-model-server

Multi Model Server is a tool for serving neural net models for inference


Is it possible to implement multi-stage inference?

bedilbek opened this issue · comments

Let's say I want to create an inference server for human pose estimation that consists of two stages: a human proposal stage for human detection, and a pose proposal stage that estimates poses from the given human proposals. How is it possible to create a single endpoint with multi-stage inference?

Actually, it was just a matter of understanding the repository :) I was able to implement a demo version as described here on Medium.
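
For reference, a two-stage pipeline like this can be chained inside a single MMS custom service handler, so both stages sit behind one endpoint. The sketch below is a minimal illustration, assuming GluonCV pretrained models for the detection and pose stages (the thread does not name a framework, so the model names, thresholds, and JSON output format are assumptions); only the `handle(data, context)` entry point and the per-request `data` batch follow the MMS custom-service contract.

```python
# pose_service.py -- a minimal sketch of a two-stage custom service for MMS.
# The GluonCV model choices are illustrative, not prescriptive.
import json

import mxnet as mx
from gluoncv import model_zoo
from gluoncv.data.transforms.presets.yolo import transform_test
from gluoncv.data.transforms.pose import detector_to_simple_pose, heatmap_to_coord


class TwoStagePoseService(object):
    """Stage 1: human proposals (detection). Stage 2: pose estimation per proposal."""

    def __init__(self):
        self.initialized = False

    def initialize(self, context):
        # context.system_properties carries model_dir / gpu_id if needed.
        self.ctx = mx.cpu()
        self.detector = model_zoo.get_model("yolo3_mobilenet1.0_coco",
                                            pretrained=True, ctx=self.ctx)
        # Keep only the "person" class for the human-proposal stage.
        self.detector.reset_class(["person"], reuse_weights=["person"])
        self.pose_net = model_zoo.get_model("simple_pose_resnet18_v1b",
                                            pretrained=True, ctx=self.ctx)
        self.initialized = True

    def inference(self, image_bytes):
        img = mx.image.imdecode(bytearray(image_bytes))
        x, frame = transform_test(img, short=512)

        # Stage 1: human proposals.
        class_ids, scores, boxes = self.detector(x.as_in_context(self.ctx))

        # Stage 2: pose estimation on the proposed boxes.
        pose_input, upscale_bbox = detector_to_simple_pose(frame, class_ids,
                                                           scores, boxes)
        if pose_input is None:  # guard in case no humans are detected
            return {"predictions": []}
        heatmaps = self.pose_net(pose_input.as_in_context(self.ctx))
        coords, confidence = heatmap_to_coord(heatmaps, upscale_bbox)
        return {"predictions": [{"keypoints": c.asnumpy().tolist(),
                                 "confidence": s.asnumpy().tolist()}
                                for c, s in zip(coords, confidence)]}

    def handle(self, data, context):
        # data is a batch: one dict per request, payload under "body" or "data".
        results = []
        for row in data:
            payload = row.get("body") or row.get("data")
            results.append(json.dumps(self.inference(payload)))
        return results


_service = TwoStagePoseService()


def handle(data, context):
    # MMS entry point declared in the model archive; lazy one-time init.
    if not _service.initialized:
        _service.initialize(context)
    if data is None:
        return None
    return _service.handle(data, context)
```

The handler module would then be packaged and served as a single model, e.g. with something like `model-archiver --model-name pose_estimation --handler pose_service:handle ...`, so a client hits one `/predictions/pose_estimation` endpoint and both stages run inside that one request.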