tensorflow / serving

A flexible, high-performance serving system for machine learning models

Home Page: https://www.tensorflow.org/serving


Create special docker images for AVX2/FMA et al support, with special tags

kokroo opened this issue · comments

Feature Request

Describe the problem the feature is intended to solve

TensorFlow Serving should publish Docker images built with AVX2, FMA, and similar CPU extensions enabled, under distinct tags, so that users who know what these extensions are and need them can opt in. Many users and cloud providers now run capable CPUs, and while I understand the default build needs to be widely compatible, it is also important to give power users images that can speed up inference.
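For anyone unsure whether their machine would benefit, a quick way to check (on Linux) is to look for the extensions in the CPU flags reported by `/proc/cpuinfo`; this sketch assumes a Linux host with that file available:

```shell
# Check which SIMD extensions the host CPU advertises (Linux only).
# Reads the "flags" line of /proc/cpuinfo and reports each extension.
for ext in avx avx2 fma; do
  if grep -m1 '^flags' /proc/cpuinfo | grep -qw "$ext"; then
    echo "$ext: supported"
  else
    echo "$ext: not supported"
  fi
done
```

If `avx2` and `fma` show as supported, an image compiled with those extensions should run on that host; otherwise it would crash with an illegal-instruction error.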

Describe the solution

Add Docker images with these CPU extensions enabled to the CI/CD pipeline, so that users who know their CPU's capabilities can deliberately pull the specially tagged images.
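As a sketch of how this could look for users, a tag suffix could distinguish the optimized builds from the portable default; note the `-avx2` tag below is hypothetical and does not exist on Docker Hub today:

```shell
# Portable default build (exists today).
docker pull tensorflow/serving:latest

# Proposed AVX2/FMA-enabled build (hypothetical tag, does not exist yet).
docker pull tensorflow/serving:latest-avx2
```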

Describe alternatives you've considered

An alternative is maintaining your own fork, but that is undesirable, and not every data scientist has the skills to do it.

@kokroo,

You can try building TensorFlow Serving from source, and if you are interested in optimized builds, you can follow the optimized build section to target platform-specific instruction sets for your processor. It is also possible to compile with specific instruction sets enabled (e.g. AVX, AVX2, FMA).
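For reference, the TensorFlow Serving docs describe passing build options into the development Docker image via the `TF_SERVING_BUILD_OPTIONS` build argument; the specific compiler flags below are illustrative and should be matched to your CPU:

```shell
# Build an optimized TensorFlow Serving development image from a checkout
# of the tensorflow/serving repository. The copt flags (AVX2/FMA here)
# are examples; choose flags your target CPU actually supports.
docker build --pull -t $USER/tensorflow-serving-devel \
  --build-arg TF_SERVING_BUILD_OPTIONS="--copt=-mavx2 --copt=-mfma" \
  -f tensorflow_serving/tools/docker/Dockerfile.devel .
```

The resulting binary will only run on hosts whose CPUs support the chosen extensions, which is exactly why the official default images avoid them.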

Thank you!

@singhniraj08 Hello,

I can definitely do that, but for most people it is not convenient. Many users are data scientists who just run a Docker image, and it would be very convenient if TensorFlow published Docker images, under special tags, that are already compiled to use those CPU instructions.

commented

I tried building TensorFlow Serving from source, but the build failed. It would be really convenient if TensorFlow published Docker images with special tags (e.g. AVX, AVX2, FMA).

Great idea!