ServiceNow / multithreaded-estimators

Multithreaded inference in TensorFlow Estimators. This is a ServiceNow Research project that was started at Element AI.

ServiceNow completed its acquisition of Element AI on January 8, 2021. All references to Element AI in the materials that are part of this project should refer to ServiceNow.

Multithreaded-estimators

Code demonstrating how to use multithreading to speed up inference for TensorFlow estimators.
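The core idea is to keep a single long-lived prediction loop running and feed it from a thread-safe queue, so each request does not pay the cost of restarting the loop (with a real Estimator, that cost is rebuilding the graph and reloading the checkpoint on every `predict` call). A minimal sketch of the pattern, using a plain function in place of the Estimator (the class name and the stand-in model are illustrative assumptions, not the project's API):

```python
import queue
import threading

class ThreadedPredictor:
    """Sketch of the queue-fed generator pattern: one long-lived
    prediction loop serves many caller threads. In the real project the
    loop would iterate `estimator.predict(input_fn)`; here `predict_one`
    is a plain stand-in function (an assumption for illustration)."""

    def __init__(self, predict_one):
        self._inputs = queue.Queue()
        self._predict_one = predict_one
        self._thread = threading.Thread(target=self._run, daemon=True)
        self._thread.start()

    def _input_generator(self):
        # Blocks until a caller submits work; a None sentinel ends the loop.
        while True:
            item = self._inputs.get()
            if item is None:
                return
            yield item

    def _run(self):
        # Single long-lived loop: with a real Estimator this is what avoids
        # re-initializing the model for every request.
        for features, result_q in self._input_generator():
            result_q.put(self._predict_one(features))

    def predict(self, features):
        # Thread-safe entry point: enqueue the input together with a
        # per-call result queue, then block until the result arrives.
        result_q = queue.Queue()
        self._inputs.put((features, result_q))
        return result_q.get()

    def close(self):
        # Signal the prediction loop to exit.
        self._inputs.put(None)
```

Usage from any number of threads is then just `predictor.predict(features)`; the per-call result queue is what routes each answer back to the caller that submitted it.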

Installation

A Dockerfile is provided. First build the image from the root directory:

docker build . -t threaded

Then run the tests:

docker run threaded

License

This code is released under the Apache 2.0 license. See the license in full.
