Lijiadong / modelstat

๐Ÿ“๏ธ modelstat is CLI tool to summarize the Machine Learning model layer inference time.

modelstat

modelstat is a CLI tool to summarize the per-layer inference time of Machine Learning models. It is based on NVIDIA DLProf; see the documentation for more information. The results produced by modelstat are similar to those of PyProf, the PyTorch profiling tool, but as of June 30th, 2021 NVIDIA no longer contributes to the PyProf repository. If you want to see PyProf results, refer to [example/pyprof].

  • Supported frameworks:
    • PyTorch
    • TensorRT
    • TensorFlow (TBD)

CLI command

modelstat provides the following functions:

  1. show GPU information: detects your GPU index and name; useful when you need to choose a GPU for an inference task.
  2. profile: summarizes the resources used by every layer in a Machine Learning model, e.g. duration, API call time, etc.

usage:
modelstat [-h] [-c COMMAND] [-t {pytorch,tensorrt}] [-o OUTPUT_FILENAME] [-f {json,csv}] [-g GPU_IDX] [-l] function

show GPU information

modelstat gpuinfo

profile

modelstat profile -t <model_type>

| short | long | possible parameters | default | description |
| --- | --- | --- | --- | --- |
| -t | --model_type | pytorch, tensorrt | - | choose the model type |
| -o | --output | - | layer_inference_result | set the summary file name |
| -f | --format | json, csv | write to file in a simple format | set the summary file format |
| -g | --gpu | - | 0 | set the GPU index used for inference |
| -l | --log | - | True | whether to keep log files: if -l is given, logs are kept; otherwise all log files are deleted |
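As an illustration of combining the options above, a profiling run for a PyTorch model might look like the following (the output name and GPU index are arbitrary choices, not defaults beyond those listed in the table):

```shell
# Profile a PyTorch model on GPU 0, write the summary to
# layer_result.csv, and keep the DLProf log files (-l).
modelstat profile -t pytorch -o layer_result -f csv -g 0 -l
```

Whether additional arguments (such as the `-c COMMAND` option shown in the usage line) are required depends on how your inference task is launched.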

Docker

  • build:

    $ docker build -f Dockerfile -t <image>:<tag> .
    
  • run:

    • It is important to pass --gpus all so the GPUs are exposed to the Docker container!
    $ docker run -it --gpus all <image>:<tag>
    

Example

I provide some examples of how to use this tool to summarize layer information in a Machine Learning model. All examples live under the example folder, and each comes with a shell script that automates installing the required Machine Learning model, running the inference task, and finally writing the layer information to a file.

steps:

  1. clone the repo
  2. use the Dockerfile to build the image and start a container.
  3. change directory to the example you want, e.g. cd example/tensorrt/resnet50 to see how to use the tool with a TensorRT engine.
  4. every example has an executable run.sh; just run ./run.sh to see the result.
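Put together, the steps above might look like this (the repository URL is a placeholder; substitute the actual clone URL):

```shell
# 1. clone the repo (placeholder URL)
git clone <repo-url> && cd modelstat
# 2. build the image and start a container with GPUs exposed
docker build -f Dockerfile -t modelstat:latest .
docker run -it --gpus all modelstat:latest
# 3–4. inside the container: pick an example and run it
cd example/tensorrt/resnet50
./run.sh
```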

Dependency

About

๐Ÿ“๏ธ modelstat is CLI tool to summarize the Machine Learning model layer inference time.

License: MIT License

