microsoft / nn-Meter

A DNN inference latency prediction toolkit for accurately modeling and predicting the latency on diverse edge devices.


Refine setup process

mydmdm opened this issue

  • manage the default device configuration
  • clarify / check dependencies
    • my suggestion is to exclude frameworks (e.g., tensorflow / pytorch / nni) from the installation dependencies, since a user may only use one of them
    • check the framework installation and version on demand when the user imports a model
  • add an entry_point to support command-line usage
  • download predictors (data / test data) to ~/.nn_meter/data
  • add --verbose to print debug-level information (e.g., the divided kernel information)
    • refine logging level
    • use two buffers to hold logging output
  • add --tensorflow etc. to specify the model type
  • try model compression with gzip
  • specify parameter types in function declarations
  • rename nn_meter/utils/graphe_tool.py to graph_tool.py and its class to Graph
    • change all occurrences of graphe to graph and check their usages
  • add requirements in setup.py
  • code testing
    • .pb model
    • .onnx model
    • hook up a GitHub Action via a .yml workflow
    • set a cache policy to avoid redundant downloading in tests (leaving this to the next PR)