Refine setup process 2
JiahangXu opened this issue
- support batch mode in the command line, such as `nn-meter --onnx <file-or-folder> --predictor <predictor-name>`
- unify naming: in code: `predictor-name`; in docs: hardware; in README: device + inference framework
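The batch-mode item above can be sketched in Python: resolve the `<file-or-folder>` argument into a list of model files, falling back to single-file behavior. The function and parameter names here are assumptions for illustration, not nn-meter's actual implementation.

```python
from pathlib import Path

def collect_models(path_str, suffix=".onnx"):
    """Return model files for a single file or for a folder of models.

    Hypothetical helper: if the path is a folder, batch mode collects
    every matching model inside it; otherwise it behaves as before.
    """
    path = Path(path_str)
    if path.is_dir():
        # batch mode: run the predictor on every matching model in the folder
        return sorted(path.glob(f"*{suffix}"))
    # single-file mode: predict just the one model
    return [path]
```

With this shape, `nn-meter --onnx models/ --predictor <predictor-name>` and `nn-meter --onnx models/net.onnx --predictor <predictor-name>` go through the same code path.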
- refine the integration test by using batch mode
- complete the IR model test in code
- install Ubuntu to support nni development and install the latest nni package
  - reset the python env in Ubuntu
  - install nni
  - test `multi-trail.py` in nni
- fix the nni-ir model in nn-meter
- refine readme.md and give instructions for testing
- report an issue found in nni while testing nn-Meter
- add the integration test of the nni-ir graph (the current nni release does not support the nn-Meter module; we may have to wait for the next nni release)
- refine the API and the model type list in nn-meter.py
  - change model type `--torch` to `--torchvision`
  - change model types to `--nni-ir` and `--nnmeter-ir`
  - predict a torch model by calling its name
  - support batch mode for torch models
  - add a `--torchvision` test in `integration_test_torch.py` (intended to run in parallel to save time)
- refine readme.md
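The "predict a torch model by calling its name" and "support batch mode for torch models" items above could be combined by letting one flag accept several model names. The parser below is a hedged sketch, not nn-meter's real CLI code; the flag names mirror the task list.

```python
import argparse

def build_parser():
    # Hypothetical CLI sketch: --torchvision takes one or more model
    # names, so several torch models can be predicted in one invocation.
    parser = argparse.ArgumentParser(prog="nn-meter")
    parser.add_argument("--torchvision", nargs="+", metavar="MODEL_NAME",
                        help="one or more torchvision model names, e.g. resnet18")
    parser.add_argument("--predictor", required=True,
                        help="hardware predictor name")
    return parser
```

For example, `nn-meter --torchvision resnet18 vgg16 --predictor <predictor-name>` would yield `args.torchvision == ["resnet18", "vgg16"]`, and the handler could then look each name up in `torchvision.models`.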
- change the `torchvision` type to `torch`, considering the `nn.Module` in the python binding
- add a new usage `--getir` to get the nn-meter IR model for tensorflow and onnx models
  - test the `--getir` usage: add a `--getir` usage test in `integration_test.py`
- edit README.md
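A `--getir` handler would presumably end by serializing the IR graph to disk. The sketch below assumes the IR graph is a plain dict and invents `save_ir` and its signature for illustration; it is not nn-meter's actual code.

```python
import json
from pathlib import Path

def save_ir(ir_graph, model_path):
    """Write the nn-meter IR graph as JSON next to the input model.

    Hypothetical helper: model.onnx -> model.ir.json.
    """
    out_path = Path(model_path).with_suffix(".ir.json")
    with open(out_path, "w") as f:
        json.dump(ir_graph, f, indent=2)
    return out_path
```

Writing the IR to a sibling `.ir.json` file keeps `--getir` output easy to find and easy to feed back in via `--nnmeter-ir`.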
- edit the API in NNI (remove the default config)
- add nn-meter to the related projects of nni
- change `'nni'` to `'nni-ir'`
- arrange docs
  - make the docs public
- open a PR to refresh the data release link in the config
- add a hardware device attribute in `config/predictors.yaml`, and refine the hard-coded category in predictor loading (category: cpu)
- add a cache in the integration test
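The hardware attribute item could give `config/predictors.yaml` a shape like the fragment below, so predictor loading reads the category from the config instead of hard-coding `cpu`. The field names and entries are assumptions about how the file might be structured, not its actual contents.

```yaml
# Hypothetical shape for config/predictors.yaml: each predictor carries
# an explicit hardware category.
predictors:
  - name: cortexA76cpu_tflite21
    category: cpu
    version: 1.0
  - name: adreno640gpu_tflite21
    category: gpu
    version: 1.0
```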
- split the integration test into 4 `.yml` files (this item can be skipped if we support batch mode in the command line and the integration test finishes in an acceptable time)
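If the split is done, each of the 4 workflow files could follow a shape like this sketch; the file name, triggers, and the `--onnx` flag on `integration_test.py` are assumptions for illustration.

```yaml
# Hypothetical .github/workflows/integration-test-onnx.yml: one workflow
# per model format, so the four runs can proceed in parallel.
name: integration-test-onnx
on: [push, pull_request]
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - run: pip install -r requirements.txt
      - run: python integration_test.py --onnx
```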
here is a batch mode example: `nn-meter --onnx --predictor <predictor-name> <file-or-folder>`