onnx / onnx

Open standard for machine learning interoperability

Home Page: https://onnx.ai/

Call for Contributions

houseroad opened this issue · comments

ONNX is open source, and contributions from the community are welcome. Many tasks (some of them low-hanging fruit) are listed here. If you would like to contribute to the project but don't know where to start, this list is worth checking.

Adding ONNX Backend Test Case

Ideally, we should have sample cases for every operator; these can generate backend test data and also serve as examples. However, many operators have no sample cases and are not even covered by ONNX backend test cases. Please contribute your cases to the ONNX repository.

How to contribute an ONNX backend test
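
For reference, here is a minimal sketch of what such a node test case can look like, modeled on the existing cases under onnx/backend/test/case/node. The module paths and helper names (`Base`, `expect`) are taken from that layout and may differ in newer versions of the repository.

```python
import numpy as np
import onnx

# Assumed paths, based on the layout of onnx/backend/test/case/.
from onnx.backend.test.case.base import Base
from onnx.backend.test.case.node import expect


class Abs(Base):

    @staticmethod
    def export():
        # Build a single Abs node, feed it random data, and record the
        # expected output; `expect` turns this into backend test data
        # that also shows up as an example in the operator docs.
        node = onnx.helper.make_node(
            'Abs',
            inputs=['x'],
            outputs=['y'],
        )
        x = np.random.randn(3, 4, 5).astype(np.float32)
        y = np.abs(x)
        expect(node, inputs=[x], outputs=[y], name='test_abs')
```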

The operators that don't have examples:

  • Abs
  • Add
  • And
  • ArgMax
  • ArgMin
  • AveragePool
  • BatchNormalization
  • Cast
  • Ceil
  • Clip
  • Concat
  • Conv
  • ConvTranspose
  • DepthToSpace
  • Div
  • Dropout
  • Elu
  • Equal
  • Exp
  • Flatten
  • Floor
  • GRU
  • Gather
  • Gemm
  • GlobalAveragePool
  • GlobalLpPool
  • GlobalMaxPool
  • Greater
  • HardSigmoid
  • Hardmax
  • InstanceNormalization
  • LRN
  • LSTM
  • LeakyRelu
  • Less
  • Log
  • LogSoftmax
  • LpNormalization
  • LpPool
  • Max
  • MaxPool
  • MaxRoiPool
  • Mean
  • Min
  • Mul
  • Neg
  • Not
  • Or
  • PRelu
  • Pow
  • RNN
  • RandomNormal
  • RandomNormalLike
  • RandomUniform
  • RandomUniformLike
  • Reciprocal
  • ReduceL1
  • ReduceL2
  • ReduceLogSum
  • ReduceLogSumExp
  • ReduceMax
  • ReduceMean
  • ReduceMin
  • ReduceProd
  • ReduceSum
  • ReduceSumSquare
  • Reshape
  • Selu
  • Sigmoid
  • Softmax
  • Softplus
  • Softsign
  • SpaceToDepth
  • Split
  • Sqrt
  • Squeeze
  • Sub
  • Sum
  • Tanh
  • Tile
  • TopK
  • Transpose
  • Unsqueeze
  • Xor
  • experimental ATen
  • experimental Affine
  • experimental ConstantFill
  • experimental Crop
  • experimental FC
  • experimental GRUUnit
  • experimental GivenTensorFill
  • experimental Identity
  • experimental If
  • experimental ImageScaler
  • experimental Loop
  • experimental LoopIndexTensor
  • experimental MeanVarianceNormalization
  • experimental ParametricSoftplus
  • experimental Scale
  • experimental ScaledTanh
  • experimental ThresholdedRelu
  • experimental Upsample

Adding Export Support in PyTorch

Some simple operators are not supported by the PyTorch exporter yet; contributions are welcome.

How to add ONNX support in PyTorch

How to generate ONNX backend test data in PyTorch operator export test

Here is a TO-DO list:

  • Tile (Repeat in PyTorch)
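
As a starting point, here is a rough sketch of what a symbolic function for exporting torch.Tensor.repeat as ONNX Tile could look like, following the pattern used in torch/onnx/symbolic.py. Treating `repeats` as a plain Python list of ints and passing it through a Constant node is an assumption; the real mapping between repeat and Tile may need extra handling, for example padding the repeat counts to the input rank.

```python
import torch

# Hypothetical symbolic for torch.Tensor.repeat -> ONNX Tile, written in the
# style of the functions in torch/onnx/symbolic.py. Assumes `repeats` is a
# plain Python list of ints, which may not cover every call pattern.
def repeat(g, self, repeats):
    # Wrap the repeat counts in a Constant node so Tile receives them as a
    # tensor input rather than as attributes.
    repeats = g.op("Constant", value_t=torch.LongTensor(repeats))
    return g.op("Tile", self, repeats)
```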

@guoyuhong, thanks for contributing the node tests for And, Or, Xor, and Not!

To avoid duplicate work: ReduceMax, ReduceMin, ReduceSum, Conv, Flatten, and the generator ops are already being taken care of.

I am adding tests for some logical and math operators:

  • Equal, Greater, Less
  • Ceil, Div, Floor, LeakyRelu, Mul, Neg, Reciprocal, Selu, Sqrt, Sub

Working on ArgMax and ArgMin. Will try to work through the list as I understand the source.

I was reading through the source of the backend stub, test_backend_test.py; it only checks the proto format. How can I set up the full ONNX reference implementation for the ops? Do I need to set it up with a supported backend (caffe2/pytorch)?

@kevin645 Thanks a lot! I marked ArgMax and ArgMin as you picked them up. After you finish the test case, you can run cmd_tools.py to generate the data for the backend test; the test data will be generated in the onnx/backend/test/data folder. test_backend_test.py only runs the checker on the test data. To get a full reference execution on the generated data, you need to install a backend, such as caffe2 with onnx-caffe2, or CNTK. (Right now, PyTorch can only export models; there is no importer yet.)
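
For example, a minimal sketch of running a model through the caffe2 backend with onnx-caffe2 (the file name and input shape below are placeholders, and the import path reflects the standalone onnx-caffe2 package, which later moved into caffe2 itself):

```python
import numpy as np
import onnx
import onnx_caffe2.backend  # standalone onnx-caffe2 package

# Load an exported ONNX model and execute it with the caffe2 backend.
model = onnx.load('model.onnx')            # placeholder path
rep = onnx_caffe2.backend.prepare(model)   # build the caffe2 representation
outputs = rep.run(np.random.randn(1, 3, 224, 224).astype(np.float32))
print(outputs[0])
```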

Cast, Reshape, Concat, Split, and Transpose will be covered; @houseroad please mark them to avoid duplication.

#472 and #464 are covering math ops btw.

Gather was added in #537, thanks to @huitseeker.

@houseroad I am putting together an R package for ONNX here. Would you like the ownership to be transferred to the ONNX GitHub organization? It would help with the exposure of the package later on.

Edit: let's discuss this in #581.

Have the tests for the following operators - ReduceMax, ReduceMin, ReduceSum, Conv, Flatten - been merged? If no one is doing ReduceMean and ReduceProd, I can take them up. Please confirm.

As far as I know, no one is working on ReduceMax, ReduceMin, ReduceSum, ReduceMean, ReduceProd yet. Feel free to take them. @anirudhacharya :-)

@houseroad This comment - #426 (comment) - says that someone is working on ReduceMax, ReduceMin, and ReduceSum, and the checklist in the description of this issue has these three operators marked as done.

I will work on the ReduceProd and ReduceMean operators for now.

@anirudhacharya sounds good to me. Thanks!

@onnx/facebook-onnx-contributors Can we revisit this list and verify that it is up to date? I have made a few updates, but it might still be a bit out of sync.
Also, when do we tick off an operator in the list:

  • When someone starts working on it, or when the PR gets merged?
  • If a particular operator has tests in pytorch-converted/pytorch-operator but does not have node tests, do we consider that full test coverage?

A1: When someone starts working on it, we will mark the op.

A2: No, we still need node tests, which also serve as documentation.

Then what is the purpose of the pytorch-converted/pytorch-operator tests?

Providing more cases to check the backend's correctness. :-)

I am working on the BatchNormalization, Gemm, and LRN operator tests.

@houseroad I suggest we close this old issue and, if needed, open a new one with what remains.

@prasanthpul okay, close it