convolution with nchw format
fwz-fpga opened this issue
fwz-fpga commented
XNNPACK/src/operators/convolution-nchw.c
Lines 166 to 172 in 0fd983b
Does this mean that XNNPACK supports convolution in NCHW format only for these cases? Will XNNPACK support more NCHW-optimized kernels?
ONNX models exported from PyTorch default to NCHW format, so running an ONNX model with XNNPACK requires a lot of extra work (code) for layout conversion around operators such as concat/slice/gather.
Any suggestions?
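The layout mismatch described above can be sketched with NumPy (a hypothetical illustration, not XNNPACK code): converting an NCHW tensor to the NHWC layout that XNNPACK's dense kernels expect is a 4-D transpose, and it is this kind of conversion that ends up inserted around every operator that runs in the other layout.

```python
import numpy as np

# Hypothetical example tensor in NCHW layout (batch, channels, height, width),
# the default layout for models exported from PyTorch to ONNX.
nchw = np.arange(1 * 3 * 4 * 5, dtype=np.float32).reshape(1, 3, 4, 5)

# XNNPACK's dense kernels operate on NHWC (batch, height, width, channels),
# so every tensor crossing the layout boundary needs a transpose like this.
nhwc = nchw.transpose(0, 2, 3, 1)

print(nhwc.shape)  # (1, 4, 5, 3)

# The same element moves from index [n, c, h, w] to index [n, h, w, c].
assert nchw[0, 2, 1, 3] == nhwc[0, 1, 3, 2]
```

In a real converter these transposes are what shows up as the extra concat/slice/gather-style glue code around unsupported operators.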
Marat Dukhan commented
XNNPACK supports only a very limited set of NCHW operators for sparse inference. See here for details. There are no plans for extending NCHW operator support.