xboot / libonnx

A lightweight, portable pure C99 onnx inference engine for embedded devices with hardware acceleration support.

Running test/model/test_mnist_8 issue

noomio opened this issue · comments

Hi,

When I run test/model/test_mnist_8 the first time, it works and I get an OKAY result.
When I re-run it, it FAILS.

Any suggestion as to why this might happen, and what to look for?

This is some debug output:

$> datatx
[datatx/private] [FAIL]

Input3: float32[1 x 1 x 28 x 28] = [...]
: float32[1 x 10] = [...]
a:Plus214_Output_0, b:

fabs=0.000122,p=975.670227,q=975.670105
fabs=0.000305,p=975.670227,q=975.670105
fabs=0.000488,p=975.670227,q=975.670105
fabs=0.000793,p=975.670227,q=975.670105
fabs=0.000122,p=975.670227,q=975.670105
fabs=0.000122,p=975.670227,q=975.670105
fabs=0.000244,p=975.670227,q=975.670105
fabs=0.000229,p=975.670227,q=975.670105
fabs=0.000793,p=975.670227,q=975.670105
fabs=0.000244,p=975.670227,q=975.670105

datatx/test_mnist_8 [OKAY]

$> datatx
[datatx/private] [FAIL]

Input3: float32[1 x 1 x 28 x 28] = [...]
: float32[1 x 10] = [...]
a:Plus214_Output_0, b:

!ONNX_TENSOR_TYPE_FLOAT32,fabs=1472.012207,p=-496.342163,q=975.670105

datatx/test_mnist_8 [FAIL]

Looks like it's a buffer-alignment issue with malloc.