Inference time increases with custom model training.
asguradian opened this issue · comments
Compared to the base model I used for transfer learning, the model I exported after custom object training has a higher inference time. Is this normal behaviour?
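To make the comparison concrete, here is a minimal, framework-agnostic sketch of how one might measure average inference latency for the two models. The `infer_fn` callables are hypothetical stand-ins; in practice you would replace them with the actual prediction call for the base and custom-trained models (e.g. a session run or SavedModel invocation).

```python
import time

def benchmark(infer_fn, n_warmup=5, n_runs=50):
    """Average wall-clock latency of infer_fn, excluding warm-up runs."""
    for _ in range(n_warmup):
        infer_fn()  # warm-up: exclude one-time setup/caching costs
    start = time.perf_counter()
    for _ in range(n_runs):
        infer_fn()
    return (time.perf_counter() - start) / n_runs

# Hypothetical placeholder workloads; swap in real model inference calls.
base_latency = benchmark(lambda: sum(range(1000)))
custom_latency = benchmark(lambda: sum(range(5000)))
print(f"base: {base_latency:.6f}s  custom: {custom_latency:.6f}s")
```

Warm-up runs matter because the first few inferences often include graph construction or cache population, which would otherwise skew the comparison.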
Hi @asguradian, this repo relies on a dated version of the TensorFlow API. We've moved to a more future-proof version here: https://github.com/cloud-annotations/training
I encourage you to try it out and reopen this issue there if you are still running into problems.