experiencor / keras-yolo3

Training and Detecting Objects with YOLO3


Question about mAP calculation

brian123214 opened this issue · comments

In utils.utils, the function evaluate looks like this:

def evaluate(model, generator, iou_threshold=0.5, obj_thresh=0.5, nms_thresh=0.45, net_h=416, net_w=416, save_path=None):
My question is: is this calculating AP at a single IoU threshold of 0.5, or AP as it is calculated for the COCO dataset, which is slightly different?
https://jonathan-hui.medium.com/map-mean-average-precision-for-object-detection-45c121a31173
How would you implement the other AP calculation that is not used in this repo?
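For reference, COCO-style AP averages the AP over ten IoU thresholds (0.50 to 0.95 in steps of 0.05) rather than using the single 0.5 threshold. A minimal sketch of that averaging is below; it is not part of this repo, and `evaluate_at_iou` is a hypothetical callable assumed to return per-class APs at a given IoU threshold (e.g. a thin wrapper around this repo's `utils.utils.evaluate` called with `iou_threshold=t`):

```python
import numpy as np

def coco_map(evaluate_at_iou, iou_thresholds=None):
    """COCO-style mAP: average AP over IoU thresholds 0.50:0.05:0.95.

    evaluate_at_iou(t) is assumed to return a dict mapping class label
    to AP computed at IoU threshold t.
    """
    if iou_thresholds is None:
        # The ten thresholds used by the COCO evaluation protocol
        iou_thresholds = np.arange(0.5, 1.0, 0.05)
    per_threshold = []
    for t in iou_thresholds:
        aps = evaluate_at_iou(t)
        per_threshold.append(np.mean(list(aps.values())))  # mAP at this IoU
    # Final score is the mean over all IoU thresholds
    return float(np.mean(per_threshold))
```

With this repo, the wrapper might look like `coco_map(lambda t: evaluate(model, generator, iou_threshold=t))` — though that re-runs inference per threshold; a faster implementation would match detections to ground truth once and reuse the IoU matrix across thresholds.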