Object Detection Metrics


This project was forked from rafaelpadilla/Object-Detection-Metrics.

Development of object_detection_metrics happens on GitHub: https://github.com/yfpeng/object_detection_metrics

The latest object_detection_metrics releases are available on PyPI.

Getting started

Installing object_detection_metrics

$ pip install object_detection_metrics

Reading a COCO file

from podm import coco_decoder
with open('tests/sample/groundtruths_coco.json') as fp:
    gold_dataset = coco_decoder.load_true_object_detection_dataset(fp)

PASCAL VOC Metrics

from podm import coco_decoder
from podm.metrics import get_pascal_voc_metrics, MetricPerClass, get_bounding_boxes

with open('tests/sample/groundtruths_coco.json') as fp:
    gold_dataset = coco_decoder.load_true_object_detection_dataset(fp)
with open('tests/sample/detections_coco.json') as fp:
    pred_dataset = coco_decoder.load_pred_object_detection_dataset(fp, gold_dataset)

gt_BoundingBoxes = get_bounding_boxes(gold_dataset)
pd_BoundingBoxes = get_bounding_boxes(pred_dataset)
results = get_pascal_voc_metrics(gt_BoundingBoxes, pd_BoundingBoxes, .5)  # IoU threshold = 0.5

ap, precision, recall, tp, fp, etc.

for cls, metric in results.items():
    label = metric.label  # class label
    print('ap', metric.ap)
    print('precision', metric.precision)
    print('interpolated_recall', metric.interpolated_recall)
    print('interpolated_precision', metric.interpolated_precision)
    print('tp', metric.tp)
    print('fp', metric.fp)
    print('num_groundtruth', metric.num_groundtruth)
    print('num_detection', metric.num_detection)

mAP

from podm.metrics import MetricPerClass
mAP = MetricPerClass.mAP(results)
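
Here mAP is the mean of the per-class average precisions (the ap values) computed above.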

IoU

from podm.box import Box, intersection_over_union

box1 = Box.of_box(0., 0., 10., 10.)
box2 = Box.of_box(1., 1., 11., 11.)
intersection_over_union(box1, box2)
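
For these two 10 × 10 boxes the overlapping region is 9 × 9, so the expected value is 81 / 119 ≈ 0.68.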

Official COCO Eval

from pycocotools.coco import COCO
from pycocotools.cocoeval import COCOeval

coco_gld = COCO('tests/sample/groundtruths_coco.json')
coco_rst = coco_gld.loadRes('tests/sample/detections_coco.json')
cocoEval = COCOeval(coco_gld, coco_rst, iouType='bbox')
cocoEval.evaluate()
cocoEval.accumulate()
cocoEval.summarize()
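
This snippet relies on the official pycocotools package, which is installed separately (for example, pip install pycocotools).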

Implemented metrics

Tutorial

  • Intersection Over Union (IOU)
  • TP and FP
    • True Positive (TP): IOU ≥ IOU threshold (default: 0.5)
    • False Positive (FP): IOU < IOU threshold (default: 0.5)
  • Precision and Recall
  • Average Precision (a minimal worked sketch follows this list)
    • 11-point AP
    • all-point AP
  • Official COCO Eval
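
The sketch below illustrates, under simplified assumptions, how these pieces fit together: detections sorted by confidence are marked TP or FP at an IoU threshold, cumulative precision and recall are derived from those counts, and AP is computed with 11-point interpolation. It is a standalone toy example with made-up counts; the eleven_point_ap helper and the numbers are hypothetical and not podm's internal implementation.

import numpy as np

def eleven_point_ap(recall, precision):
    # 11-point interpolated AP: average the maximum precision found at
    # recall levels >= r, for r in {0.0, 0.1, ..., 1.0}.
    ap = 0.0
    for r in np.linspace(0.0, 1.0, 11):
        mask = recall >= r
        p = precision[mask].max() if mask.any() else 0.0
        ap += p / 11
    return ap

# Toy inputs: 5 detections already sorted by confidence (1 = TP, 0 = FP,
# decided by the IoU threshold), with 4 ground-truth boxes in total.
tp = np.array([1, 1, 0, 1, 0])
fp = 1 - tp
num_groundtruth = 4

cum_tp = np.cumsum(tp)
cum_fp = np.cumsum(fp)
precision = cum_tp / (cum_tp + cum_fp)  # TP / (TP + FP) after each detection
recall = cum_tp / num_groundtruth       # TP / number of ground-truth boxes

print('precision', precision)
print('recall', recall)
print('11-point AP', eleven_point_ap(recall, precision))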

License

Copyright BioNLP Lab at Weill Cornell Medicine, 2022.

Distributed under the terms of the MIT license, this is free and open source software.
