Computer Vision Systems: Evaluation (mAP, IoU, Precision-Recall)

What is Intersection over Union (IoU) and Why Does It Matter?

Definition
Intersection over Union (IoU) measures how well a predicted bounding box matches the ground truth box. It is the area of overlap divided by the area of union: IoU = (Area of Overlap) / (Area of Union). IoU ranges from 0 (no overlap) to 1 (perfect match).

WHY SIMPLE METRICS FAIL

You cannot evaluate detection by counting "correct" predictions alone. A box that overlaps 10% of a car is technically a detection, but useless. A box twice the size of the object is also a detection, but wasteful. IoU quantifies localization quality, not just presence.

IOU THRESHOLDS

A detection is considered "correct" if its IoU with a ground-truth box exceeds a threshold.
IoU=0.5: The PASCAL VOC standard. Lenient; accepts fairly loose localization.
IoU=0.75: Stricter; requires tight localization.
IoU=0.5:0.95: The COCO standard; averages AP across 10 thresholds (0.5, 0.55, ..., 0.95) to reward both detection and precise localization.

💡 Key Insight: COCO mAP (averaging 0.5:0.95) is roughly half of PASCAL mAP (0.5 only) for the same model. Always specify which protocol you are using.
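The COCO-style averaging can be sketched in a few lines. This is a minimal illustration, not the official COCO evaluator: `ap_at_iou` is a hypothetical callable standing in for a full per-threshold AP computation.

```python
def coco_map(ap_at_iou):
    """Average a per-threshold AP function over the 10 COCO IoU thresholds.

    `ap_at_iou(t)` is a placeholder for computing AP at IoU threshold t.
    """
    # The 10 COCO thresholds: 0.50, 0.55, ..., 0.95
    thresholds = [0.5 + 0.05 * i for i in range(10)]
    return sum(ap_at_iou(t) for t in thresholds) / len(thresholds)

# Toy example: AP that falls off linearly as the threshold tightens
print(coco_map(lambda t: max(0.0, 1.0 - t)))  # → 0.275
```

The toy AP function makes the key insight concrete: because AP drops as the IoU requirement tightens, the 0.5:0.95 average is well below the single-threshold 0.5 score.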

CALCULATING IOU

Given two boxes a and b with corners (x1, y1, x2, y2): intersection area = max(0, min(x2a, x2b) - max(x1a, x1b)) × max(0, min(y2a, y2b) - max(y1a, y1b)). Union area = area_a + area_b - intersection. Computing IoU is cheap: roughly 10-20 arithmetic operations per box pair.
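The formula above translates directly into code. A minimal sketch (assuming (x1, y1, x2, y2) corner format with x1 < x2 and y1 < y2):

```python
def iou(box_a, box_b):
    """Intersection over Union of two axis-aligned boxes in (x1, y1, x2, y2) format."""
    # Intersection rectangle, clamped to zero width/height if the boxes don't overlap
    inter_w = max(0.0, min(box_a[2], box_b[2]) - max(box_a[0], box_b[0]))
    inter_h = max(0.0, min(box_a[3], box_b[3]) - max(box_a[1], box_b[1]))
    intersection = inter_w * inter_h

    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - intersection
    return intersection / union if union > 0 else 0.0

# Two 10x10 boxes offset by 5 in x: overlap = 5*10 = 50, union = 100 + 100 - 50 = 150
print(iou((0, 0, 10, 10), (5, 0, 15, 10)))  # → 0.333...
```

The `max(0, ...)` clamps are what make the formula robust: disjoint boxes yield a zero-area intersection rather than a negative one.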

💡 Key Takeaways
IoU = overlap area / union area; ranges 0 (no overlap) to 1 (perfect match)
IoU thresholds: 0.5 (PASCAL, lenient), 0.75 (strict), 0.5:0.95 (COCO, comprehensive)
COCO mAP is roughly half of PASCAL mAP for the same model due to stricter thresholds
IoU computation is cheap: 10-20 operations per box pair
📌 Interview Tips
1. When explaining IoU, draw two overlapping boxes and calculate overlap/union areas step by step
2. Clarify the threshold difference: PASCAL 0.5 is lenient, COCO 0.5:0.95 rewards precise localization
3. Mention that COCO mAP ≈ 0.5 × PASCAL mAP to avoid comparing apples to oranges