A common way to determine whether an object proposal was correct in detection is Intersection over Union (IoU, IU). It takes the set $A$ of proposed object pixels and the set $B$ of true object pixels and calculates:

$$\textrm{IoU}(A, B) = \frac{|A \cap B|}{|A \cup B|}$$
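As a concrete illustration, the set-based IoU above can be sketched in a few lines of Python (the pixel coordinates below are made-up toy data):

```python
def iou(a, b):
    """Intersection over Union of two sets of pixel coordinates."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

# Two overlapping 2x2 boxes that share two pixels:
proposal = {(0, 0), (0, 1), (1, 0), (1, 1)}
truth    = {(0, 1), (1, 1), (0, 2), (1, 2)}
print(iou(proposal, truth))  # 2 shared pixels / 6 pixels in the union = 1/3
```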
Commonly, IoU > 0.5 means that it was a hit, otherwise it was a miss. For each class $c$, one can calculate the
- True Positive ($TP(c)$): a proposal was made for class $c$ and there actually was an object of class $c$
- False Positive ($FP(c)$): a proposal was made for class $c$, but there is no object of class $c$
- Average Precision for class $c$: $\frac{\#TP(c)}{\#TP(c) + \#FP(c)}$

The mAP (mean average precision) = $\frac{1}{|classes|}\sum_{c \in classes} \frac{\#TP(c)}{\#TP(c) + \#FP(c)}$
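Under the definitions above, AP and mAP could be sketched like this (the class names and counts are invented for illustration):

```python
def average_precision(tp, fp):
    """AP for one class as defined above: #TP / (#TP + #FP)."""
    return tp / (tp + fp)

def mean_average_precision(counts):
    """counts maps each class to its (TP, FP) pair; mAP is the mean of the per-class APs."""
    return sum(average_precision(tp, fp) for tp, fp in counts.values()) / len(counts)

# Hypothetical counts for two classes:
counts = {"cat": (8, 2), "dog": (6, 4)}
print(mean_average_precision(counts))  # (0.8 + 0.6) / 2
```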
If one wants better proposals, one can increase the IoU threshold from 0.5 to a higher value (up to 1.0, which would be perfect). One can denote this with mAP@$p$, where $p$ is the IoU threshold.
But what does mAP@[.5:.95] (as found in this paper) mean? The [.5:.95] part refers to a range of IoU thresholds, but how that range is combined into a single mAP is unclear to me.