
How do you calculate Mean average precision? #21

Closed
malarsaravanan1991 opened this issue Jun 11, 2020 · 1 comment

@malarsaravanan1991

Hi,
I have a question regarding the evaluation metric.
How do you calculate mean average precision for activity prediction? Is it the same as the Pascal VOC mAP evaluation metric, or some other technique?
I see in the code that only scores and labels are considered. What about the bounding boxes?

@JunweiLiang
Contributor

The activity prediction mAP is computed over each person; there is no bounding box prediction. Given the observations of a person, the model outputs a multi-class probability over all actions, which is compared with the ground-truth labels. See the problem formulation in Section 3 of the paper. mAP was originally proposed in the information retrieval field; see the definition here.
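For illustration, here is a minimal sketch of computing mAP as the mean of per-action average precision over per-person predictions. It assumes scikit-learn's `average_precision_score` (the helper name `activity_map` and the array shapes are illustrative, not the repository's actual evaluation code):

```python
import numpy as np
from sklearn.metrics import average_precision_score

def activity_map(scores, labels):
    """Hypothetical mAP helper, not the repo's code.

    scores: (num_persons, num_actions) predicted probabilities per person.
    labels: (num_persons, num_actions) binary ground-truth action labels.
    Returns the mean of per-action average precision, skipping actions
    with no positive examples.
    """
    aps = []
    for c in range(labels.shape[1]):
        if labels[:, c].sum() == 0:  # no positives for this action class
            continue
        aps.append(average_precision_score(labels[:, c], scores[:, c]))
    return float(np.mean(aps))

# Toy example: 3 persons, 4 action classes
scores = np.array([[0.9, 0.1, 0.3, 0.2],
                   [0.2, 0.8, 0.6, 0.1],
                   [0.1, 0.2, 0.7, 0.9]])
labels = np.array([[1, 0, 0, 0],
                   [0, 1, 1, 0],
                   [0, 0, 1, 1]])
print(activity_map(scores, labels))
```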
