a question about score value in PredictionsTextFile export #56

Closed
fcakyon opened this issue Jun 9, 2021 · 7 comments
Labels: help wanted (Extra attention is needed)

fcakyon (Contributor) commented Jun 9, 2021

Thanks for this great library 👍

I'm wondering why the prediction score of the detection is exported here as -1:

+ ",-1,-1,-1,-1"

joaqo (Collaborator) commented Jun 10, 2021

Thanks! @aguscas you may have some insight on this.

dekked added the help wanted label on Jun 10, 2021
aguscas (Collaborator) commented Jun 10, 2021

Hello there! We are not using the confidence score of the detections during the evaluations. If you set that score to anything greater than or equal to -1, the evaluation will return the same results; if you set it to something less than -1, that specific prediction will be ignored during the evaluation. In other words, it doesn't really matter which number you set that confidence value to, as long as it is greater than or equal to -1.
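
A minimal sketch of the rule described above (an illustration of the stated behavior, not norfair's actual implementation): a predicted detection takes part in the evaluation iff its confidence is greater than or equal to -1.

```python
# Illustration only: the inclusion rule as described in this thread.
def prediction_is_evaluated(confidence: float) -> bool:
    return confidence >= -1

assert prediction_is_evaluated(-1)      # the exported default: kept
assert prediction_is_evaluated(0.87)    # any value >= -1 gives the same result
assert not prediction_is_evaluated(-2)  # below -1: ignored during evaluation
```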

fcakyon (Contributor, Author) commented Jun 10, 2021

@aguscas thanks for your response!

In this comment you mentioned that when the score is 0, that detection will be ignored during evaluation: #42 (comment).

That got me a bit confused. Are detections with a score of 0 ignored, or are only detections with a score of -1 ignored?

aguscas (Collaborator) commented Jun 11, 2021

If you set the confidence to 0 in your ground truth files, then that object will be ignored. So there are two confidence values: one is the confidence value in your tracker output (which we set to -1), and the other is the confidence value in the ground truth labels (which is set to 1 when you want that object taken into account in the evaluation, or 0 otherwise).
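
To illustrate the two conventions side by side (field layouts assumed to follow the MOTChallenge format; the values are made up):

```python
# Tracker output row: frame,id,left,top,width,height,conf,x,y,z.
# conf is -1 here and is not used by the evaluation.
tracker_row = "1,3,100,50,40,80,-1,-1,-1,-1"

# Ground-truth rows: frame,id,left,top,width,height,conf,class,visibility.
# conf works as a flag: 1 = evaluate this object, 0 = ignore it.
gt_evaluated = "1,3,100,50,40,80,1,1,1.0"
gt_ignored   = "1,4,200,60,30,70,0,1,1.0"
```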

fcakyon (Contributor, Author) commented Jun 11, 2021

@aguscas thanks a lot, that clears everything up!

One last question: are there any more differences between the tracker output and the ground truth, or is this all I need to know when arranging ground truths and tracker outputs?

aguscas (Collaborator) commented Jun 11, 2021

I think everything else is exactly the same.

dekked (Member) commented Jun 11, 2021

Thanks @aguscas for the help, we definitely need to do a better job at documenting some parts of this :)

dekked closed this as completed on Jun 11, 2021