
Chroma transcription metrics #197

Open · craffel opened this issue Jun 1, 2016 · 7 comments

craffel (Owner) commented Jun 1, 2016

To cover all evaluation done in MIREX, we also need to add the ability to evaluate transcription annotations after mapping the pitch values to a chroma (single octave) scale, as discussed in #180.
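For context, the chroma mapping itself just folds each pitch onto a single reference octave. Below is a minimal sketch of the idea, assuming pitches are given in Hz; the name `fold_to_chroma` and the C4 reference are illustrative choices, not anything in mir_eval:

```python
import numpy as np

def fold_to_chroma(pitches_hz, ref_hz=261.626):
    """Fold pitches (Hz) onto one reference octave, discarding octave information.

    Illustrative helper only: 220 Hz (A3), 440 Hz (A4) and 880 Hz (A5)
    all map to the same chroma value.
    """
    pitches_hz = np.asarray(pitches_hz, dtype=float)
    # How many whole octaves each pitch sits above (or below) the reference
    octaves = np.floor(np.log2(pitches_hz / ref_hz))
    # Divide out those octaves so every pitch lands in [ref_hz, 2 * ref_hz)
    return pitches_hz / (2.0 ** octaves)

print(fold_to_chroma([220.0, 440.0, 880.0]))  # [440. 440. 440.]
```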

craffel added this to the 0.4 milestone on Jun 1, 2016
craffel mentioned this issue on Jul 28, 2016
craffel (Owner, Author) commented Aug 19, 2016

Punting to 0.5.

craffel modified the milestones: 0.5, 0.4 on Aug 19, 2016
chf2117 commented Jan 6, 2017

I'd like to take a stab at this. It seems like the way to do this is to add a flag in precision_recall_f1_overlap and match_notes?

ejhumphrey (Collaborator) commented

Not sure if it'll be of any help, but there's machinery in mir_eval.chord that might be useful.

craffel (Owner, Author) commented Jan 6, 2017

> I'd like to take a stab at this.

Cool, contributions welcome. @justinsalamon will know best what is necessary.

> It seems like the way to do this is to add a flag in precision_recall_f1_overlap and match_notes?

Seems that way to me too.

justinsalamon (Collaborator) commented
This should be relatively straightforward. The metrics (excluding the ones that only consider onsets or offsets) rely on match_notes, which computes pitch distances here and checks for pitch matches here. You'd have to add a flag which, if set, calls an alternative version of those lines that checks for matches in (cyclic) chroma space.
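A minimal sketch of that cyclic check, assuming (as the transcription metrics do) pitches in Hz and a tolerance in cents; the helper name `chroma_distance_cents` is illustrative and not part of mir_eval:

```python
import numpy as np

def chroma_distance_cents(ref_pitches_hz, est_pitches_hz):
    """Absolute pitch distance in cents, folded onto a single octave.

    Illustrative helper: two notes exactly an octave apart have distance 0,
    so an estimate with an octave error can still count as a pitch match.
    """
    # Signed distance in cents between estimated and reference pitches
    cents = 1200.0 * np.log2(np.asarray(est_pitches_hz, dtype=float) /
                             np.asarray(ref_pitches_hz, dtype=float))
    # Fold onto one octave, then take the shorter way around the pitch circle
    folded = np.mod(np.abs(cents), 1200.0)
    return np.minimum(folded, 1200.0 - folded)

# An estimate one octave up and ~17 cents sharp still falls within a
# 50-cent tolerance when distances are measured in chroma space.
print(chroma_distance_cents([220.0], [440.0 * 1.01]) < 50.0)  # [ True]
```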

chf2117 commented Jan 7, 2017

I've got a three-line solution that makes sense to me here. It still needs testing.

I don't know a good way to generate test data. Any advice, @justinsalamon?

justinsalamon (Collaborator) commented
@chf2117 that looks right to me (but it should be tested, of course). To verify that it really does what you expect, you need to add unit tests to test_transcription.py. The toy data for the unit tests is hard-coded here; if it covers all the chroma cases, great. If not, you should create more hard-coded data (but don't add to or change the existing data, as that will break the existing tests!).

Data for the regression tests lives here; there's no need to touch the est* and ref* files, but you'll have to update the output* files.

Finally, to check your output against the MIREX results, see this comment in the original transcription pull request.
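As a concrete starting point, a unit test along these lines could check that an octave-shifted note only matches when the new flag is set. The `chroma=True` keyword and the `[(0, 0)]` return value shown here are assumptions about how the flag and match_notes' output would look, not the final API:

```python
import numpy as np
import mir_eval.transcription

# One reference note and one estimated note, identical timing,
# pitches exactly an octave apart.
ref_intervals = np.array([[0.0, 1.0]])
ref_pitches = np.array([220.0])   # A3
est_intervals = np.array([[0.0, 1.0]])
est_pitches = np.array([440.0])   # A4

def test_match_notes_chroma():
    # Without chroma folding, an octave error exceeds the 50-cent tolerance,
    # so no note pairs are matched.
    assert mir_eval.transcription.match_notes(
        ref_intervals, ref_pitches, est_intervals, est_pitches) == []
    # With the hypothetical chroma flag, the pair should be matched.
    assert mir_eval.transcription.match_notes(
        ref_intervals, ref_pitches, est_intervals, est_pitches,
        chroma=True) == [(0, 0)]
```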
