
Sonify #91 (Merged)

@bmcfee merged 4 commits into master from sonify on Sep 2, 2016

Conversation

@bmcfee (Contributor) commented Dec 13, 2015

Implements #82



@bmcfee self-assigned this on Dec 13, 2015
@bmcfee added this to the 0.2.2 milestone on Dec 13, 2015
@bmcfee (Contributor Author) commented Dec 13, 2015

Note: pitch sonification is broken because mir_eval doesn't support overlapping intervals (or explicit durations).

@bmcfee (Contributor Author) commented Jan 30, 2016

Question for the crew: do we want to stall this until we can arrive at a good solution for pitch sonification? Or just merge as is?

If the latter, can someone please CR this? (Looking at you, @justinsalamon :))

It shouldn't take long, as it's mainly just glue code that maps namespaces to mir_eval functions.


freqs = 440.0 * (2.0 ** ((np.arange(128) - 69.0) / 12.0))  # MIDI note numbers 0-127 -> Hz (A440 tuning)
gram = np.zeros((len(freqs), len(notes)))
Contributor (review comment):

this assumes non-overlapping notes; it will break for overlapping notes

Contributor Author (review comment):

!resolved

@justinsalamon (Contributor):

@bmcfee I wouldn't merge as is. If we remove the pitch_midi method and the mapping from SONIFY_MAPPING we could merge the rest, which seems fine to me.

@justinsalamon (Contributor):

Based on how the time_frequency sonification method is written in mir_eval, a solution for sonifying pitch_midi would be to generate a gram with a higher temporal resolution and fill in activations for each note based on its start time and duration. Could perhaps try to avoid unnecessary sparsity by rounding all start/end times to match a manageable hop size (e.g. 10 ms?).
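A rough sketch of that idea (the function name and the 10 ms default hop are illustrative, not code from this PR):

import numpy as np

def notes_to_gram(intervals, pitches, hop=0.01):
    # intervals: (n, 2) array of [start, end] note times in seconds
    # pitches:   (n,) array of note frequencies in Hz
    n_frames = int(np.ceil(intervals.max() / hop))
    freqs = np.unique(pitches)
    gram = np.zeros((len(freqs), n_frames))
    for (start, end), f in zip(intervals, pitches):
        row = np.searchsorted(freqs, f)
        # Round onsets/offsets to the hop grid to limit sparsity
        gram[row, int(round(start / hop)):int(round(end / hop))] = 1.0
    # Frame times, in the form mir_eval.sonify.time_frequency expects
    times = np.arange(n_frames) * hop
    return gram, freqs, times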

@bmcfee (Contributor Author) commented Feb 3, 2016

@bmcfee I wouldn't merge as is. If we remove the pitch_midi method and the mapping from SONIFY_MAPPING we could merge the rest, which seems fine to me.

That was my feeling as well.

Alternately, if someone wants to PR mir_eval to fix the pitch rendering, we could just stall merging this one until the next revision.

@craffel thoughts?

@craffel commented Feb 3, 2016

What exactly is your gripe with time_frequency? Can you make a PR?

@bmcfee (Contributor Author) commented Feb 3, 2016

What exactly is your gripe with time_frequency? Can you make a PR?

Essentially that intervals are implicit, rather than explicit, so you can't easily specify duration.

I can do the PR if nobody else wants to.

@craffel commented Feb 3, 2016

I'm not sure what you mean, so maybe it'd be better if you did it.

@justinsalamon (Contributor):

Am I wrong in understanding that time_frequency in mir_eval was written with spectrograms in mind, not symbolic representations? Perhaps it's just not the right target method for sonifying MIDI-style note sequences?

@craffel commented Feb 3, 2016

Am I wrong in understanding that time_frequency in mir_eval was written with spectrograms in mind, not symbolic representations? Perhaps it's just not the right target method for sonifying MIDI-style note sequences?

It should work fine either way.

@bmcfee (Contributor Author) commented Feb 3, 2016

I'm not sure what you mean, so maybe it'd be better if you did it.

ok.

Am I wrong in understanding that time_frequency in mir_eval was written with spectrograms in mind, not symbolic representations? Perhaps it's just not the right target method for sonifying MIDI-style note sequences?

I think it's more that time_frequency is meant for sonifying a single pitch trajectory, since the duration of one frequency is implicitly coded by the start time of the next.

It's pretty trivial to fix this so that it accepts a [start, end] array instead of onset times, and does the right thing with overlap and add.
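For illustration, the explicit-interval overlap-add amounts to something like this (a sketch of the idea, not the actual mir_eval patch):

import numpy as np

def time_frequency_explicit(gram, frequencies, intervals, fs=22050, length=None):
    # gram: (n_freqs, n_intervals) magnitudes; intervals: (n_intervals, 2) seconds
    if length is None:
        length = int(np.max(intervals) * fs)
    output = np.zeros(length)
    bounds = (intervals * fs).astype(int)
    for row, freq in enumerate(frequencies):
        for col, (start, end) in enumerate(bounds):
            if gram[row, col] <= 0 or start >= length:
                continue
            t = np.arange(min(end, length) - start)
            # Overlapping notes simply sum into the buffer (overlap-add)
            output[start:start + len(t)] += gram[row, col] * np.sin(2 * np.pi * freq * t / fs)
    return output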

@craffel commented Feb 3, 2016

It's pretty trivial to fix this so that it accepts a [start, end] array instead of onset times, and does the right thing with overlap and add.

Can you do it without breaking current functionality? We can discuss in the hypothetical PR.

@bmcfee (Contributor Author) commented Feb 11, 2016

Tests are done and this should be ready to merge as soon as the next mir_eval release is out.

@bmcfee (Contributor Author) commented Mar 1, 2016

@justinsalamon care to sign off on this one as well?

@bmcfee (Contributor Author) commented Apr 21, 2016

@justinsalamon I'll trade you a mir_eval CR for a jams CR here....

@justinsalamon (Contributor):

@bmcfee sounds like a fair trade :) I'll give it a look this afternoon

**kwargs)


def pitch_hz(annotation, sr=22050, length=None, **kwargs):
Contributor (review comment):

The following code will cause a crash, though I think the problem is with the mir_eval code (this line), not the jams wrapper.

jam = jams.load('/Users/justin/dev/jams-data/datasets/MIREX05/train05REF.jams')
ann = jam.annotations[0]
jams.sonify.pitch_hz(ann)

There's a division by the frequency value, but since the convention is to indicate silence (no pitch) by a non-positive value, it breaks. Error:

---------------------------------------------------------------------------
ZeroDivisionError                         Traceback (most recent call last)
<ipython-input-9-ceb7518fe118> in <module>()
----> 1 jams.sonify.pitch_hz(ann)

/Users/justin/Documents/dev/jams/jams/sonify.pyc in pitch_hz(annotation, sr, length, **kwargs)
     61                          gram, freqs, intervals,
     62                          fs=sr, length=length,
---> 63                          **kwargs)
     64 
     65 

/Users/justin/Documents/dev/mir_eval/mir_eval/util.pyc in filter_kwargs(_function, *args, **kwargs)
    741             filtered_kwargs[kwarg] = value
    742     # Call the function with the supplied args and the filtered kwarg dict
--> 743     return _function(*args, **filtered_kwargs)
    744 
    745 

/Users/justin/Documents/dev/mir_eval/mir_eval/sonify.pyc in time_frequency(gram, frequencies, times, fs, function, length)
    127     for n, frequency in enumerate(frequencies):
    128         # Get a waveform of length samples at this frequency
--> 129         wave = _fast_synthesize(frequency)
    130         # Scale each time interval by the piano roll magnitude
    131         for m, (start, end) in enumerate((times * fs).astype(int)):

/Users/justin/Documents/dev/mir_eval/mir_eval/sonify.pyc in _fast_synthesize(frequency)
    107 
    108         # Generate ten periods at this frequency
--> 109         n_samples = int(10.0 * fs / frequency)
    110 
    111         short_signal = function(2.0 * np.pi * np.arange(n_samples) *

ZeroDivisionError: float division by zero

Contributor Author (review comment):

Ugh.

I guess this should be fixed within jams, not mir_eval, since mir_eval doesn't claim to support negative/zero-f inputs.

Since I don't have your data handy, can you paste a from-scratch example that triggers this?
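A minimal from-scratch reproduction would look something like this (an untested sketch, assuming the pitch_hz namespace; the zero-frequency observation stands in for an unvoiced frame):

import jams

ann = jams.Annotation(namespace='pitch_hz')
ann.append(time=0.0, duration=0.0, value=220.0)  # voiced frame
ann.append(time=0.5, duration=0.0, value=0.0)    # unvoiced frame: f=0 hits the division
jams.sonify.pitch_hz(ann)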

Contributor (review comment):

I guess this should be fixed within jams, not mir_eval, since mir_eval doesn't claim to support negative/zero-f inputs.

By "support" do you mean in general or the sonify module? mir_eval does support negative/zero frequency input for evaluation (e.g. the melody module).

Since I don't have your data handy, can you paste a from-scratch example that triggers this?

You do, it's in the jams-data repo ;)

Contributor Author (review comment):

By "support" do you mean in general or the sonify module?

In sonify.

You do, it's in the jams-data repo ;)

What I meant was: I'm lazy and don't want to clone an entire repo just to cook up what should be a one-liner test case. 😞


@justinsalamon (Contributor):

I'd vote for pitch_* and notes_* then. I think that's pretty straightforward without being terribly confusing.

No no, you can't have pitch and note as two separate things... a note is a representation of pitch! The options I'm proposing are either note_* and contour_* or pitch_note_* and pitch_contour_*.

I'm fine with deprecating things, but we need to give folks a runway to transition their data within a functioning environment. My usual policy is to deprecate features within minor revisions -- with warnings to that effect -- and remove them across major revisions.

That sounds fine to me, as long as we eventually kill pitch_hz altogether :)

@bmcfee (Contributor Author) commented May 25, 2016

Circling back on this, since #111 suggests that we're going to deprecate pitch_* in favor of more specific namespaces.

I think the piano roll sonification that we currently have will work fine for note_* annotations, but not contour_* annotations. As @justinsalamon mentioned previously, we could roll our own sonifier for continuous pitch annotations (and/or steal from melosynth). The question I have is: does this constitute bloat? I suspect it might make better sense to PR melosynth into mir_eval, assuming that @craffel would find it useful.

Implementing it within jams seems a little too special-case to me, and borders on bloat. What do y'all think, @justinsalamon?

@justinsalamon (Contributor) commented May 25, 2016

I guess it's a philosophical question: where should sonification and visualization code live, mir_eval or JAMS?

If you ask me, I actually think both should be part of JAMS, since it's designed for annotation generation/manipulation/exploration, whereas mir_eval is focused on annotation comparison. Of course, one could argue that evaluation by audition/visualization is relevant to mir_eval. I can also imagine other valid arguments for including this functionality in mir_eval, an obvious one being that if everything lives in JAMS, people using mir_eval with different annotation formats can't take advantage of these tools.

For historical reasons right now visualization lives inside JAMS and sonification lives inside mir_eval, though I think that's pretty arbitrary.

@craffel commented May 25, 2016

Of course one could argue that evaluation by audition/visualization is relevant to mir_eval.

That's why it was added.

I can also imagine other valid arguments for including this functionality in mir_eval, an obvious one being that if everything lives in JAMS people using mir_eval with different annotation formats can't take advantage of these tools.

This is important. Whatever the visualization/sonification tool is, it should (IMO) be storage format-independent.

For historical reasons right now visualization lives inside JAMS and sonification lives inside mir_eval, though I think that's pretty arbitrary.

Visualization was always intended to be part of mir_eval, but no one has gotten around to adding it.

@justinsalamon (Contributor):

Visualization was always intended to be part of mir_eval, but no one has gotten around to adding it.

Yes, but that was before JAMS was a thing (I think). Now you have 2 separate libraries, one for annotation wrangling and another for annotation comparison. Both visualization and sonification fit naturally into either.

Come to think of it, I'm actually not sure whether viz was always part of the plan for mir_eval: I wrote basic viz for melody extraction eval into the original mir_eval melody module, but it was canned.

@craffel commented May 25, 2016

It's been part of the plan for 2 years, at least: craffel/mir_eval#28

@justinsalamon (Contributor):

It's been part of the plan for 2 years, at least: craffel/mir_eval#28

Haha ok cool. Still, we need to reach consensus as to where sonification and visualization go. I don't think it makes sense to have either spread across both libraries.

@bmcfee (Contributor Author) commented May 25, 2016

Still, we need to reach consensus as to where sonification and visualization go. I don't think it makes sense to have either spread across both libraries.

In both cases, I think it makes sense to have core routines in mir_eval that do the heavy lifting, and wrappers in jams that handle conversion and maybe metadata decoration (in the case of viz).

For the purposes of this discussion though, @craffel : how do you feel about adding a continuous pitch sonification routine alongside piano roll and clicks? It would make our lives easier over here.

@craffel commented May 25, 2016

Haha ok cool. Still, we need to reach consensus as to where sonification and visualization go. I don't think it makes sense to have either spread across both libraries.

Well, like I said, as long as the visualizers are not tied to a specific data storage format, it's fine either way.

In both cases, I think it makes sense to have core routines in mir_eval that do the heavy lifting, and wrappers in jams that handle conversion and maybe metadata decoration (in the case of viz).

This seems to make sense to me.

For the purposes of this discussion though, @craffel : how do you feel about adding a continuous pitch sonification routine alongside piano roll and clicks? It would make our lives easier over here.

Sure, that would be great to have. Realistically I won't do it anytime in the foreseeable future.

@bmcfee (Contributor Author) commented May 25, 2016

Realistically I won't do it anytime in the foreseeable future.

understood; I'll try to find time for it soonish.

@bmcfee (Contributor Author) commented Jun 8, 2016

Note to self: this PR is stalled by the following:

  • pitch namespace revisions
  • mir_eval 0.4 release

@bmcfee changed the title from Sonify to [WIP] Sonify on Jun 8, 2016
@bmcfee (Contributor Author) commented Jul 19, 2016

Holding off on moving this guy forward until after display merges. There are redundant changes there for ns conversion on the new pitch namespaces that we'll need here.

@bmcfee (Contributor Author) commented Aug 29, 2016

Okay, I think this one's good to go now.

NB: the semantics for pitch are only, say, 99% obvious.

We support two styles of pitch sonification: piano roll and contour. Piano roll has higher priority, so if you have an annotation in the old pitch contour namespace(s), it will be sonified by piano roll. This is because all pitch annotations can be coerced into a contour set, but not all can be coerced into a piano roll (e.g., note_hz).

Practically, this means that only annotations of type pitch_contour will be sonified using contours. I think this is the expected behavior, but I wanted to make it 100% clear in the comments here.
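The first-match-wins dispatch can be sketched roughly as follows. This is illustrative only: the stub sonifiers and the exact SONIFY_MAPPING entries are assumptions, not the module's real contents, though the ordered-dict approach matches the "orderddict for sonify_mapping" commit above.

from collections import OrderedDict

import jams

# Stand-ins for the module's actual sonifier functions (names assumed)
def piano_roll(annotation, sr=22050, **kwargs):
    ...  # render note intervals via mir_eval.sonify.time_frequency

def pitch_contour(annotation, sr=22050, **kwargs):
    ...  # render a continuous f0 trajectory

# Targets are tried in order; the first namespace the annotation can be
# converted to wins, so piano roll outranks contour sonification.
SONIFY_MAPPING = OrderedDict([('note_hz', piano_roll),
                              ('pitch_contour', pitch_contour)])

def sonify(annotation, sr=22050, **kwargs):
    for target, sonifier in SONIFY_MAPPING.items():
        try:
            converted = jams.nsconvert.convert(annotation, target)
            return sonifier(converted, sr=sr, **kwargs)
        except jams.NamespaceError:
            continue
    raise jams.NamespaceError('No sonifier available for this namespace')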

@bmcfee changed the title from [WIP] Sonify to Sonify on Aug 29, 2016
@bmcfee (Contributor Author) commented Aug 30, 2016

Reviewed 1 of 4 files at r6.
Review status: 1 of 5 files reviewed at latest revision, all discussions resolved.



@bmcfee (Contributor Author) commented Aug 30, 2016

This is basically done, modulo two things:

  • beat_position -> beats+downbeat sonification with different click frequencies
  • multi_segment -> similarly

We could do this, but because of how mir_eval.sonify.clicks works, we'd have to synthesize our own click samples for different frequencies. Is this worth the effort?
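For scale, synthesizing a pitched click is only a few lines, e.g. an exponentially decaying sinusoid. This is a sketch of the general idea, not mir_eval's implementation; it assumes the click= argument of mir_eval.sonify.clicks and made-up beat times:

import numpy as np
import mir_eval

def click(freq=1000.0, sr=22050, duration=0.1, decay=0.01):
    # Exponentially decaying sinusoid, usable as a metronome click sample
    t = np.arange(int(sr * duration)) / sr
    return np.sin(2 * np.pi * freq * t) * np.exp(-t / decay)

# One click sample per metronome voice, e.g. a higher pitch on downbeats
downbeats = np.array([0.0, 2.0])
other_beats = np.array([0.5, 1.0, 1.5, 2.5, 3.0, 3.5])
length = 4 * 22050  # common length so the two voices can be summed
audio = (mir_eval.sonify.clicks(downbeats, 22050, click=click(1500.0), length=length)
         + mir_eval.sonify.clicks(other_beats, 22050, click=click(1000.0), length=length))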

@bmcfee (Contributor Author) commented Aug 30, 2016

downbeats (beat_position) and multi_segment are now in.

I think this covers everything except for the ones that don't make sense to sonify: key, tag_*, tempo, etc.

Anyone care for a final CR?

Commits:

  • updated sonify module
  • added pitch
  • use mir_eval kwarg filtering on sonify functions
  • added pitch_hz sonifier
  • adding unit tests for sonify
  • added ns-specific tests for sonify
  • fixing bugs in sonification
  • orderddict for sonify_mapping, handle non-positive pitches
  • added empty signal handling in pitch_hz
  • simplified pitch_hz logic
  • improved test coverage on pitch_hz sonify
  • fixed instantaneous pitch contour generation
  • fixed a broken test case in pitch_hz
  • pitch_hz sonification via mir_eval pitch_contour
  • fixed unit tests for piano roll sonification
  • added a contour sonification test
  • fixed coveralls
  • fixed multi-contour sonification
  • fixed a basis error in multi-contour plotting
  • added downbeat sonification
  • modified downbeat metronome pitches
  • fixed a length issue in pitch contour sonification
  • expanded contour test
@bmcfee (Contributor Author) commented Sep 2, 2016

Reviewed 1 of 2 files at r1, 1 of 4 files at r6, 2 of 2 files at r7.
Review status: all files reviewed at latest revision, all discussions resolved.



@bmcfee merged commit 0868430 into master on Sep 2, 2016
@bmcfee deleted the sonify branch on Sep 2, 2016