Scroll a view into certain position given as Measure/Beat #166

Closed
cecilios opened this issue Jun 7, 2018 · 26 comments · Fixed by #174

Comments


cecilios commented Jun 7, 2018

The current API does not allow scrolling a view to a position given as measure/beat information. The need for this feature was identified in issue #163.

As lomse documents can contain not just a single score but full documents with texts, lists, images, many scores, etc., this feature will be restricted to documents containing only one score, such as imported MusicXML files.

@cecilios

@hugbug - I have started to design the changes for adding this functionality.

My initial idea was to add a new method Interactor::scroll_to_measure(int measure, int beat=1), so scrolling to measure 15, beat 2, would be:
scroll_to_measure(15, 2).
But using 'beats' is more complex than other approaches for describing 'measure location', as a 'beat' is not an absolute, independent unit: it depends on the applicable time signature.

There are a variety of situations, beyond displaying a visual cursor or scrolling, in which the measure location of a musical event needs to be described in terms of the content of the measure. The simplest and most convenient method for describing measure location is to express it as a time position, measured in note durations after the start of the containing measure. For instance, it could be specified as a floating-point number giving the number of whole notes from the start of the measure. Examples:

0.25 (=1/4) 	- one quarter note after the start of the containing measure
0.375 (=3/8)	- three eighth notes after the start of the containing measure

Another alternative would be to express the location in lomse time units (TU). For instance, in Lomse the duration of a whole note is 256 TU. The above examples would then be:

64 (= 0.25 * 256)	- one quarter note after the start of the containing measure
96 (= 0.375 * 256)	- three eighth notes after the start of the containing measure

When using measure locators in time units as described above, the beats in a 4/4 time signature would be at locations 0, 64, 128, and 192. Expressing location in time units has the advantage of being a general measure locator for other purposes, allowing, for instance, positioning a cursor at any note, not only on beat notes, or precisely positioning an internal cursor at any place in the measure. For scrolling to a given location, a new method Interactor::scroll_to_location(int measure, int location) would be used. So scrolling to measure 15, beat 2, assuming the applicable time signature is 4/4, would be:
scroll_to_location(15, 64)
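The beat-to-location conversion described above can be sketched in a few lines. This is only an illustration: the function and constant names are mine, not part of the Lomse API; the only fact taken from this discussion is that a whole note lasts 256 TU.

```cpp
// Sketch: convert a 1-based beat number to a location in Lomse time units
// (TU), assuming a simple time signature whose beat is the bottom-number
// note. beat_to_location and k_duration_whole are illustrative names.
const int k_duration_whole = 256;   // whole note = 256 TU (Lomse convention)

int beat_to_location(int beat, int ts_bottom)
{
    int beat_duration = k_duration_whole / ts_bottom;  // e.g. /4 -> 64 TU
    return (beat - 1) * beat_duration;                 // beat 1 -> location 0
}
```

With such a helper, scrolling to measure 15, beat 2 in 4/4 becomes scroll_to_location(15, beat_to_location(2, 4)), i.e. scroll_to_location(15, 64).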

But using beats as a locator would require determining the applicable time signature for the desired measure and converting 'beat' to 'location'. Although Lomse currently performs these computations, the complexity comes from the objects involved in doing them. In the current Lomse code, the knowledge for determining beats is associated with the ScorePlayer and related objects, and these computations are only performed before playback, when the score is loaded into the ScorePlayer object.

Adding a new method Interactor::scroll_to_measure(int measure, int beat=1), to be used when an external player is involved, would imply that all the computations for converting beats to measure locations must be done before this new scroll method can be used. And this requires moving the knowledge out of ScorePlayer and related objects to another place, and always performing the beat computations. It is not complex, but it requires a lot of refactoring. I don't mind doing it if it is really necessary, but before proceeding I would like to know whether your application really needs to use beats, or whether you could manage using locations in time units.


hugbug commented Jun 14, 2018

Thank you for the explanation.

Interactor::scroll_to_location(int measure, int location)

I like this idea.

In my app I'm completely dependent on external hardware: a Yamaha digital piano. It provides playback position information as measure/beat. I would need to know the time signature of a specific measure to compute the location. My app uses MusicXML files as input. The files contain the time signatures of the measures, which I should be able to use for the computation. I would need to parse the MusicXML, but at first sight that seems easy.

Or maybe I could use ScorePlayer as a tool to determine the time signature of the desired measure?

I've attached an example MusicXML-file which has time signature changes in the middle. I'll be testing with it and you may find it useful too.

Scott.Joplin-The.Entertainer.(David.Bruce).xml.zip


hugbug commented Jun 14, 2018

Will scroll_to_location also highlight the position or will there be a second function to set highlight width?

In the following example (taken from Yamaha Smart Pianist App) two notes belonging to one beat are highlighted:
(screenshot: two highlighted notes in Smart Pianist)

Would that be possible in Lomse too? Maybe scroll_to_location needs a third parameter to set the highlight width?

Anyway, that would be a luxury feature which I can live without.

@cecilios

My current idea is that all parameters (color, height and width) will be customizable.
The maximum height will be the system height (this includes a lot of blank space at the top and bottom of the staves). You would customize this by specifying a percentage of the maximum height.
For width, you would specify a note duration. For instance, in a score with a 4/4 time signature, if you would like to highlight a full beat you would specify 64 TU (a quarter note).

The above approach is independent of the final approach for specifying the cursor position (beats or time units). The approach using time units, move_cursor_to(measure, location), will be implemented anyway. If beats are also required, a second method move_cursor_to(measure, beat) will be implemented; internally, it will convert beats to time units and invoke the first method. In this case, it would also be possible to specify the cursor width in beats (i.e. 1 beat, 1.5 beats, etc.).

As I write this, I realize that my first idea of specifying the cursor width in time units is not good: for highlighting beats, the width (in time units) depends on the beat duration. If there are time signature changes in the score, highlighting beats will require dynamically adapting the cursor width to the current time signature. So:

  • as you suggest, a third parameter for moving the cursor is needed, or
  • dynamically adapting the cursor width requires that lomse maintain a beats table (in which case, implementing move_cursor_to(measure, beat) is not a problem).

Ok, thank you. Question answered. I think it is better if lomse is able to deal with beats.
I will do this in two steps. First, I will implement scroll and cursor using location (time units). Once that is operational, I will refactor the beat knowledge and add the secondary methods using beats.


cecilios commented Jul 4, 2018

@juuk100 The latest PR fixes this. The API documentation has been updated, and a brief text explaining how to use these new methods has been included. See https://lenmus.github.io/lomse/page-sound-generation.html#sound-generation-external-player


hugbug commented Jul 4, 2018

Thank you very much, great work!
I'm going to integrate this into my app in the next few days and will let you know how it worked for me.


hugbug commented Jul 5, 2018

I've integrated the new Lomse version into my app.

It works amazingly well!

And because you did all the heavy lifting by implementing native measure/beat support, all I needed was to add a couple of lines to glue things together.

(very happy)


So far I've discovered a small glitch. I'm testing with the MusicXML file attached above. Most measures have a 4/4 signature, but there are a few with 5/8 and 3/8.

Measure 21 has a 3/8 signature. The tempo line isn't correct for beats 2 and 3.

NOTE: here measure and beat numbers are counted from 1.

Measure 21, Beat 1 - displayed correctly:

(screenshot)

Measure 21, Beat 2 - incorrectly displayed at the beginning of measure 22:

(screenshot)

Measure 21, Beat 3 - incorrectly displayed in the middle of measure 22:

(screenshot)

Measure 22, Beat 1 (displayed correctly):

(screenshot)


Tempo line color

To change the tempo line color I needed to subclass GraphicView to access the protected member m_pTempoLine. That feels hacky. Maybe a new method in SpInteractor, similar to the existing set_view_background, would be more appropriate?


cecilios commented Jul 6, 2018

Measure 21, Beat 2 - incorrectly displayed at the beginning of measure 22:
Measure 21, Beat 3 - incorrectly displayed in the middle of measure 22:

I think the tempo line is positioned correctly. A 3/8 time signature has only one beat per measure, so after measure 21 beat 1 the next beat is measure 22 beat 1. Therefore, location m21/b2 is three eighth notes after m21/b1, so it is equivalent to m22/b1; and location m21/b3 is six eighth notes after m21/b1, so it is equivalent to the place displayed in the image, in the middle of m22/b2.
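The equivalences above can be checked with a little arithmetic in Lomse time units (an eighth note is 256/8 = 32 TU; the constant names below are mine, for illustration only):

```cpp
// Worked check: with 3/8 taken as one beat per measure, the implied beat of
// measure 21 lasts the whole measure (3 eighth notes).
const int k_eighth = 32;                  // eighth note = 256/8 TU
const int k_m21_length = 3 * k_eighth;    // 3/8 measure: 96 TU
const int k_m21_b2 = 1 * k_m21_length;    // m21/b2: 96 TU after m21/b1
const int k_m21_b3 = 2 * k_m21_length;    // m21/b3: 192 TU after m21/b1
// 96 TU past m21/b1 is exactly the start of measure 22 (m22/b1); 192 TU is
// 96 TU into the 4/4 measure 22, i.e. the middle of its second beat.
```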

Are locations m21/b2 and m21/b3 provided by the Yamaha piano, or are they computed by your app? If it is the Yamaha piano, I would report a bug to Yamaha. Is it possible for you to check the behaviour of Smart Pianist with this score?

Measure 22, Beat 1 (displayed correctly):

I think the tempo line should be on the note, not on the 4/4 time signature. This is caused by the bug addressed in issue #173. I hope to fix that issue soon.

As to how to customize the visual effects, I totally agree with you that methods in Interactor must be provided. I'll do it.


cecilios commented Jul 6, 2018

Thinking about this, as the concept of beat is tricky, perhaps the Yamaha piano is using the bottom number of the time signature as the beat unit. So any x/8 time signature would use an eighth note as the beat. The problem with this approach is that metronome beats will not coincide with Yamaha MIDI beats. Can you test the behaviour of the Yamaha piano when its metronome is switched on during playback?


hugbug commented Jul 6, 2018

Thank you for looking into this.

Are locations m21/b2 and m21/b3 provided by Yamaha piano or are computed by your app? If it is Yamaha piano, I would report a bug to Yamaha.

The locations are reported by the piano, in the form of MIDI messages. These messages are not documented anywhere; I had to reverse engineer them.


I've just made a test of this song in Smart Pianist app, in measures 20 and 21 (counting from 1).

Measure 20

Measure 20 has a 5/8 signature and the piano generates 5 positional events with measure/beat from 20/1 to 20/5. This is how the Smart Pianist app displays the score for these 5 positions:
(five screenshots)

Measure 21

Measure 21 has a 3/8 signature and the piano generates 3 positional events with measure/beat from 21/1 to 21/3. This is how the Smart Pianist app displays the score for these 3 positions:
(three screenshots)

Measure 22

Measure 22 has a 4/4 signature and the piano generates 4 positional events with measure/beat from 22/1 to 22/4. This is how the Smart Pianist app displays the score for these 4 positions:
(NOTE: Lomse displays the tempo line at the same positions for this measure; no issues here. I'm posting these screens merely for comparison.)
(four screenshots)


Can you test the behaviour of Yamaha piano when the Yamaha piano metronome is switched on while playing back?

The metronome clicks 5 times in measure 20 (5/8), 3 times in measure 21 (3/8), and 4 times in measure 22 (4/4). So the number of clicks corresponds to the number of events (and to the number of position changes in the app's score view, as the app reacts to piano events).

Would you say these are not proper beats? Do you think it's possible to support Yamaha-Beats in Lomse as well?


cecilios commented Jul 6, 2018

Thanks for the info. This reinforces my guess that the Yamaha piano is using the bottom number of the time signature as the beat unit.

I did a test with this score in MuseScore to see how they deal with the metronome, from measure 18 to measure 24, and they behave like the Yamaha piano. The metronome clicks on every quarter note until measure 19; then, in measure 20, it clicks on every eighth note, and it returns to clicking on every quarter note in measure 22. This is not 'normal' metronome behaviour, but it is probably the simplest way of dealing with rare time signatures such as 5/8. However, I do not understand MuseScore's metronome behaviour, because when opening a 6/8 score I would expect it to click every three eighth notes, or on every eighth note, but it clicks on every quarter note: three clicks per measure!

It would be nice if you could test scores using different time signatures to confirm that the Yamaha piano is using the bottom number of the time signature as the beat unit.

Would you say these are not proper beats?

The concept of beat is fuzzy, and it depends on the purpose. For playing, it depends on how the music is conducted or how you would like to practise (i.e. metronome clicks on quarter notes or on eighth notes, etc.); from a theoretical point of view, it depends on the time signature: 3/8 is just one beat per measure. There are 'tricky' time signatures, e.g. 5/8 can be counted as 5 eighths or as alternating sub-measures of 2/8+3/8 or 3/8+2/8 (two beats per measure). It is difficult to define what a beat is because there are many definitions, depending on the purpose.

Do you think it's possible to support Yamaha-Beats in Lomse as well?

Yes. But before proceeding, it would be nice if you could test scores using different time signatures to confirm that the Yamaha piano is using the note implied by the bottom number of the time signature as the beat. There is also the problem of music without a time signature (e.g. some very commonly studied Erik Satie piano pieces).

So the idea would be to allow the user application to define what a beat is before using methods requiring measure/beat parameters, or to add an additional parameter to these methods. I envisage, for now, the following options:

  • the note implied by the bottom number of the time signature (Yamaha-beats)
  • the beat implied by the time signature (current implementation)
  • a note value (e.g. a quarter note, a half note, a dotted quarter note, etc.), useful for scores without a time signature, or for practising
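A minimal sketch of how these options could map to a beat duration in time units; the enum and function below are illustrative only, not the eventual Lomse API:

```cpp
// Possible beat-definition policies, mirroring the three options above.
enum class BeatPolicy { ImpliedByTS, BottomNumber, FixedNoteValue };

const int k_whole = 256;    // whole note = 256 TU (Lomse convention)

// Beat duration in TU for a ts_top/ts_bottom time signature.
int beat_duration(BeatPolicy policy, int ts_top, int ts_bottom,
                  int fixed_tu = 64)
{
    const int note = k_whole / ts_bottom;   // bottom-number note duration
    switch (policy)
    {
        case BeatPolicy::BottomNumber:      // Yamaha-beats
            return note;
        case BeatPolicy::ImpliedByTS:       // beat implied by the signature
            if (ts_top == 6 || ts_top == 9 || ts_top == 12)
                return 3 * note;            // compound: dotted beats
            if (ts_top == 3 && ts_bottom == 8)
                return 3 * note;            // 3/8: one beat per measure
            return note;                    // simple signatures
        case BeatPolicy::FixedNoteValue:    // user-chosen note value
            return fixed_tu;
    }
    return fixed_tu;
}
```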

Other ideas, suggestions?


hugbug commented Jul 6, 2018

I'll gladly do the test on the piano.
I guess I need to compose a test song in MuseScore. Which signatures should I test?

If you happen to have a set of test files (MusicXML or MIDI or anything MuseScore can open), I could test them instead.


cecilios commented Jul 6, 2018

Perhaps some scores from the LilyPond test set would be enough:

TimeSignatures.zip


hugbug commented Jul 6, 2018

The results for file 11a-TimeSignatures on the Yamaha piano:

| Signature | Metronome clicks |
|-----------|------------------|
| 4/4       | 4                |
| 2/2       | 2                |
| 3/2       | 3                |
| 2/4       | 2                |
| 3/4       | 3                |
| 4/4       | 4                |
| 5/4       | 5                |
| 3/8       | 3                |
| 6/8       | 6                |
| 12/8      | 12               |

The number of metronome clicks is equal to the number of position events and to the number of cursor movements in Smart Pianist.


cecilios commented Jul 8, 2018

Thanks. This supports my guess that the Yamaha piano is using the note implied by the bottom number of the time signature as the beat.
Could you please also test the other scores I sent you? There are a few special cases whose Yamaha behaviour I would like to know. Thank you!


hugbug commented Jul 8, 2018

There is one issue that makes these piano tests not fully reliable.

Smart Pianist can read only midi-files, and it sends them as-is to the piano. The MIDI format, of course, doesn't have enough information to display scores properly; in particular it has neither measures nor time signatures.

Once a song (midi-file) is selected in Smart Pianist, the app sends the song to the piano and then shows the score. For this, the app has to transform the MIDI into a score. This can't be done with 100% certainty because some guesswork is needed.

To make life even more complex, the piano does its own job of decoding the midi-file and guessing time signatures. We can only hope that the piano and the app use the same algorithm to decode midi-files, but we don't know that for sure.

Therefore the time signatures displayed in Smart Pianist are not necessarily the same signatures that were in the initial MusicXML document (before it was exported to MIDI in MuseScore). In fact, if I import the MIDI back into MuseScore it sometimes shows different signatures. This happens especially for test files where the first measure isn't full.

So it seems it wasn't the best idea to use midi-files as the interchange format between app and piano. On the other hand, both the piano (all digital pianos, actually) and the app are marketed not for professionals but for hobby pianists and home use. With classical repertoire the app and piano actually do a pretty good job.

So please keep all this in mind when reading the following test results.

11c-TimeSignatures-CompoundSimple

| Signature in musicxml | Signature shown in Smart Pianist | Metronome clicks |
|-----------------------|----------------------------------|------------------|
| (3+2)/8               | 5/8                              | 5                |
| (5+3+1)/4             | 5/8                              | 9                |

So although the app didn't show the 9/4 time signature for the second measure, the piano detected it correctly and the metronome clicked 9 times. The app also moved the playback cursor 9 times. However, it didn't show a signature for the second measure, as if it had the same signature as the first measure.

11d-TimeSignatures-CompoundMultiple

| Signature in musicxml | Signature shown in Smart Pianist | Metronome clicks |
|-----------------------|----------------------------------|------------------|
| 3/4                   | 11/8                             | 11               |
| 1/8                   | 11/8                             | 21               |

The midi-file imported back into MuseScore shows time signature 21/8.

11e-TimeSignatures-CompoundMixed

| Signature in musicxml | Signature shown in Smart Pianist | Metronome clicks |
|-----------------------|----------------------------------|------------------|
| 3/4                   | 11/8                             | 2+11             |

The metronome bell clicked after the second beat click. It seems the piano detected two measures: an incomplete first measure with two clicks, and then a complete measure with 11 clicks.

11b-TimeSignatures-NoTime

| Signature in musicxml | Signature shown in Smart Pianist | Metronome clicks |
|-----------------------|----------------------------------|------------------|
| no signature          | 1/1                              | 2+1              |

Similar to the previous one.


I think you shouldn't pay much attention to these special cases, as we can't tell for sure what signatures the piano was dealing with in the first place.

Maybe the app and piano would detect signatures better if they were fed a proper song instead of a test file containing only one or two measures. It depends on how good the MIDI deciphering is.


cecilios commented Jul 8, 2018

Very illustrative and interesting results! Thank you.

MIDI conversion could be a problem but, as you said, probably not for the most common scores for hobby pianists. I don't clearly understand how your application works, as it seems two files are needed: the MusicXML and the MIDI. Does your application convert MusicXML to a MIDI file and send the MIDI file to the piano? Or is it the user who generates and handles the two files, loading the MIDI into the piano and the MusicXML into your app?

Anyway, to try to improve Lomse support, I would appreciate it if, when feasible and not too complex, you could check whether the Yamaha piano is sending MIDI Time Signature meta messages. See http://www.recordingblogs.com/wiki/midi-time-signature-meta-message
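For reference, in a Standard MIDI File the Time Signature meta message has the layout FF 58 04 nn dd cc bb, where nn is the numerator and dd is the denominator stored as a power of two. A small decoding sketch (struct and function names are mine, not from any library):

```cpp
#include <cstdint>

// Decoded top/bottom of a MIDI Time Signature meta event (FF 58 04 ...).
struct TimeSignature { int top; int bottom; };

// 'data' points at the 4 payload bytes following the FF 58 04 header.
TimeSignature decode_time_signature(const uint8_t* data)
{
    TimeSignature ts;
    ts.top = data[0];            // numerator, e.g. 6 for 6/8
    ts.bottom = 1 << data[1];    // denominator as a power of two: 3 -> 8
    // data[2] (MIDI clocks per metronome click) and data[3] (32nd notes
    // per quarter note) are ignored in this sketch.
    return ts;
}
```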


hugbug commented Jul 8, 2018

In my app the user selects a midi-file and the app sends it to the piano. Then the app looks for a file with the same name but with a .musicxml or .xml extension and loads it into the Lomse score viewer. The user is responsible for providing both files (.midi and .musicxml).

Initially I wanted to require only midi-files (similar to Smart Pianist), and I needed a way to render scores from them. However, deciphering midi-files into scores seems to be a very difficult job. I've found that MuseScore's midi-importer produces results very similar to Smart Pianist. I thought about reusing MuseScore's midi-importer in my app, but it is a large module and depends heavily on Qt, which I don't use.

Currently I'm planning to stick with the current approach (the user provides both files), as it allows me to focus on app-related things. Writing a midi-importer (even reusing the one from MuseScore) is a project of its own.


Very interesting about the time signature event. There is a way to embed time signatures into a midi-file after all!

I wonder if the piano would use that information if it were provided in the midi-file. I'll construct a midi-file with signatures and test how the piano reacts.

I did a lot of analysis of the midi events sent by the piano but can't remember this particular event. I'll test again to make sure and write back.


cecilios commented Jul 8, 2018

I did a lot of analysis of midi events sent by piano.... I'll test again to make sure about this and write back.

Thank you. If you discover that the piano is sending time signature events, then the definition of beat will be there and you could inform lomse whenever the beat definition changes. This would solve the problem for all time signature cases!

Currently I'm planning to stick with current approach (user has to provide both files) as it allows me to focus on app related things.

Another alternative would be that in your application the user selects a MusicXML file, and your app generates the midi file and sends it to the piano. This approach would give you full control of the beat definition for time signatures. And the conversion from MusicXML to a midi file could be done by lomse. It should not be difficult, as midi events are already generated; it would just be a matter of writing the events table to a file in the appropriate format, plus some additional information. This is pure C++ and something I had on my TODO list, but it is very low priority, as there are more important and urgent tasks for me. Perhaps you would be interested in contributing to this; I would help you. Probably the main job is studying and understanding the midi file format. The lomse MIDI exporter could be something for the future, for version 2.0 of your application. This would remove the burden on the user of providing both files, although I would keep that possibility, as it is included in version 1.0 of your application.

In any case, I have started to code the changes to allow lomse user applications to define what a beat is. This will solve most cases without changing your application or your current approach.


hugbug commented Jul 8, 2018

I did the testing. The piano doesn't send time signature events.

But there is another interesting thing. I've inspected midi-files created by MuseScore from musicxml-files. These midi-files do contain time signature meta events!

I made a test and removed these events from a midi-file. I used the 11a-TimeSignatures file, which contains many different time signatures and which worked perfectly in Smart Pianist and with the piano metronome. After removing the events, Smart Pianist was not able to display a proper score. The piano played the song correctly (the note on/off events are still there, after all), but the metronome didn't work properly.

That's a nice finding! Both Smart Pianist and the piano use the time signatures from midi-files. And not only use them: they rely heavily on them.


hugbug commented Jul 8, 2018

I've checked the time signature meta events in other files.

Below are the same tables as above, but with a new column, Signature in midi-file.

11c-TimeSignatures-CompoundSimple

| Signature in musicxml | Signature shown in Smart Pianist | Metronome clicks | Signature in midi-file |
|-----------------------|----------------------------------|------------------|------------------------|
| (3+2)/8               | 5/8                              | 5                | 5/8                    |
| (5+3+1)/4             | 5/8                              | 9                | 9/8                    |

So the piano (metronome) uses the time signature from the midi meta event. It's strange, however, that Smart Pianist doesn't show the proper signature for the second measure (it shows no time signature on the second measure, as if it were unchanged; therefore I've written 5/8 in the table).

11d-TimeSignatures-CompoundMultiple

| Signature in musicxml | Signature shown in Smart Pianist | Metronome clicks | Signature in midi-file |
|-----------------------|----------------------------------|------------------|------------------------|
| 3/4                   | 11/8                             | 11               | 11/8                   |
| 1/8                   | 11/8                             | 21               | 21/8                   |

So the piano (metronome) uses the time signature from the midi meta event. It seems MuseScore isn't exporting it correctly, so this is not a fault of the piano. Smart Pianist again doesn't show the signature change on the second measure.

11e-TimeSignatures-CompoundMixed

| Signature in musicxml | Signature shown in Smart Pianist | Metronome clicks | Signature in midi-file |
|-----------------------|----------------------------------|------------------|------------------------|
| 3/4                   | 11/8                             | 2+11             | 11/8                   |

Once again, the time signature isn't exported correctly.


cecilios commented Jul 9, 2018

The conclusion is clear to me: the beat definition will depend on how the midi file was generated and on the interest of the generating application in preserving time signatures. Therefore, there is no solution in lomse for this. I will keep the idea of allowing the user to define what a beat is, as it is useful for other purposes, such as fine control of the metronome, and it could also solve your problem for "standard" scores. I will also provide a method move_tempo_line_to(measure, location) and related methods, as they seem more useful than the equivalent methods using measure/beat.

And for a better solution, your application will need either:
a) to analyse the midi file, build a table mapping measures to beat duration, and use this information to compute the location (time displacement from the start of the measure) for each midi measure/beat message received, then use the method move_tempo_line_to(measure, location) instead of move_tempo_line_to(measure, beat). This method is not yet available, but I will add it very soon; or
b) to control the midi file generation.
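Option (a) could look roughly like the following sketch. The table contents mirror the measures discussed earlier (20: 5/8, 21: 3/8, 22: 4/4); all names are illustrative, and move_tempo_line_to itself is not shown:

```cpp
#include <map>

const int k_whole = 256;    // whole note = 256 TU (Lomse convention)

// Measure number -> beat duration in TU, built from the midi time signature
// meta events (Yamaha-style beat = note implied by the ts bottom number).
std::map<int, int> build_beat_table()
{
    std::map<int, int> table;
    table[20] = k_whole / 8;    // 5/8 -> eighth-note beats (32 TU)
    table[21] = k_whole / 8;    // 3/8 -> eighth-note beats (32 TU)
    table[22] = k_whole / 4;    // 4/4 -> quarter-note beats (64 TU)
    return table;
}

// Location = TU displacement of the (1-based) beat from the measure start,
// ready to pass as the second argument of move_tempo_line_to.
int location_for(const std::map<int, int>& table, int measure, int beat)
{
    auto it = table.find(measure);
    int beat_dur = (it != table.end()) ? it->second : k_whole / 4;
    return (beat - 1) * beat_dur;
}
```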


hugbug commented Jul 9, 2018

Thank you for your continued work on this issue.

beat definition will depend on how the Midi file was generated and on the interest of the generator application in preserving time signature. Therefore, there is no solution in lomse for this.

It would be absolutely enough if Lomse could deal with Yamaha-beats. It is the responsibility of my app (or the user) to provide proper midi- and musicxml-files. We can assume that the midi-file contains time signatures and that the musicxml-file is in sync with it (both represent the same content and have the same time signatures).

Another alternative would be that in your application the user selects a MusicXML file, and then your app will generate the midi file and send it to the piano.

Being able to select MusicXML files and automatically generate the midi is certainly a good idea. Still, midi is the primary format for another use case in my app. The piano has a full set of MIDI voices and can play very complex midi-files. There are two intended use cases:

  1. Learning/practicing classical repertoire. Native support of MusicXML is very welcome here. The quality of scores rendered from MusicXML can be better than anything achievable via midi-file analysis. Actually, I think my app has the potential to become superior to Smart Pianist in this regard.

  2. Playing pop music (entertainment mode). In Smart Pianist the user can load a midi-file containing a full song with many instruments, including drums, guitars, etc. Such songs are available on many sites on the Internet. The user then selects a channel to play (usually, but not necessarily, a piano track), hits the play button, and the piano plays all the other channels (instruments) while the user plays his own part (piano) on the keyboard. It feels like playing in a band; a lot of fun. Smart Pianist shows the score for the piano channel. To further assist in entertainment mode, the piano has a feature called Stream Lights, where LEDs light up above the keys to help users who have trouble reading scores. The piano also has a guide mode where it waits for the user to hit the correct key.

I need to (re)implement many features to make my app really useful, so I'm afraid I will not be able to contribute to the MIDI export feature, at least not in the foreseeable future.

I'm glad I "invented" the trick of requiring two files, so that I can "outsource" part of the work to users instead of implementing midi decoding myself.


cecilios commented Jul 9, 2018

It would be absolutely enough if Lomse could deal with Yamaha-beats.

The new methods using measure/beat are there. A new method Interactor::define_beat() will be provided in a couple of days, so that applications using lomse can define how to interpret the concept of 'beat'. Changes made while the score is being played back will be ignored until playback finishes.

The default value when nothing is specified is k_beat_implied. The other implemented values are:

enum EBeatDuration
{
    k_beat_implied = 0,     ///< Implied by the time signature; e.g. 4/4 = four
                            ///< beats, 6/8 = two beats, 3/8 = one beat.
                            ///< The number of implied beats for a time signature is
                            ///< provided by method ImoTimeSignature::get_num_pulses().
                            ///< Basically, for simple time signatures, such as 4/4,
                            ///< 3/4, 2/4, 3/8, and 2/2, the number of beats is given by
                            ///< the time signature top number, with the exception of
                            ///< 3/8 which is normally conducted in one beat. In compound
                            ///< time signatures (6/x, 12/x, and 9/x) the number of beats
                            ///< is given by dividing the top number by three.

    k_beat_bottom_ts,       ///< Use the note duration implied by the time signature
                            ///< bottom number; e.g. 3/8 = use eighth notes. Notice
                            ///< that the number of beats will coincide with the
                            ///< time signature top number, e.g. 3 beats for 3/8.

    k_beat_specified,       ///< Use specified note value for beat duration.
};
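The k_beat_implied rules documented above can be paraphrased in a few lines. This is a sketch of the stated rules only, not the actual ImoTimeSignature::get_num_pulses() code:

```cpp
// Number of implied beats for a ts_top/ts_bottom time signature, following
// the rules in the k_beat_implied comment above.
int num_implied_beats(int ts_top, int ts_bottom)
{
    // compound signatures (6/x, 9/x, 12/x): top number divided by three
    if (ts_top == 6 || ts_top == 9 || ts_top == 12)
        return ts_top / 3;
    // 3/8 is normally conducted in one beat
    if (ts_top == 3 && ts_bottom == 8)
        return 1;
    // simple signatures (4/4, 3/4, 2/4, 2/2, ...): the top number
    return ts_top;
}
```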

After committing all these changes, I will also provide methods equivalent to those using measure/beat parameters but taking measure/location parameters; these could be used, for instance, if your application would like to deal with beats itself and compute the position for better control.

I'm afraid I will not be able to contribute to MIDI export feature, at least not in the foreseeable future.

No problem. It was just an idea in case you would like to have total control over the midi generation.

I'm glad I've "invented" the trick with requiring two files, so that I can "outsource" a part of work to users instead of implementing midi decoding myself.

Yes, this greatly simplifies the problem!


hugbug commented Jul 12, 2018

Beat positioning works perfectly now!

I also tried the new tempo line adjustments: they work as expected. And the auto-scrolling is much improved as well.

Thank you very much!
Appreciate your hard work.

@cecilios

Thanks for the feedback!
