How do you use VexFlow? How well has it worked for you? #520

Open · Silverwolf90 opened this issue Jan 1, 2017 · 31 comments

Comments

Silverwolf90 (Contributor) commented Jan 1, 2017

Hi everyone, I wanted to get an idea of how everyone is using VexFlow, particularly to help guide my contributions to the project. Some detail about how your system hooks together would be useful.

For a personal project, I used VexFlow as the formatting/rendering layer to build a music notation editing engine. I designed my own "model" layer, which was manipulated by the user, and I rebuilt/formatted/drew VexFlow objects whenever that model changed. Since my primary concern was prototyping and iterating on a web-based notation editor, I was not concerned with flexibility around visual quality or styling. VexFlow worked quite well for this, but I also have a comprehensive understanding of the codebase.
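
The rebuild-on-change pattern looked roughly like this (a minimal sketch rather than the actual code; the model shape and the onChange hook are made up, and only the drawing calls assume the classic Vex.Flow API):

```js
// Sketch: rebuild the VexFlow objects from scratch whenever the model changes.
// `model` is a hypothetical model layer; only the Vex.Flow calls are real API.
const VF = Vex.Flow;

function render(container, model) {
  container.innerHTML = ''; // throw away the previous drawing
  const renderer = new VF.Renderer(container, VF.Renderer.Backends.SVG);
  renderer.resize(model.width, 200);
  const context = renderer.getContext();

  const stave = new VF.Stave(10, 40, model.width - 20);
  stave.addClef('treble').addTimeSignature(model.timeSignature);
  stave.setContext(context).draw();

  const notes = model.notes.map(
    (n) => new VF.StaveNote({ clef: 'treble', keys: n.keys, duration: n.duration })
  );
  VF.Formatter.FormatAndDraw(context, stave, notes);
}

// Usage: every user edit mutates the model, then triggers a full rebuild, e.g.
//   model.onChange(() => render(document.getElementById('score'), model));
```

It's brute force (everything is torn down and redrawn on every edit), but for prototyping that was good enough.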

At my current job at Tido, we built a dynamic notation renderer with a highly modified build of VexFlow. Again, VexFlow objects were generated from a data model. This rendering engine appears in the iPad app Mastering the Piano With Lang Lang. But even for Level 1 of the series (i.e., basic educational piano), VexFlow proved to be inadequate for the quality and flexibility we required. So we only used it as a glorified glyph painter and wrote our own spanning elements (e.g., hairpins, slurs) and our own formatting pipeline.

gristow (Collaborator) commented Jan 2, 2017

In real life I direct the choirs at Oberlin Conservatory -- but on the side I'm trying to solve a bigger problem: more students abandon music degrees because they're unprepared for music theory & ear training than for any other reason.

So, I created uTheory, an interactive learning platform for music theory, rhythm and ear training.

VexFlow is great for this: rendering relatively simple notation from dynamically generated questions or user input. It makes a real-time Sibelius-like editing experience in the browser possible.

@Silverwolf90, you nailed the weakness: the spacing and collision issues involved in preparing more complex scores. We've done little to envision the magnetic or elastic-type formatting structure that would probably be needed.

Anyway -- some snapshots of the ways I'm using VexFlow in uTheory are below.

And here's a video of it giving real-time feedback on a student rhythm performance.

[Screenshots: chord-draw, key-signature, play-music, roman-numeral]

Silverwolf90 (Contributor, Author) commented Jan 2, 2017

@gristow How are you storing these snippets? Is the notation "hardcoded" VexFlow, or are you generating VexFlow objects dynamically based on some sort of data?

mturley commented Jan 2, 2017

@Silverwolf90 I'm curious to hear more about your music notation editing engine. Is this a project you plan to commercialize? Release the source on GitHub? A pet project, or work for an employer? What's the rest of your stack aside from VexFlow?

I ask because I have been planning an elaborate/ambitious music notation and collaboration suite that I want to try to get off the ground, and I've been eyeing VexFlow as a renderer. Some of the discussion here around using it with React is interesting to me, because I am planning to work with React.

Of course I plan to contribute back any core pieces that are useful to other Vexflow users, but I'm undecided on whether I would try to take a commercial angle or make it a community project. Perhaps we can work together on something. I have a lot of crazy ideas :)

gristow (Collaborator) commented Jan 2, 2017

@Silverwolf90, similar to what you've done, I use a model layer (called Music internally) for representing music abstractly, with a bridge to VexFlow to render the model. The bridge takes a Music object and params on layout (viewbox dimensions, etc...) and then dynamically generates & draws VexFlow objects as needed based on that. (The VexFlow build is very slightly customized -- better rhythmic beaming, some additional custom glyphs and modifiers for giving feedback on answers. I've tried to share most of these changes upstream to VexFlow so the code bases don't drift too far apart.)

When I need to persist more complex questions (melodies, rhythm reading, etc.) to a database, the model can be stored or refreshed via an EasyScore-like string format we call MusicFile. (It has lots of additional metadata related to defining fixed vs. editable elements and playback/dynamic/articulation information.)

(In addition to MusicFile, we can also create a Music model via real-time transcription from MIDI -- which is currently the fastest way to enter single-line melodies/rhythms... and was a fun challenge to implement in a browser! These are the bones behind the live rhythm performance exercises in the browser.)

Silverwolf90 (Contributor, Author) commented:

@gristow Thanks for the details, that's helpful.

@mturley Sorry but I haven't worked on the project in years! Just wanted to bring it up in context :)

cedvdb commented Jan 6, 2017

I wrote a bridge from my sheet music class so VexFlow can render it, either as lines of four measures for a big screen or as a single measure that scrolls while the sheet is playing. Reading the other comments, I see everyone is doing the same thing, and I know why. I'm thinking of hosting my bridge so people don't have to rewrite it each time; honestly, that bridge could be the VexFlow API. VexTab doesn't cut it -- there is no grand staff support...

Let me say this: you guys did an awesome job on making things doable. Music notation is complex, I get it.

That being said, I have to say you are missing one layer of abstraction. "Make the simple easy and the complex possible" should be the mantra of any API -- I strongly suggest watching this talk if you want to make a better API: https://www.youtube.com/watch?v=loj3CLHovt0. Putting measures next to each other doesn't fall into the realm of the complex; it should be a given. Specifying the number of measures per line isn't complex either; it should be a given. This is needlessly low level.

Instead, I have to calculate the position of each stave myself and lay things out accordingly. This comes with a lot of issues. The part of my application dealing with VexFlow is by far the most complex; I have a 300-line class that deals with VexFlow, and it's a mess! (I could have split it up, but I wanted everything to stay in one place.)
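
To give a sense of the kind of low-level layout work this means, the per-measure positioning boils down to something like this (a rough sketch with made-up data, not my actual class; only the Vex.Flow calls are standard API):

```js
// Sketch of manual measure layout: 4 measures per line, positions computed by hand.
// `measures` (an array of StaveNote arrays) is a made-up data shape.
const VF = Vex.Flow;
const MEASURES_PER_LINE = 4;
const MEASURE_WIDTH = 240;
const LINE_HEIGHT = 120;

function drawSheet(context, measures) {
  measures.forEach((notes, i) => {
    const x = 10 + (i % MEASURES_PER_LINE) * MEASURE_WIDTH;          // column
    const y = 40 + Math.floor(i / MEASURES_PER_LINE) * LINE_HEIGHT;  // row
    const stave = new VF.Stave(x, y, MEASURE_WIDTH);
    if (i === 0) stave.addClef('treble').addTimeSignature('4/4');
    stave.setContext(context).draw();
    VF.Formatter.FormatAndDraw(context, stave, notes);
  });
}
```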

If you want to take a look at my code, I'd be happy to host it on GitHub. I'm not claiming it's well written, though -- actually, it's the worst piece of code I've ever written, so I plan on rewriting it.

That being said, I want to make it clear that I understand the problem you're tackling is a complex one, and I think you've managed the internals pretty well. However, the end user doesn't have to know that; ultimately, they shouldn't even notice it.

Some other things:

  • The position of the staves doesn't update when the screen is resized. (I had to use transform: scale to keep the correct positions while resizing the window.)

Some API projects I wanted to start, or have somewhat started:

  • A sheet API that bridges from the sheet model to the rendering, so there is a usable interface.
  • JSON compatibility, so a JSON file can be transformed into a sheet and a sheet into VexFlow objects.
  • A MIDI API that takes MIDI input and transforms it into the sheet model.

infojunkie (Contributor) commented Jan 7, 2017

Thanks for starting this discussion, it's really interesting to read about other members' use cases.

As for me, I am working on a sheet music player / musician's notepad app that aims to help non-professional practitioners (like myself) study the jazz pieces they are working on.

The idea is to start from fake sheets (à la Real Book) and be able to:

  • play the sheet (including auto-generated accompaniment using MMA)
  • perform transformations on the sheet (e.g. add a voice 6th below each note)
  • identify features of the music, e.g. tonics
  • change the tempo, the rhythm, the tuning, etc.

I have a very primitive demo running here: http://ethereum.karimratib.me:8080/ - choose the Bach piece for best results, and hook the output up to an existing MIDI synth on your local machine, such as TiMidity. The MIDI music info for the Bach piece is fully parsed from the VexFlow structure.

Here's the demo's source code. Disclaimer: this code is most probably a throwaway prototype.

PCrompton commented:

I have a very simple use case: displaying music flash cards one note at a time. I built this iOS app, which I plan to release once I iron out some problems: https://github.com/PCrompton/Play-That-Note.git

It renders well on iPhone in portrait mode, but I still need to figure things out for iPad and for iPhone in landscape.

Latke89 commented Jan 13, 2017

I'm using this, along with a few other libraries, to try to create an online ear trainer. It's still in its early stages, and I'm still in the early stages of learning, but so far it's been a whole lot of fun and frustration. At the moment, I'm dynamically generating intervals, chords, and scales, and once I figure out how to work with it, I'd like to implement user input by clicking on the staff. This is what I have completed so far.

andiejs commented Mar 1, 2017

Hi, thanks for asking this question, Silverwolf, and sorry it took me so long to respond!

I'm working on a system for doing computational music analysis. We have a DSL in which to write analytic algorithms, and the output can include generated score annotations. VexFlow is used to draw the scores, and then layers of annotations are added.

There is a sublanguage for specifying drawings on a score, including coloring notes, boxes, beams, lines, text, etc. (right now only a small set). These can be hierarchically related, so you can put beams on boxes around beams, and so on. The same drawing scripts can be used to annotate a piano roll as well.

I didn't implement the VexFlow interface and drawing system myself (I develop the analytic system, language, etc.), but I inherited the code 18 months ago and have been maintaining and adding to it slowly since then. At this point our score rendering doesn't take anything like full advantage of VexFlow's capabilities...

Some images are attached.

Andie

[Images: beams, c_min, detail2, dissonance, sets3, vic_good, z_schemas]

eliot-akira (Contributor) commented Mar 19, 2017

Hi @Silverwolf90, thank you for your efforts in the further development of VexFlow, and for opening up this discussion. Here is another use case, as more food for thought.

[Screenshot: notation]

It's a training app for musicians (for now focused on guitar players), built on React, served by Node.js.

Each exercise is a set of text and MusicXML file(s) with notation and tablature. The XML is parsed and processed into a custom data format (stored as JSON), and fed into a component called Notation, which creates a DOM element reserved for VexFlow. The algorithm was written from scratch to translate and render every single notation element from MusicXML, instantiating VexFlow classes like voices, staves, key/time signatures, notes, beams, ties, triplets, and so on. I'm getting to know the inner workings of the library pretty well through this process.
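
In outline, the Notation component just reserves a DOM node and redraws whenever the parsed score data changes. A minimal sketch (the `score` prop shape is made up and this is not the actual component; the drawing calls assume the usual Vex.Flow API):

```js
import React, { useEffect, useRef } from 'react';
import Vex from 'vexflow';

// Rough sketch of a Notation-style wrapper: a ref'd <div> reserved for VexFlow,
// cleared and redrawn whenever the (already parsed) score data changes.
export default function Notation({ score }) {
  const containerRef = useRef(null);

  useEffect(() => {
    const VF = Vex.Flow;
    const container = containerRef.current;
    container.innerHTML = ''; // clear the previous render

    const renderer = new VF.Renderer(container, VF.Renderer.Backends.SVG);
    renderer.resize(score.width, score.height);
    const context = renderer.getContext();

    const stave = new VF.Stave(10, 40, score.width - 20);
    stave.addClef(score.clef).addTimeSignature(score.time);
    stave.setContext(context).draw();

    const notes = score.notes.map(
      (n) => new VF.StaveNote({ clef: score.clef, keys: n.keys, duration: n.duration })
    );
    VF.Formatter.FormatAndDraw(context, stave, notes);
  }, [score]);

  return <div ref={containerRef} />;
}
```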

Additional features built around VexFlow:

  • Responsive measures: measure widths adapt to the number of notes and the available screen width, and everything is redrawn on the window resize event (see the sketch after this list)
  • Notation player: soundfont/sample playback of the (processed) notes with flexible tempo; optional looping and backing track/accompaniment
  • Highlighting of the currently playing note, based on a map of X/Y positions for all rendered note instances
  • Horizontal scroll: synced to the player, so the current note/chord stays in the center of the screen; some fancy calculation was needed, based on the total width of all measures, the available screen width, the total duration of playback, and a pixels-per-millisecond ratio
  • Fretboard and keyboard components (showing finger positions) that are synced to the player and notation
  • Utilities for music theory: generating scales/chords from a given root, type, etc.
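
For the responsive measures, the core is deciding how many measures fit per line given a minimum width per note, then clearing and redrawing on resize. A rough sketch (the constants and data shapes are assumptions, not the app's real code):

```js
// Sketch: group measures into lines based on the container width and a
// minimum width per note, then redraw whenever the window resizes.
const MIN_WIDTH_PER_NOTE = 25; // assumed spacing constants
const MIN_MEASURE_WIDTH = 120;

function layoutLines(noteCountsPerMeasure, containerWidth) {
  const widths = noteCountsPerMeasure.map(
    (n) => Math.max(MIN_MEASURE_WIDTH, n * MIN_WIDTH_PER_NOTE)
  );
  const lines = [];
  let line = [];
  let used = 0;
  widths.forEach((w, i) => {
    if (used + w > containerWidth && line.length > 0) {
      lines.push(line);
      line = [];
      used = 0;
    }
    line.push({ measureIndex: i, width: w });
    used += w;
  });
  if (line.length > 0) lines.push(line);
  return lines; // each entry is one system: measures with their widths
}

window.addEventListener('resize', () => {
  const container = document.getElementById('score');
  const lines = layoutLines([4, 8, 3, 6], container.clientWidth); // example note counts
  // ...then clear the container and redraw each system with VexFlow using these widths...
});
```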

By far the most effort has been spent on manually rendering the data from MusicXML using VexFlow's API. As others have noted, there are many undocumented features that I had to discover by diving into the code (which is impressive and well organized -- much respect to @0xfe and contributors!).

A few wishes for future development of VexFlow:

  • Backward compatibility
  • More thorough documentation and examples
  • Higher-level API, with sensible defaults built in, like adaptive height/widths
  • Modularity and maybe a build process that allows minimizing library size
  • VexFlow "plugins"? It seems that people using VexFlow are building various custom features on top, from which others could benefit. (Perhaps an API already exists for this and I'm just not aware of it.)

amitgur commented Apr 25, 2017

Hello everyone

I'm using VexFlow in https://BandPad.co. BandPad is an online practice environment for elementary school programs. Here is a score demo.

I started coding this in 2013. At that time I diverged from the GitHub code and changed some features, like the ties, which I wrote myself, among other things. I plan to integrate with the current VexFlow code.

Some of the things I implemented:

  1. Responsive score, without changing the measure positions on the page.
  2. Real-time rendering to create the note indicator during playback.
  3. An XML-to-VexFlow implementation.
  4. Pitch detection with the Web Audio API, which you can see working in the games or the trophy contest.

When I started this project I didn't know JS, so my code is bad; I hope to fix it soon. Most of my problems were with laying out the bars and the page on the canvas, and with the Raphael vectors and fonts.

khaschuluu commented:

Hi @eliot-akira. Your application looks (and sounds) good. How do you highlight (draw the cursor) on the rendered score?

eliot-akira (Contributor) commented Aug 2, 2017

Thank you, @khaschuluu. For drawing the cursor, I placed an element with absolute position and a transparent background on top of the notation canvas; using that overlay element, the highlight bar is drawn as a border, with margin top/left to change its position. Then, responding to note events from the player, I change the X/Y positions of the highlight (and also change the note color via VexFlow tickables).
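
In case it's useful, the overlay boils down to something like this rough sketch (not the actual code; reading the X position from a rendered note's getAbsoluteX() is just one way to feed it):

```js
// Sketch of the overlay cursor: an absolutely positioned, transparent element
// on top of the notation container, moved as the player reports note events.
const scoreEl = document.getElementById('score'); // container holding the VexFlow SVG/canvas
scoreEl.style.position = 'relative';

const cursor = document.createElement('div');
cursor.style.position = 'absolute';
cursor.style.top = '0';
cursor.style.height = '100%';
cursor.style.background = 'transparent';
cursor.style.borderLeft = '2px solid orange'; // the highlight bar
cursor.style.pointerEvents = 'none';          // let clicks pass through to the notation
scoreEl.appendChild(cursor);

function moveCursorTo(x) {
  cursor.style.left = `${x}px`;
}

// Called from the player's note events, e.g.:
//   moveCursorTo(staveNote.getAbsoluteX());
```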

The integration of MusicXML, sample player note events, and notation/highlighting was complex, and it's still a work in progress. It would be great if there were modular, open-source collaboration on some of these features.

khaschuluu commented:

Thank you for your answer, @eliot-akira. I'm developing a player that highlights the score in sync with the playing audio. I need to write a highlighter (scrolling cursor) function that takes a time (in milliseconds or something like that) as an argument and then draws/animates the cursor at the corresponding place on the score. Tickables may help me, but how do I know the time relations between ticks? Do you have any ideas? I'm currently researching this.

eliot-akira (Contributor) commented Aug 6, 2017

@khaschuluu, for time relations between ticks, I extracted all note data (during MusicXML parsing) and converted note durations to "relative beat index", a number starting from 0, with 1 as a unit based on time signature, usually a quarter note. That beat index can be scaled by tempo to get the exact time in milliseconds. With that, I made a map of each tickable's beat index and X/Y position. The cursor function listens to note events from the sample/MIDI player, and, using the current beat index and map, figures out the cursor position and note highlights. I'm actually in the process of rewriting this (and other) parts of the application, so there may be better approaches.
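
Roughly, the timing math looks like this (a simplified sketch, not my actual code; the data shapes are made up):

```js
// Sketch of beat-index timing: note durations become a running "beat index"
// (quarter note = 1), which scales by tempo into milliseconds; a map from beat
// index to the rendered note's X/Y then drives the cursor.
const BEAT_MS = (tempoBpm) => 60000 / tempoBpm; // one quarter note, in ms

// Durations expressed in beats: eighth = 0.5, quarter = 1, half = 2, ...
function withBeatIndex(notes) {
  let beat = 0;
  return notes.map((n) => {
    const entry = { ...n, beatIndex: beat };
    beat += n.beats;
    return entry;
  });
}

function beatToMs(beatIndex, tempoBpm) {
  return beatIndex * BEAT_MS(tempoBpm);
}

// Example: at 90 bpm one beat is ~666.7 ms, so a note starting on beat 3 begins ~2000 ms in.
const timed = withBeatIndex([{ keys: ['c/4'], beats: 1 }, { keys: ['d/4'], beats: 0.5 }]);
console.log(beatToMs(timed[1].beatIndex, 90)); // ~666.7
```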

pellens commented Jan 5, 2018

I have made my own little version of http://my.vexflow.com/, making it a little easier to organize your tabs (the idea came from Tablab.io, but that site is down, so I started a new project like it...). Soon I will put it online for other people to play with!

One feature that would be really cool is the possibility of adding "bass" tabs -- not only for guitar, but also for bass guitar. That way it would be really easy for bands to write down their music as simply as possible...

Any thoughts on this?

SalahAdDin commented:

@amitgur Man, is your project open source?

amitgur commented Jan 7, 2018

Nope

SalahAdDin commented Jan 7, 2018

Oh, that's unfortunate, @amitgur. I want to make a similar project, but for another instrument.

deemaagog (Contributor) commented Mar 4, 2018

Hi everyone,
Please take a look at my project. It's an interactive piano blues builder, and the source code is open.
Some features:

  • Responsive and scalable dynamic musical notation
  • Playback with tempo adjustment and highlighting of the currently playing note
  • Key transposition
  • Swinging eighth notes

sschmidTU (Contributor) commented Jul 31, 2018

We at OSMD (OpenSheetMusicDisplay) use VexFlow to render MusicXML files in the browser, parsed and converted to VexFlow objects.
OSMD is open source, and we often contribute improvements back to VexFlow.
You can try dropping your XML files into our demo, though lately we've made a lot of improvements, like adding ornaments and grace notes, and fixing lyrics overlap. We will publish a new release soon.

Here's an example (Schubert - An die Musik), parsed from XML:
[image]

Our core parsing and modeling code was adapted from our iOS/Android/Windows app, PhonicScore/PracticeBird. We could have adapted that render engine for the browser as well, but we wanted an open source project of our own, to enable even more collaboration.

We always render with VexFlow multiple times, e.g. when the browser is resized, so we often find issues where the shifting of notes gets applied on each draw call, resulting in misplaced elements. I fixed a few of these this month.

Currently we're working on adding expressions (e.g. dynamics), slurs, audio playback, a plugin infrastructure (so you can write plugins), etc.

Collaboration is always welcome!

imiskolee commented:

@sschmidTU Thanks for your work. I'd like to know: why not integrate a soundfont into your project?

sschmidTU (Contributor) commented Aug 18, 2018

@imiskolee Audio playback is on our to-do list; have a look at OSMD's issues.

mscuthbert (Collaborator) commented:

Let's please not integrate sound playback into VexFlow -- that's a better thing to manage at a higher level than the display level. Tutorials on how to do it would be helpful, but I don't think it should be in the graphics rendering engine. music21j has a part that handles VexFlow display, a part that handles MIDI.js playback, and a system for keeping the two in sync. It seems better to do something like that than to try to have the graphics rendering system also keep track of what the current tempo is, which instrument is playing which line on a short score, and so on.
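
To illustrate the separation with a minimal sketch (the player and notation-view interfaces here are hypothetical, not music21j's actual API): neither module imports the other, and a thin coordinator subscribes to the player and drives the display.

```js
// Sketch: playback and display stay separate; a small coordinator keeps them in sync.
// The `player` and `notationView` interfaces are hypothetical.
function syncDisplayToPlayback(player, notationView) {
  player.on('noteStart', ({ noteIndex }) => notationView.highlightNote(noteIndex));
  player.on('noteEnd', ({ noteIndex }) => notationView.unhighlightNote(noteIndex));
  player.on('stop', () => notationView.clearHighlights());
}
```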

sschmidTU (Contributor) commented Aug 21, 2018

@mscuthbert Nobody wants to integrate sound playback into VexFlow, I don't think. If you read my previous comment, it should be clear that I'm talking about audio playback for OSMD, not VexFlow.
This thread is about the many ways in which people use VexFlow.

tcdaly commented May 15, 2020

I see audio playback of a score has been mentioned a few times in this thread. Has any open source software been developed that will do this, including a vertical line that visually indicates playback progress on the score?

AaronDavidNewman (Collaborator) commented May 15, 2020 via email

tcdaly commented May 16, 2020

Thank you, @AaronDavidNewman -- I look forward to checking it out. It looks like this team is also working on an audio playback feature this year: opensheetmusicdisplay/opensheetmusicdisplay#258 (comment)

jzohrab commented Apr 11, 2021

Hi all, I'm using VexTab for a simple tool called Tabbyhack that generates guitar tab from a user playing a guitar. The mic listens and generates candidate tab, which the user then edits. I added VexTab because it really looks cool. :-) Tabbyhack is deployed to GitHub Pages: https://jzohrab.github.io/tabbyhack/ Thanks! jz

simonbytez commented Jul 31, 2021

truechops.com, a drum practice site.

You can input beats, play them back, save them, generate new ideas, and generate shareable links.

Here are some links that I built out:

https://truechops.com/r/kZYolJB6G (drumline)
https://truechops.com/r/bjoRXytHO (snare lick)
https://truechops.com/r/JFdoO53M8 (snare dynamic lick)
https://truechops.com/r/RFQWj33q2 (dynamic drumset)

Stack: React, Redux, MongoDB, GraphQL, Apollo Client, Next.js, Tone.js.

Contact jared@truechops.com if you want to contribute.

ronyeh pinned this issue Nov 4, 2021
ronyeh changed the title from "What is your use case for VexFlow? How well has VexFlow worked for you?" to "How do you use VexFlow? How well has it worked for you?" Nov 4, 2021
ronyeh unpinned this issue Nov 6, 2021