An Instrument is essentially a convenience wrapper that provides you with a unique AudioChannel (a "channel strip" on the mixer) and an interface to add AudioEvents to the Sequencer. Additionally, the AudioChannel provides the instrument with a unique ProcessingChain which can apply DSP effects solely to the instrument's events.

An Instrument will take care of registering your events with the Sequencer, so the AudioEngine can render the contents of the instrument into audible output.

In a typical application, each audio source that should remain a separate entity from the other sources is registered as its own Instrument. For instance, you can have a SynthInstrument for a bass synthesizer, another one for a lead synthesizer, a SampledInstrument for playing drum loops, etc. Anything that you want to process / blend independently from other sources warrants its own instrument.

There is no limit on the number of instruments you can use in MWEngine, as long as the CPU of the device running the application can handle them!
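For example, a minimal setup with three independent sources could look as follows. This is a sketch only: the include paths are assumptions and should be adjusted to your project layout.

```cpp
#include "instruments/synthinstrument.h"   // include paths are an assumption
#include "instruments/sampledinstrument.h"

// each instrument constructs its own AudioChannel (with its ProcessingChain)
// and registers itself in the Sequencer upon construction
SynthInstrument*   bassSynth = new SynthInstrument();
SynthInstrument*   leadSynth = new SynthInstrument();
SampledInstrument* drums     = new SampledInstrument();

// the three sources can now be processed and mixed independently, e.g. by
// adding effects to the ProcessingChain of bassSynth->audioChannel
```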

BaseInstrument

The base class all instruments must extend.

constructor / destructor

BaseInstrument()

Will create a new Instrument instance, construct its AudioChannel, initialize its audioEvents and liveAudioEvents-vectors and register the Instrument inside the Sequencer (see registerInSequencer()).

~BaseInstrument()

When using C++: the destructor will invoke unregisterFromSequencer(), clear the event vectors and delete the instrument's audioChannel as well as its audioEvents and liveAudioEvents vectors. (NOTE: this will NOT delete the individual AudioEvent objects; override this behaviour in your inheriting classes when necessary.)

When using Java/Kotlin: see dispose() method below.
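Because the C++ base destructor does not delete the individual AudioEvent objects, a subclass that owns its events can take care of this itself. A minimal sketch, assuming the subclass owns all of its events outright (the class name is hypothetical and the necessary includes are omitted):

```cpp
class OwningInstrument : public BaseInstrument
{
    public:
        ~OwningInstrument()
        {
            // dispose of the AudioEvent objects this instrument owns and empty
            // the vectors, so the base destructor only needs to delete the (now
            // empty) vectors and unregister the instrument from the Sequencer
            for ( BaseAudioEvent* event : *_audioEvents )
                delete event;
            _audioEvents->clear();

            for ( BaseAudioEvent* event : *_liveAudioEvents )
                delete event;
            _liveAudioEvents->clear();
        }
};
```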

public methods

bool hasEvents()

Whether this instrument instance holds any events that can be collected by the Sequencer.

bool hasLiveEvents()

Whether this instrument instance holds any live events that should be made audible instantly by the Sequencer. These are kept apart from the sequenced events (see hasEvents()) so both types can be handled separately.

std::vector<BaseAudioEvent*>* getEvents()

Returns a vector holding all the BaseAudioEvent instances registered to this instrument. These are queried by the Sequencer, which will in turn have the AudioEngine render the events that fall within the current playback range for output.

std::vector<BaseAudioEvent*>* getLiveEvents()

Returns a vector holding all the BaseAudioEvent instances registered to this instrument that should be rendered instantly / live. These are also queried by the Sequencer, which will have the AudioEngine render them for output immediately, regardless of the current playback position.
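Conceptually, the collection step performed by the Sequencer boils down to something like the following sketch (the actual rendering is handled internally by the AudioEngine):

```cpp
// sketch of how the Sequencer queries an instrument on each render cycle
if ( instrument->hasEvents() )
{
    std::vector<BaseAudioEvent*>* events = instrument->getEvents();
    for ( BaseAudioEvent* event : *events )
    {
        // the AudioEngine checks whether the event lies within the current
        // playback range and, if so, mixes it into the instrument's AudioChannel
    }
}
// live events (see getLiveEvents()) are handled the same way, except they are
// rendered immediately, regardless of the current playback position
```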

void updateEvents()

Invoked by the AudioEngine whenever a time-related change (such as a change in BPM or time signature) occurs. This method must be implemented by inheriting classes to update the instrument's AudioEvents accordingly (for instance to update their sampleStart, sampleEnd and sampleLength properties to match the new tempo / time signature).
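A sketch of what such an override could look like for an instrument holding sequenced events; the setSampleLength() call and the tempo-derived helper are assumptions used for illustration only:

```cpp
void CustomInstrument::updateEvents()
{
    // hypothetical: the amount of samples a single sequencer step occupies at the new tempo
    int samplesPerStep = calculateSamplesPerStep(); // assumed helper

    for ( BaseAudioEvent* event : *_audioEvents )
    {
        // resize each sequenced event to match the new tempo / time signature
        event->setSampleLength( samplesPerStep ); // assumed setter
    }
}
```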

void clearEvents()

Flushes all registered AudioEvents from the instrument (invokes removeEvent() on each individual event present in the event lists).

bool removeEvent( BaseAudioEvent* aEvent, bool isLiveEvent )

Convenience method to remove a single given aEvent from the instrument's event list. Returns a boolean indicating whether the event has actually been removed. If you want to remove an event while the Sequencer is playing, it is recommended NOT to invoke this method directly, but to call setDeletable( true ) on the AudioEvent instead (meaning it will no longer be used by the instrument and can be disposed). The reasoning is that the AudioEngine might be reading from the event's buffers at the moment its removal is requested. The deletion is therefore postponed (by marking the event as deletable) until the Sequencer queries whether the event is eligible for playback. Only when it finds the event marked as deletable will it invoke this method. It is then up to the instrument to determine what should happen (for instance invoke the actual destructor now that it is safe to dispose of the event, or keep the event in a pool if it is likely to be re-used at a later stage).
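Both removal strategies, using only the calls described on this page:

```cpp
// while the Sequencer is playing: only flag the event, the Sequencer will
// invoke removeEvent() itself once it is safe to do so
event->setDeletable( true );

// while the Sequencer is stopped it is safe to remove the event directly;
// the second argument indicates whether it was registered as a live event
bool removed = instrument->removeEvent( event, false );
```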

void registerInSequencer()

Will register the instrument instance inside the Sequencer. From this point on, the Sequencer will query the instrument for AudioEvents that are eligible for playback at the Sequencer's current playback position. This method also has a safeguard to prevent the same instrument from being added multiple times.

void unregisterFromSequencer()

Will unregister the instrument instance from the Sequencer, ensuring that the instrument's AudioEvents are no longer played back. This is typically only used when deleting the instrument; if you wish to temporarily silence the instrument, mute its AudioChannel instead, as this also avoids unnecessary CPU usage.
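For instance (muting via the AudioChannel is shown as an assumption here; consult the AudioChannel documentation for its exact API):

```cpp
// temporarily silence the instrument: mute its AudioChannel so its contents
// are skipped during processing, while it remains registered in the Sequencer
instrument->audioChannel->muted = true; // assumption: public muted flag

// when removing the instrument entirely, deleting it suffices: the destructor
// invokes unregisterFromSequencer() itself
delete instrument;
```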

public properties

AudioChannel* audioChannel

Pointer to the AudioChannel that is used to output the instrument's contents.

int index

The index of this instrument inside the Sequencer's instrument vector.

void dispose()

When using Java/Kotlin: to be invoked when the instrument is removed from the application. Afterwards you can break references to the instrument so it will be garbage collected. This will also dispose of the created caches.

protected properties

std::vector<BaseAudioEvent*>* _audioEvents
std::vector<BaseAudioEvent*>* _liveAudioEvents

The vectors that will hold the sequenced and live events for the Instrument.

SampledInstrument

A SampledInstrument is the base Instrument class to use when all of its AudioEvents are SampleEvents (i.e. their source is a fixed sample buffer, likely read from an external file). A SampledInstrument has both sequenced and live events.

constructor / destructor

SampledInstrument()
~SampledInstrument()
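A usage sketch; the SampleEvent constructor and the SampleManager lookup shown here are assumptions used for illustration, consult the SampleEvent / SampleManager documentation for the actual API:

```cpp
SampledInstrument* drums = new SampledInstrument();

// hypothetical: create a SampleEvent owned by this instrument and point it
// at a sample that was previously registered with SampleManager
SampleEvent* kick = new SampleEvent( drums );
kick->setSample( SampleManager::getSample( "kick" ));
// position the event within the sequence using the SampleEvent API
```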

SynthInstrument

A SynthInstrument describes an instrument that will synthesize its audio, rather than play back samples.

constructor / destructor

SynthInstrument()
~SynthInstrument()

Will delete the ADSR and RouteableOscillator instances as well as the AudioEvent vectors for both the sequenced and live events. Will also invoke the BaseInstrument destructor by default.

public methods

override

void updateEvents()

Will invoke invalidateProperties() (see SynthEvent) on both the sequenced and live events.

protected methods

void init()

Will construct new ADSR, Arpeggiator and RouteableOscillator instances which can be used by the added AudioEvents to apply envelopes and additional effects to the synthesized content. Will also initialize the public properties and construct the instrument's AudioChannel and its vectors for holding the sequenced and live events.

public properties

int waveform
int octave
int keyboardOctave
float keyboardVolume

Properties of the synthesizer.

bool osc2active
int osc2waveform
float osc2detune
int osc2octaveShift
int osc2fineShift

Properties of the synthesizer's secondary oscillator (see SynthEvent).

Arpeggiator* arpeggiator
bool arpeggiatorActive
RouteableOscillator* rOsc
ADSR* adsr

The modules that are applied to the synthesized contents of the AudioEvents.
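As these are public properties, configuring a SynthInstrument amounts to assigning to them directly. A sketch (the waveform values are placeholders; use the waveform constants defined by the engine):

```cpp
SynthInstrument* lead = new SynthInstrument();

lead->waveform = 0;           // placeholder: use one of the engine's waveform constants
lead->octave   = 4;

// enable and detune the secondary oscillator
lead->osc2active      = true;
lead->osc2waveform    = 2;    // placeholder waveform constant
lead->osc2detune      = 0.5f;
lead->osc2octaveShift = -1;

// have the arpeggiator module process all synthesized events
lead->arpeggiatorActive = true;
```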