
Tempo Drift on iPhones/iOS 8 but not iPad/iOS 9, possibly linked to mHostTime difference #20

Closed
djblake opened this issue Apr 1, 2016 · 8 comments

@djblake
Collaborator

djblake commented Apr 1, 2016

Hey all,

This isn't exactly a Link issue per se, but I know of no other place (besides the Audiobus developer forums) where there are experts on Core Audio that might be able to offer some insight into this issue. It also could potentially affect other developers integrating Link.

I have implemented Link in my iOS app, and all is working great. However, last night I noticed a slight tempo drift on my iPhone compared to my iPad. Upon further testing I observed that at 100 bpm (6615.0 frames per 16th note), over the course of 1 minute the iPhone would fall behind Ableton Link by about 0.038 beats (i.e. the first beatTime value was 1.000 and after a minute the beat time was about 99.962), which is about 15-16% of a 16th note. This drifts at a reasonably linear rate and would result in a noticeable discrepancy between the beats after a few minutes when comparing against another Link app running on the iPhone, or against my own app running on the iPad at the same time. Indeed, running the app (identical code base) on my iPad Air 1 (iOS 9.0) produced rock-solid results with a beat discrepancy of 0.000001 per beat, whereas both my iPhone 6 (iOS 8.4) and iPhone 5 (iOS 8.1) produced this tempo drift to the same degree.

I figure it must be Core Audio related if the same code produces two different results, and I noticed that on my iPad the difference between the inTimeStamp->mHostTime values at consecutive callbacks was 139320, whereas on my two iPhones the mHostTime difference between callbacks was 139264.

139320 - 139264 = 56 fewer mHostTime ticks per callback on the iPhones, which as a fraction is 56 / 139320 = 0.00040195

44100 * 60 = 2,646,000 frames per minute
2,646,000 * 0.00040195 = 1063.5597 frames drift

As a fraction of a 16th note (6615 frames), this is 1063.5597 / 6615 = 16.078%, suspiciously close to the 15-16% of a 16th note drift I observed after 1 minute.

So I then added some code to the app that multiplies the frames per 16th note by a ratio of (139320 / 139264), e.g.:

    Float64 newSamples = 44100.0 / ((newTempo * 4.0) / 60.0); // = 6615 when newTempo is 100
    newSamples = newSamples * (139320.0 / 139264.0);

This produces very consistent results, with no tempo drift after many minutes:

    beatTime = 217.999508 bpm = 99.959805 samplesPerBeat = 6617.659984 quantime = 4.000000
    beatTime = 218.999527 bpm = 99.959805 samplesPerBeat = 6617.659984 quantime = 4.000000
    beatTime = 219.999504 bpm = 99.959805 samplesPerBeat = 6617.659984 quantime = 4.000000
    beatTime = 220.999520 bpm = 99.959805 samplesPerBeat = 6617.659984 quantime = 4.000000

However, obviously I can't just apply this dodgy fix, as I have no idea what's really causing it. Would a discrepancy in mHostTime actually cause this? (i.e. does it mean the buffer is delivered to the speakers at an earlier time?) Is the cause iPhone vs iPad hardware, or iOS 8 vs iOS 9? I would love to know what mHostTime results other people are getting on their devices (i.e. save the mHostTime between callbacks, and in each callback just compute newHostTime - oldHostTime). Is no one else experiencing this issue? I have compared other Link-enabled apps on my iPhone and they don't seem to suffer from any drift, but I'm not sure whether that's because they constantly recalibrate themselves from Link's beat time, or whether they just set a new tempo once at the start and let it run until notified of any tempo changes, as I am doing.
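If anyone wants to check, something like this in the render callback is all it takes (a minimal sketch; sLastHostTime is just an illustrative static, and in a real app you'd log from outside the audio thread):

    #include <AudioUnit/AudioUnit.h>
    #include <mach/mach_time.h>

    static uint64_t sLastHostTime = 0;

    static OSStatus renderCallback(void *inRefCon,
                                   AudioUnitRenderActionFlags *ioActionFlags,
                                   const AudioTimeStamp *inTimeStamp,
                                   UInt32 inBusNumber,
                                   UInt32 inNumberFrames,
                                   AudioBufferList *ioData)
    {
        if (sLastHostTime != 0) {
            const uint64_t delta = inTimeStamp->mHostTime - sLastHostTime;
            // Convert host ticks to nanoseconds so deltas are comparable
            // across devices with different timebases.
            mach_timebase_info_data_t timebase;
            mach_timebase_info(&timebase);
            const uint64_t deltaNanos = delta * timebase.numer / timebase.denom;
            (void)deltaNanos; // record this somewhere real-time safe rather
                              // than printf-ing on the audio thread
        }
        sLastHostTime = inTimeStamp->mHostTime;
        // ... render audio into ioData as usual ...
        return noErr;
    }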

Anyway, sorry for the long explanation. Any insight anyone could provide would be greatly appreciated!

Cheers,
David

Edit: Just tested on my iPhone 4 (iOS 7.1.2), and although I can't run Link on it, I'm getting an mHostTime difference of 139320, like the iPad.

@lijon
Collaborator

lijon commented Apr 1, 2016

I'm no expert, but I'm interested in these things as well :)
A different delta in mHostTime should mean that the two devices' clocks are running at slightly different speeds, no? Also, I'd expect this delta to vary over time on most 32-bit devices, due to the jitter between mHostTime and mSampleTime that is evident on those devices.
I'm not sure how you're syncing to Link, but just following the tempo is not enough. Each buffer cycle you must check the actual beat time from Link, passing it the current mHostTime (plus your device's output latency, etc.). That's the only clock source you can trust if you want to stay in sync with Link. Unfortunately, due to the jitter mentioned above, the amount of Link time passing between two buffers might vary even with a stable, unchanging tempo, so if you recalculate beat time into sample time, the delta sample time will not equal the buffer frame size. And even on devices with less jitter, the beat time might be adjusted and shifted when Link tries to keep all peers in sync.
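In code, that per-buffer check is something like this sketch (ABLLinkBeatTimeAtHostTime is the same LinkKit call David quotes further down; outputLatencyHostTicks stands for your device's output latency converted to host ticks):

    // Once per render callback: ask Link for the beat time at the moment
    // this buffer will actually reach the speaker.
    const UInt64 hostTimeAtOutput = inTimeStamp->mHostTime + outputLatencyHostTicks;
    const Float64 beatTimeAtOutput = ABLLinkBeatTimeAtHostTime(ablLink, hostTimeAtOutput);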

Getting a sample-precise sampleTime out of Link is something I'd like to have myself, but I haven't found a way to get it. AUM's file player, for instance, uses its own stable sample clock and then adjusts the playback speed dynamically using a VariSpeed AU, where the rate is controlled by the ratio between the ideal tempo and the actual Link tempo for the current buffer (which will almost never be the same as the current Link session tempo). But this means that recording exactly 4 beats at 120 bpm and 44.1 kHz does not yield exactly 88200 frames of audio, which is annoying!
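Roughly like so (a hedged sketch, not AUM's actual code; varispeedUnit is assumed to be an instantiated kAudioUnitSubType_Varispeed AudioUnit, and actualLinkTempo whatever tempo you derive from the Link beat delta for the current buffer):

    // Drive the Varispeed rate from the ratio of the actual Link tempo to
    // the nominal session tempo, so playback tracks the Link clock.
    const Float64 idealTempo = 120.0; // nominal session tempo
    const Float64 rate = actualLinkTempo / idealTempo;
    AudioUnitSetParameter(varispeedUnit,
                          kVarispeedParam_PlaybackRate,
                          kAudioUnitScope_Global,
                          0,
                          (AudioUnitParameterValue)rate,
                          0);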

/Jonatan


@djblake
Collaborator Author

djblake commented Apr 3, 2016

Hey Jonatan,

Thanks a lot for your reply, it really helped me. I now call my resync-frames function every callback instead of just when the tempo changes, and it stays in perfect sync now. The amazing thing is that even on the devices with the slightly slower clock, the Link session running on that device was running at the proper BPM rather than the slightly slower one, so it makes me wonder what they're calibrating their own time from!

Your Varispeed method sounds like quite a smart one; mine isn't so sophisticated. I basically adjust the frame count that has passed to where I would expect it to be based on Link's beat. This of course means you could potentially jump forward or back in time and miss or repeat notes, but so far I haven't observed that; all seems to work pretty well.

Uploading to the app store now so hopefully the update will be out in the wild next week :)

Cheers!
David

@cinkster
Collaborator

Hi guys,

Just glancing over your problem (sorry, screaming child behind me!), it looks like this might be your issue:

Audio Host Time On iOS

Apologies if not, but I was just passing by and thought I'd at least try to help.

-Christian

@bangerang

> I basically adjust the frame count that has passed to where I would expect it to be based on Link's beat.

How would one implement this in code? Do you have an example of your audio render callback? It would be deeply appreciated.

@djblake
Collaborator Author

djblake commented Feb 2, 2017

I've just been looking at the code; it's been a while since I wrote it now, so it took me a while to understand it! I can't really just paste you the audio render callback, as it wouldn't make much sense on its own, but I'll paste what I deem to be the relevant bits of code, and hopefully you'll be able to adapt them to your own engine.

In my sound engine, the way I keep track of when the next notes in the sequencer should be triggered is by counting the frames that have passed since the last 16th note, and knowing how many frames should exist between each 16th note. Once the exact number of frames for a 16th note has passed, I fire the next one.
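Roughly, the trigger loop looks like this (a simplified sketch rather than my actual engine code; fireNext16thNote is a hypothetical trigger function):

    // Per render callback: advance the frame counter, firing a 16th note
    // each time enough frames have elapsed.
    for (UInt32 frame = 0; frame < inNumberFrames; ++frame) {
        if (microFrameCount >= samplesPer16th) {
            fireNext16thNote();
            microFrameCount -= samplesPer16th;
        }
        microFrameCount += 1.0;
    }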

So say, for example, we expect there to be 10,000 frames for one 16th note. remoteIOPlayer in this case is the singleton instance of my sound engine:

    remoteIOPlayer.samplesPer16th = 10000;

Get the beatTime at the start of the buffer. In a 4/4 beat you would expect this to be a float value starting from 0.0 and increasing by 1.0 for every four 16th notes that pass, so 1.0 == one 'beat', with 4.0 representing that one bar has passed. At the start of your callback, this gets the position Link expects you to be at within that 'beat':

    const Float64 beatTimeAtBufferStart =
        ABLLinkBeatTimeAtHostTime(linkData->ablLink,
                                  inTimeStamp->mHostTime + outputLatency);

In my sequencer, I keep track of how many frames have passed since the last 16th note (microFrameCount), and once that 16th note is finished I reset microFrameCount back to 0. So if samplesPer16th were 10000 and microFrameCount were 5000, we'd know we were halfway through playing the 16th note.

Make a note of our current microFrameCount. It's unused in this example, but it might be useful later if we want to know how much we adjusted microFrameCount by:

    CGFloat microFrameCountAtStart = remoteIOPlayer.microFrameCount;

Record how many 16ths are in a beat:

    CGFloat notesPerBeat = 4.0;

One 16th note in beatTimeAtBufferStart would equal 0.25 (as 1.0 = one beat); multiply this out so that we're dealing in 16th notes rather than beats (so one 16th = 1.0, one beat = 4.0):

    CGFloat beatTimePerNote = beatTimeAtBufferStart * notesPerBeat;

Take just the decimal part of beatTimePerNote so we have the progress Link expects us to have made through the current 16th note (say we are twenty 16th notes into our song and just over halfway through the current one: 20.5002 - 20.0 = 0.5002):

    CGFloat beatTimeNoteProgress = beatTimePerNote - (NSInteger)beatTimePerNote;

Now convert that fraction to frames: 10000 * 0.5002 = 5002

    CGFloat targetFrameCountStart = remoteIOPlayer.samplesPer16th * beatTimeNoteProgress;

Reset microFrameCount (our running count of frame progress through the current 16th note) to be in line with what Link expects:

    remoteIOPlayer.microFrameCount = targetFrameCountStart;

... so maybe our microFrameCount was 5000; now it might be 5002 or 4998 or something, depending on how far out from Link we were.

You can now compare microFrameCountAtStart to remoteIOPlayer.microFrameCount and make any other updates your sound engine needs to work in line with this new adjustment.
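Putting those steps together in one place (a rough consolidated sketch; the helper signature is just for illustration, with samplesPer16th and microFrameCount passed in rather than read off the singleton):

    static void resyncMicroFrameCount(ABLLinkRef ablLink,
                                      const AudioTimeStamp *inTimeStamp,
                                      UInt64 outputLatency,
                                      CGFloat samplesPer16th,
                                      CGFloat *microFrameCount)
    {
        // Where does Link expect us to be when this buffer reaches the speaker?
        const Float64 beatTimeAtBufferStart = ABLLinkBeatTimeAtHostTime(
            ablLink, inTimeStamp->mHostTime + outputLatency);

        // Convert beats to 16th notes and take the fractional part.
        const CGFloat notesPerBeat = 4.0;
        const CGFloat beatTimePerNote = beatTimeAtBufferStart * notesPerBeat;
        const CGFloat beatTimeNoteProgress =
            beatTimePerNote - (NSInteger)beatTimePerNote;

        // Snap the frame counter to where Link expects us to be within the
        // current 16th note.
        *microFrameCount = samplesPer16th * beatTimeNoteProgress;
    }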

Let me know if you have any questions!

@bangerang

Indeed very relevant info; it makes perfect sense, and I now have a much clearer view of how to proceed. Thank you, David, for this excellent explanation.

@djblake
Collaborator Author

djblake commented Feb 3, 2017

Glad to be of help :)

@fgo-ableton
Contributor

Seems to be solved.

fgo-ableton pushed a commit that referenced this issue Nov 28, 2022
ObjC code cleanup: PART I - ABLLinkNotificationView

In this PR:
- I ran the automatic migrator and let Xcode convert the source code to the latest Objective-C syntax.
- I fixed all the issues caused by the migration, mostly the missing designated-initializer hierarchy.
- I refactored ABLLinkNotificationView and made it a little clearer and more readable.

Merge-requested-by: vst-ableton