
AEContext->frameIndex #7

Closed
32Beat opened this issue Apr 16, 2016 · 9 comments

32Beat commented Apr 16, 2016

I would like to see a UInt64 frameIndex added to the render context, which would basically be the equivalent of timestamp->mSampleTime minus the pitfalls of AudioTimeStamp and its floating-point issues.

lijon commented Apr 16, 2016

What's the problem with timestamp->mSampleTime?

32Beat commented Apr 16, 2016

That it is not always valid and then needs to be computed, and that it is a double representing time rather than the true integer index equal to the running sum of frameCount.

I believe it is much safer and easier to use an integer index, so that modules can quickly check for discontinuities, allowing them to reset certain internals. (For example, I need to clear the ring buffer after the sub-loops are stopped and restarted.)

lijon commented Apr 16, 2016

How do you mean it's not always valid? A remoteIO unit render callback is always passed a valid mSampleTime as far as I know. Unless a badly behaving IAA host does not, when rendering its nodes.

I haven't seen any precision issues with the double. For example, this code detects buffer drops for me:

    Float64 lostFrames = inTimeStamp->mSampleTime-nextSampleTime;
    if(lostFrames > 1) {
        printf("*** glitch detected, lost %g frames (%g buffers)\n",lostFrames,lostFrames/inNumberFrames);
    }
    nextSampleTime = inTimeStamp->mSampleTime + inNumberFrames;

32Beat commented Apr 16, 2016

Unless the documentation states that kAudioTimeStampSampleTimeValid is always set, we are making assumptions, and you know the saying about "assumption"?
Then there remains the question of whether mSampleTime can carry fractional values, so when do we lose a sample? lostFrames > 1.0? lostFrames >= 1.5? And what counts as a discontinuity? lostFrames < 0.5? Etc.

I recall running into this with AUFilePlayer, perhaps in combination with the Varispeed unit to play HR files at 44.1kHz: it will return a time, but not a sample time. Either way: badly documented structs need to be avoided like the plague, especially when the fix is as simple as adding a dumb integer. (Besides the fact that they should use an integer when they mean an integer value, but that is likely a remnant of pre-64-bit days, I don't know.)

32Beat commented Apr 16, 2016

I was also thinking earlier that an AERange parameter would be useful in the context, so we could write some of those range manipulations that make life easier for sample-accurate start and stop times, not only for loops and files, but also for other parameters or notes. If we're going to do that, my preference would be to make it an integer range:

UInt64 index
UInt64 count

(I also have a preference for equal-width fields in some circumstances, because it makes reading so much clearer...)

michaeltyson (Collaborator) commented
I think this is probably going to add complications, with what I see as little payoff. How about this: in AERendererRun, check the timestamp's kAudioTimeStampSampleTimeValid flag; if it's 0, provide our own.

32Beat commented Apr 23, 2016

I would certainly opt for your alternative, since that way the check (and possible computation) is done once, and validity is guaranteed. Having said that: equality checks on floats are more of a complication than adding integers where they are appropriate. (A point reasonably well illustrated by your use of the FLT_EPSILON value, etc.)

What complications are you thinking about otherwise? I currently have this:

void AERendererRun(__unsafe_unretained AERenderer * THIS, AudioBufferList * bufferList, UInt32 frameCount,
                   const AudioTimeStamp * timestamp) {

    // Reset the buffer stack, and set the frame count
    AEBufferStackReset(THIS->_stack);
    AEBufferStackSetFrameCount(THIS->_stack, frameCount);

    // Clear the output buffer
    AEDSPClearBufferList(bufferList, frameCount);

    // Run the block
    __unsafe_unretained AERenderLoopBlock block =
    (__bridge AERenderLoopBlock)AEManagedValueGetValue(THIS->_blockValue);
    if ( block ) {
        THIS->mContext.sampleRate = THIS->_sampleRate;
        THIS->mContext.frameCount = frameCount;
        THIS->mContext.frames = frameCount;
        THIS->mContext.timestamp = timestamp;
        THIS->mContext.output = bufferList;
        block(&THIS->mContext);
    }

    THIS->mContext.sliceIndex += 1;
    THIS->mContext.frameIndex += frameCount;
}

(frames is superfluous, but relieves me of having to change it throughout the entire codebase for now.)

Both sliceIndex and frameIndex are useful integers to have during processing.

I wonder whether the following alternative for the main renderloop might help the discussions regarding TAAE2 so far:

void AERendererRun(__unsafe_unretained AERenderer * THIS, AudioBufferList * bufferList, UInt32 frameCount,
                   const AudioTimeStamp * timestamp) {

    // Reset the buffer stack, and set the frame count
    AEBufferStackReset(THIS->_stack, frameCount);

    // Run the block
    __unsafe_unretained AERenderLoopBlock block =
    (__bridge AERenderLoopBlock)AEManagedValueGetValue(THIS->_blockValue);
    if ( block ) {
        THIS->mContext.frameCount = frameCount;
        THIS->mContext.timestamp = timestamp;
        block(&THIS->mContext);
    }

    // Copy top-of-stack to output buffer
    AEBufferStackCopyToBuffer(THIS->_stack, bufferList, frameCount);

    THIS->mContext.sliceIndex += 1;
    THIS->mContext.frameIndex += frameCount;
}

32Beat commented Apr 23, 2016

Note also that this indicates a problem:

THIS->mContext.sampleRate = THIS->_sampleRate;

Apparently there is a synchronisation issue, because the hierarchical structure of who controls what isn't well defined. There might be a central controller object in which the sample rate is set, which then determines which objects need to be adjusted accordingly (or notified of the change). It might also be that the hierarchical relation between a renderer and a context isn't well defined, so this assignment is being used to patch a deeper problem that exists elsewhere.

michaeltyson (Collaborator) commented
What complications are you thinking about otherwise?

I'm not a big fan of having more than one way to do one thing, and AudioTimeStamp is a Core Audio standard that already exists. Having two ways to monitor the time rankles a bit with me. I agree that the Float64 format for mSampleTime is kinda annoying, but I think I'd rather that than have two separate places to access the sample time. I could be convinced, though... just not there yet =)

Note also that this indicates a problem:
THIS->mContext.sampleRate = THIS->_sampleRate;

Haha - I do concede that it totally violates what I just said about my preference for having only one way to do something. Providing the sample rate from the context and the renderer is an obvious duplication. Whoops =)

Redirecting back to our original thread about structure, with some more comments: http://forum.theamazingaudioengine.com/discussion/comment/2578/#Comment_2578
