
The waveform is offset from the actual audio. #8

Closed
DamonChen117 opened this issue Nov 30, 2018 · 2 comments

Comments

@DamonChen117

In the function

func process(_ samples: UnsafeMutablePointer<Int16>,
             ofLength sampleLength: Int,
             from assetReader: AVAssetReader,
             downsampledTo targetSampleCount: Int) -> [Float]

there is this line:

let samplesPerPixel = max(1, sampleCount(from: assetReader) / targetSampleCount)

samplesPerPixel is an Int, so the fractional part of the division is discarded. That discarded fraction offsets the waveform, and the offset accumulates with every call to process.
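A small self-contained sketch of how the truncation adds up; the numbers here are made up for illustration and are not from the library:

```swift
// Illustrative numbers: 10,000 raw samples rendered into 300 pixels.
let totalSamples = 10_000
let targetSampleCount = 300

// Integer division truncates 10_000 / 300 = 33.33… down to 33,
// so each pixel consumes slightly too few samples…
let sppInt = max(1, totalSamples / targetSampleCount)
let pixelsWithIntDivision = totalSamples / sppInt // 303 pixels, 3 too many

// …while a Float ratio keeps the pixel count aligned.
let sppFloat = max(1, Float(totalSamples) / Float(targetSampleCount))
let pixelsWithFloatDivision = Int(Float(totalSamples) / sppFloat) // 300 pixels

print(pixelsWithIntDivision, pixelsWithFloatDivision)
```

The three surplus pixels are the "offset" described above; the larger the file relative to targetSampleCount, the more the error accumulates across chunks.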

@DamonChen117
Author

My workaround is:

fileprivate func extract(samplesFrom assetReader: AVAssetReader, downsampledTo targetSampleCount: Int) -> [Float] {
        var outputSamples = [Float]()

        var processedSampleLength = 0 // <---
        assetReader.startReading()
        while assetReader.status == .reading {
            let trackOutput = assetReader.outputs.first!

            if let sampleBuffer = trackOutput.copyNextSampleBuffer(),
                let blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer) {
                let blockBufferLength = CMBlockBufferGetDataLength(blockBuffer)
                let sampleLength = CMSampleBufferGetNumSamples(sampleBuffer) * channelCount(from: assetReader)
                var data = Data(capacity: blockBufferLength)
                data.withUnsafeMutableBytes { (blockSamples: UnsafeMutablePointer<Int16>) in
                    CMBlockBufferCopyDataBytes(blockBuffer, atOffset: 0, dataLength: blockBufferLength, destination: blockSamples)
                    CMSampleBufferInvalidate(sampleBuffer)

                    let processedSamples = process(blockSamples,
                                                   ofLength: sampleLength,
                                                   from: assetReader,
                                                   downsampledTo: targetSampleCount)
                    
                    outputSamples += processedSamples

                    // start <---
                    processedSampleLength += sampleLength
                    
                    let samplesPerPixel: Float = max(1, Float(sampleCount(from: assetReader)) / Float(targetSampleCount))
                    let downSampledLength = Int(Float(processedSampleLength) / samplesPerPixel)

                    // Pad or trim so the output stays aligned with the
                    // number of raw samples processed so far.
                    let pad = outputSamples.last ?? -32
                    while outputSamples.count < downSampledLength {
                        outputSamples.append(pad)
                    }

                    while outputSamples.count > downSampledLength {
                        outputSamples.removeLast()
                    }
                    // end <---
                }
            }
        }
        var paddedSamples = [Float](repeating: silenceDbThreshold, count: targetSampleCount)
        paddedSamples.replaceSubrange(0..<min(targetSampleCount, outputSamples.count), with: outputSamples)

        return paddedSamples
    }
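The pad/trim step can be isolated into a small testable helper. `alignSampleCount` and the numbers below are illustrative, not part of the library:

```swift
// Hypothetical helper mirroring the pad/trim step in the workaround above:
// keep the output length in sync with how many raw samples were processed.
func alignSampleCount(_ samples: inout [Float],
                      processedSampleLength: Int,
                      samplesPerPixel: Float,
                      pad: Float = -32) {
    let expected = Int(Float(processedSampleLength) / samplesPerPixel)
    let filler = samples.last ?? pad
    while samples.count < expected { samples.append(filler) }  // catch up
    while samples.count > expected { samples.removeLast() }    // trim overshoot
}

var samples: [Float] = [-10, -12]
alignSampleCount(&samples, processedSampleLength: 100, samplesPerPixel: 25)
// samples now has 100 / 25 = 4 entries, padded with its last value
```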

dmrschmidt pushed a commit that referenced this issue Mar 20, 2019
prevents undersampling; still getting a weird offset, related to #8
dmrschmidt pushed a commit that referenced this issue Mar 20, 2019
makes sure we're not leaving samples unprocessed, as vDSP_desamp only works in strides. We keep the unprocessed samples in the running buffer for the next asset reads.

We still have a small buffer potentially at the end which we'll need to consume to really parse the entire track. Coming soon.

#8, #12
@dmrschmidt
Owner

@DamonChen117 thanks for submitting this. You're totally right, the code is missing a whole bunch of samples, especially in bigger files. Working on a fix currently. Almost there.

I've now introduced a running buffer, which makes sure all samples get processed. While your solution looks like it will "catch up" on the lost samples, it only fills them with padding values. With the running buffer it's ensured that all samples are weighted correctly.

All that's missing now is processing the final samples from the end of the file; once I've added that, it should be good to go.

As a side effect this also fixes #12 already.
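A rough sketch of the running-buffer idea, with a plain averaging loop standing in for vDSP_desamp; the function name and values are illustrative, not the library's actual API:

```swift
// Keep samples that don't fill a whole stride and prepend them to the
// next chunk, so no sample is ever dropped between asset reads.
func downsampleChunk(_ chunk: [Int16],
                     carry: inout [Int16],
                     samplesPerPixel: Int) -> [Float] {
    let buffer = carry + chunk
    let usable = (buffer.count / samplesPerPixel) * samplesPerPixel
    carry = Array(buffer[usable...])  // leftovers wait for the next read
    return stride(from: 0, to: usable, by: samplesPerPixel).map { start in
        let window = buffer[start ..< start + samplesPerPixel]
        // average of absolute amplitudes, standing in for vDSP_desamp
        return window.reduce(Float(0)) { $0 + abs(Float($1)) } / Float(samplesPerPixel)
    }
}

var carry: [Int16] = []
let first = downsampleChunk([4, 4, 2, 2, 6], carry: &carry, samplesPerPixel: 2)
// first == [4.0, 2.0], carry == [6]; the 6 is consumed by the next call
let second = downsampleChunk([6], carry: &carry, samplesPerPixel: 2)
// second == [6.0], carry is now empty
```

Unlike the pad/trim workaround, every sample contributes to exactly one pixel, so each pixel is weighted correctly.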

dmrschmidt added a commit that referenced this issue Mar 21, 2019
There may be unprocessed samples left in the sample buffer after the initial processing-while-reading finishes. This would also result in fewer pixels than required being rendered. We backfill the remaining buffer to a multiple of samplesPerPixel so vDSP_desamp will consume all of them.

finishes #8, #12
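The backfill described in this commit message can be sketched as follows; the helper name is illustrative, not from the library:

```swift
// Pad leftover samples up to a multiple of samplesPerPixel so a final
// strided pass (vDSP_desamp in the real code) consumes every sample.
func backfilled(_ leftovers: [Int16], samplesPerPixel: Int) -> [Int16] {
    let remainder = leftovers.count % samplesPerPixel
    guard !leftovers.isEmpty, remainder != 0 else { return leftovers }
    let padding = [Int16](repeating: leftovers.last!,
                          count: samplesPerPixel - remainder)
    return leftovers + padding
}

let tail = backfilled([7, 7, 3], samplesPerPixel: 2)
// tail == [7, 7, 3, 3]: now exactly two full strides of two samples
```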
dmrschmidt mentioned this issue Mar 21, 2019