
some audio file can not be play. #89

Open
aazhou opened this issue Mar 16, 2014 · 36 comments
@aazhou

aazhou commented Mar 16, 2014

http://voice.fishsaying.com/FlwWDLc6I78cZ_MqrtUI0MqN4QpF?e=1395560221&token=p2DofS0KOhrEU1OXKk2X3fv2MapiWWjbbeAvnNtX:ha7Pm2DWf4WWHGDNuSVmo_plEKc=

Hi guys, this audio file cannot be played; it fails with errorCode STKAudioPlayerErrorStreamParseBytesFailed.

@danube83

I guess it's because the URL doesn't have a file extension such as .mp3.

@aazhou
Author

aazhou commented Mar 17, 2014

Actually, many streaming audio URLs don't have an extension. How does it work in those cases?


@tumtumtum
Owner

This occurs if the file wasn't formatted for streaming (i.e. with the metadata packets supplied at the beginning of the file). These types of files are currently unsupported.
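
For MP4/M4A files, the metadata in question is the moov atom. Below is a minimal sketch (not StreamingKit code; names are illustrative) of checking whether a local file's moov atom precedes mdat, which is roughly what "formatted for streaming" means here:

#import <Foundation/Foundation.h>

// Returns YES if the top-level 'moov' atom appears before 'mdat' (a streamable layout).
// Sketch only: assumes a local, well-formed MP4/M4A file at `path`.
static BOOL MoovPrecedesMdat(NSString *path)
{
    NSFileHandle *fh = [NSFileHandle fileHandleForReadingAtPath:path];
    if (!fh) return NO;

    unsigned long long offset = 0;
    while (YES) {
        [fh seekToFileOffset:offset];
        NSData *header = [fh readDataOfLength:8];          // 4-byte size + 4-byte type
        if (header.length < 8) break;

        const uint8_t *bytes = header.bytes;
        uint64_t atomSize = CFSwapInt32BigToHost(*(const uint32_t *)bytes);
        NSString *type = [[NSString alloc] initWithBytes:bytes + 4 length:4
                                                encoding:NSASCIIStringEncoding];

        if ([type isEqualToString:@"moov"]) { [fh closeFile]; return YES; }
        if ([type isEqualToString:@"mdat"]) { [fh closeFile]; return NO; }

        if (atomSize == 1) {                               // 64-bit extended size follows
            NSData *ext = [fh readDataOfLength:8];
            if (ext.length < 8) break;
            atomSize = CFSwapInt64BigToHost(*(const uint64_t *)ext.bytes);
        }
        if (atomSize < 8) break;                           // malformed atom, bail out
        offset += atomSize;
    }
    [fh closeFile];
    return NO;
}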

@aazhou
Author

aazhou commented Apr 11, 2014

do you plan to support it?

@tumtumtum
Owner

No plans at the moment. Will do one day when I have time or a need... :-)

@aazhou
Author

aazhou commented Jan 8, 2015

@weepy

weepy commented Nov 4, 2016

I have an app that uses Apple's m4a encoding to create files. Most stream fine through StreamingKit, but some do not, and fail with the same error as above. Given that they are all created in the same way, this seems to imply a bug in StreamingKit?

Here is a file that plays fine
https://firebasestorage.googleapis.com/v0/b/noiz-eee96.appspot.com/o/beatmakerboss-iv3ga91n-15mjxs8r_1478244129198.m4a?alt=media&token=25ffc143-8db7-44cb-af4f-c664e9a4567f

Here is a file that throws the UnexpectedError:

https://firebasestorage.googleapis.com/v0/b/noiz-eee96.appspot.com/o/max3dec-iv13fhc5-vea04g5f_1478214209542.m4a?alt=media&token=1aa4f2a8-e5ad-4a53-9291-b943fd7b4b63

Additionally the files stream absolutely fine in the browser.

Please advise? Thanks!

@abdultcb

Found something interesting:

Have a look:

I am trying to use STKAudioPlayer to stream playback of audio files created with TAAE, and for the most part it is working great. But I have found that longer files do not stream properly, giving an error STKAudioPlayerErrorStreamParseBytesFailed, which indicates the file has not been properly prepared for streaming. It seems AVAssetExportSession has a "shouldOptimizeForNetworkUse" property which puts certain file info at the beginning instead of the end of the file, and STKAudioPlayer relies on that data to stream properly. My guess is that the shorter files don't actually end up streaming because they fully load, but the longer ones hit this error.

Is there any way to save audio files in TAAE that meet STKAudioPlayer's streaming file criteria?

Thanks,
Adrian
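
Not TAAE-specific, but here is a minimal sketch of the AVAssetExportSession route Adrian describes (the method name and URLs are placeholders, error handling is minimal): re-export a file with shouldOptimizeForNetworkUse = YES so the metadata ends up at the front of the file.

#import <AVFoundation/AVFoundation.h>

// Sketch: re-export a local m4a so the metadata lands at the start of the file.
- (void)optimizeForStreaming:(NSURL *)inputURL
                       toURL:(NSURL *)outputURL
                  completion:(void (^)(BOOL success))completion
{
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:inputURL options:nil];
    AVAssetExportSession *session =
        [[AVAssetExportSession alloc] initWithAsset:asset
                                         presetName:AVAssetExportPresetAppleM4A];
    session.outputURL = outputURL;                   // must not already exist
    session.outputFileType = AVFileTypeAppleM4A;
    session.shouldOptimizeForNetworkUse = YES;       // writes metadata at the beginning

    [session exportAsynchronouslyWithCompletionHandler:^{
        completion(session.status == AVAssetExportSessionStatusCompleted);
    }];
}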

@doriansgithub

Man, this is a huge issue. All of our work is in jeopardy because we can't have our app fail to play some files just because they are longer.

@patrickjquinn

Even though m4a-encoded streams causing crashes and content errors is a massive problem for me personally, and I have a desperate desire to see it resolved, @tumtumtum is in no way obliged to support his open source project; it comes without any form of warranty or contract (bar the GPL).

If we want these issues resolved we need to band together, dig deep and fix it ourselves.

@doriansgithub

I agree, but since he created it, maybe it would be easier for him. But if not, then it is what it is.

@doriansgithub

Can't we have a method that looks at the size of the file and, if it's a local file, waits for it to load completely before attempting to stream? Streaming a remote file is one thing, but why not be able to play local files without going through the streaming portion of the player?

@patrickjquinn

Have you tried setting the buffer size of StreamingKit to > 6 minutes and seeing if it fills the buffer?

@doriansgithub

doriansgithub commented May 24, 2017 via email

@doriansgithub

    .readBufferSize = 600,
    .bufferSizeInSeconds = 600,
    .secondsRequiredToStartPlaying = 700,
    .flushQueueOnSeek = YES,
    .enableVolumeMixer = YES,
    .equalizerBandFrequencies = {

Doesn't work.
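
For reference, a full initialization would look roughly like the sketch below. The field names come from StreamingKit's STKAudioPlayerOptions struct, the values are only illustrative, and readBufferSize appears to be in bytes rather than seconds. Note that a secondsRequiredToStartPlaying larger than bufferSizeInSeconds may prevent playback from ever starting, since the buffer can never hold that many seconds.

// Sketch only: illustrative values, not a recommended configuration.
STKAudioPlayerOptions options = {0};
options.readBufferSize = 64 * 1024;           // data-source read chunk (assumed to be bytes)
options.bufferSizeInSeconds = 600;            // in-memory PCM buffer length
options.secondsRequiredToStartPlaying = 10;   // keep well below bufferSizeInSeconds
options.flushQueueOnSeek = YES;
options.enableVolumeMixer = YES;

STKAudioPlayer *player = [[STKAudioPlayer alloc] initWithOptions:options];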

@patrickjquinn

patrickjquinn commented May 24, 2017 via email

@doriansgithub

Yeah. No discernible difference in gapless playback, but large files still fail. And if it tries to play another large file right after the first failure, the next short song won't play unless you hit play.

@patrickjquinn

Frustrating, I know, but maybe it's time to step through StreamingKit line by line and see if you can solve the issue yourself?

@doriansgithub

Yes, I'll look at it in a couple of weeks and update. Thanks for all your help, truly admirable!

@iDevelopper

Could you upload a file here that you can't stream?

@bizibizi

@bizibizi

Any update on this issue? I have the same problem with the URL above; it does not play.

@patrickjquinn

patrickjquinn commented Nov 21, 2017 via email

@bizibizi

bizibizi commented Nov 21, 2017

@patrickjquinn Yeah, I installed that, but this link still does not play.

@bizibizi

@doriansgithub any updates?

@doriansgithub

doriansgithub commented Nov 21, 2017 via email

@bizibizi

@doriansgithub Crap. Well, it's an option. But did you try some other libraries? I guess there's something out there with the same features.

@doriansgithub

doriansgithub commented Nov 21, 2017 via email

@iDevelopper

Found another interesting thing:

After exporting my file using AVAssetExportSession, I convert the exported file to PCM format, and StreamingKit can then play this new file whatever its length:

#pragma mark Core Audio convert to pcm

// Generic error handler adapted from the "Core Audio" book.
// If result is nonzero, prints an error message describing it.
static void CheckResult(OSStatus result, const char *operation)
{
    if (result == noErr) return;
    
    char errorString[20];
    // see if it appears to be a 4-char-code
    *(UInt32 *)(errorString + 1) = CFSwapInt32HostToBig(result);
    if (isprint(errorString[1]) && isprint(errorString[2]) && isprint(errorString[3]) && isprint(errorString[4])) {
        errorString[0] = errorString[5] = '\'';
        errorString[6] = '\0';
    } else
        // no, format it as an integer
        sprintf(errorString, "%d", (int)result);
    
    fprintf(stderr, "Error: %s (%s)\n", operation, errorString);
}

- (void)convertToPCM:(NSURL *)url { // The url of the exported file from AVAssetExportSession
    NSLog (@"convertToPCM");
    
    // Open ExtAudioFile
    NSLog (@"Opening %@", url);
    ExtAudioFileRef inputFile;
    CheckResult (ExtAudioFileOpenURL((__bridge CFURLRef)url, &inputFile),
                 "ExtAudioFileOpenURL failed");
    
#ifdef CA_CANONICAL_DEPRECATED
    const int bytesPerSample = sizeof(SInt16);
#elif __IPHONE_OS_VERSION_MIN_REQUIRED >= 80000
    const int bytesPerSample = sizeof(SInt16);
#else
    const int bytesPerSample = sizeof(AudioSampleType);
#endif
    
    AudioStreamBasicDescription canonicalAudioStreamBasicDescription = (AudioStreamBasicDescription)
    {
        .mSampleRate = 44100.00,
        .mFormatID = kAudioFormatLinearPCM,
        .mFormatFlags = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagsNativeEndian | kAudioFormatFlagIsPacked,
        .mFramesPerPacket = 1,
        .mChannelsPerFrame = 2,
        .mBytesPerFrame = bytesPerSample * 2 /*channelsPerFrame*/,
        .mBitsPerChannel = 8 * bytesPerSample,
        .mBytesPerPacket = (bytesPerSample * 2)
    };
    
    CheckResult (ExtAudioFileSetProperty(inputFile, kExtAudioFileProperty_ClientDataFormat,
                                         sizeof (canonicalAudioStreamBasicDescription), &canonicalAudioStreamBasicDescription),
                 "ExtAudioFileSetProperty failed");
    
    // Allocate a big buffer. size can be arbitrary for ExtAudioFile.
    // You have 64 KB to spare, right?
    UInt32 outputBufferSize = 0x10000;
    void *ioBuf = malloc (outputBufferSize);
    UInt32 sizePerPacket = canonicalAudioStreamBasicDescription.mBytesPerPacket;
    UInt32 packetsPerBuffer = outputBufferSize / sizePerPacket;
    
    // Setup output file
    NSString *outputPath = [[_appDelegate applicationDocumentsPath]stringByAppendingPathComponent: EXPORT_PCM_FILE];
    NSURL *outputURL = [NSURL fileURLWithPath:outputPath];
    NSLog (@"creating output file %@", outputURL);
    AudioFileID outputFile;
    CheckResult(AudioFileCreateWithURL((__bridge CFURLRef)outputURL,
                                       kAudioFileCAFType,
                                       &canonicalAudioStreamBasicDescription,
                                       kAudioFileFlags_EraseFile,
                                       &outputFile),
                "AudioFileCreateWithURL failed");
    
    // Start converting
    UInt32 outputFilePacketPosition = 0; //in bytes
    
    while (true) {
        // Wrap the destination buffer in an AudioBufferList
        AudioBufferList convertedData;
        convertedData.mNumberBuffers = 1;
        convertedData.mBuffers[0].mNumberChannels = canonicalAudioStreamBasicDescription.mChannelsPerFrame;
        convertedData.mBuffers[0].mDataByteSize = outputBufferSize;
        convertedData.mBuffers[0].mData = ioBuf;
        
        UInt32 frameCount = packetsPerBuffer;
        
        // Read from the extaudiofile
        CheckResult (ExtAudioFileRead(inputFile,
                                      &frameCount,
                                      &convertedData),
                     "Couldn't read from input file");
        
        if (frameCount == 0) {
            NSLog(@"done reading from file");
            break;
        }
        
        // Write the converted data to the output file
        CheckResult (AudioFileWritePackets(outputFile,
                                           false,
                                           frameCount,
                                           NULL,
                                           outputFilePacketPosition / canonicalAudioStreamBasicDescription.mBytesPerPacket,
                                           &frameCount,
                                           convertedData.mBuffers[0].mData),
                    "Couldn't write packets to file");
        
        NSLog(@"Converted %u bytes", (unsigned int)outputFilePacketPosition);
        
        // Advance the output file write location
        outputFilePacketPosition += (frameCount * canonicalAudioStreamBasicDescription.mBytesPerPacket);
    }
    
    // Clean up
    free(ioBuf);
    ExtAudioFileDispose(inputFile);
    AudioFileClose(outputFile);
    
    NSLog (@"Checking file at %@", outputPath);
    if ([[NSFileManager defaultManager] fileExistsAtPath:outputPath]) {
        NSError *fileManagerError = nil;
        unsigned long long fileSize = [[[NSFileManager defaultManager] attributesOfItemAtPath:outputPath
                                                                                        error:&fileManagerError]
                                       fileSize];
        NSLog(@"File Size: %@", [NSString stringWithFormat: @"%lld bytes", fileSize]);
    } else {
        NSLog (@"No file at %@", outputPath);
    }
}

@doriansgithub

doriansgithub commented Dec 2, 2017 via email

@iDevelopper

iDevelopper commented Dec 2, 2017

I sent you an email too, with the sample project!

@doriansgithub

doriansgithub commented Dec 2, 2017 via email

@bizibizi

bizibizi commented Dec 2, 2017

Does it work? How do you export the file?

@iDevelopper

Yes, it works with local iPod Library files.

To export from the iPod Library, use the AVAssetExportSession class (see the sketch after the sample link below).

Sample:

STKSample-2.zip
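
In case the zip link goes stale, the iPod Library part boils down to something like the following (my own sketch, not the contents of the sample; `song` is a placeholder MPMediaItem):

#import <MediaPlayer/MediaPlayer.h>
#import <AVFoundation/AVFoundation.h>

// Sketch: resolve an MPMediaItem to an AVURLAsset and build the exporter used above.
// Returns nil for DRM-protected or iCloud items (their asset URL is nil).
static AVAssetExportSession *ExporterForMediaItem(MPMediaItem *song)
{
    NSURL *assetURL = [song valueForProperty:MPMediaItemPropertyAssetURL];
    if (assetURL == nil) return nil;

    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:assetURL options:nil];
    return [[AVAssetExportSession alloc] initWithAsset:asset
                                            presetName:AVAssetExportPresetAppleM4A];
}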

@abdultcb

abdultcb commented Dec 3, 2017 via email

@iDevelopper

Another solution, to avoid the use of AVAssetExportSession and directly export a song from the iPod library to a PCM file (using the AVAssetReader and AVAssetWriter classes):

Example:

- (void)exportSongToPCM:(MPMediaItem *)song completion:(nullable void (^)(NSDictionary * _Nullable item))completion {
    // Set up an AVAssetReader to read from the iPod Library
    NSURL *assetURL = [song valueForProperty:MPMediaItemPropertyAssetURL];
    AVURLAsset *songAsset = [AVURLAsset URLAssetWithURL:assetURL options:nil];
    
    NSError *assetError = nil;
    AVAssetReader *assetReader = [AVAssetReader assetReaderWithAsset:songAsset error:&assetError];
    if (assetError) {
        NSLog (@"Error: %@", assetError.localizedDescription);
        completion(nil);
        return;
    }
    
    AVAssetReaderOutput *assetReaderOutput = [AVAssetReaderAudioMixOutput assetReaderAudioMixOutputWithAudioTracks:songAsset.tracks audioSettings: nil];
    if (![assetReader canAddOutput: assetReaderOutput]) {
        NSLog (@"Can't add reader output...");
        completion(nil);
        return;
    }
    [assetReader addOutput: assetReaderOutput];

    NSArray *dirs = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectoryPath = [dirs objectAtIndex:0];
    NSString *exportPath = [documentsDirectoryPath stringByAppendingPathComponent:EXPORT_PCM_FILE]; // .caf
    if ([[NSFileManager defaultManager] fileExistsAtPath:exportPath]) {
        [[NSFileManager defaultManager] removeItemAtPath:exportPath error:nil];
    }
    NSURL *exportURL = [NSURL fileURLWithPath:exportPath];
    
    AVAssetWriter *assetWriter = [AVAssetWriter assetWriterWithURL:exportURL
                                                          fileType:AVFileTypeCoreAudioFormat
                                                             error:&assetError];
    if (assetError) {
        NSLog (@"Error: %@", assetError.localizedDescription);
        completion(nil);
        return;
    }

    AudioChannelLayout channelLayout;
    memset(&channelLayout, 0, sizeof(AudioChannelLayout));
    channelLayout.mChannelLayoutTag = kAudioChannelLayoutTag_Stereo;
    
    NSDictionary *outputSettings = @{
                                     AVFormatIDKey: @(kAudioFormatLinearPCM),
                                     AVSampleRateKey: @44100.0f,
                                     AVNumberOfChannelsKey: @2,
                                     AVChannelLayoutKey: [NSData dataWithBytes:&channelLayout length:sizeof(AudioChannelLayout)],
                                     AVLinearPCMBitDepthKey: @16,
                                     AVLinearPCMIsNonInterleaved: @NO,
                                     AVLinearPCMIsFloatKey: @NO,
                                     AVLinearPCMIsBigEndianKey: @NO
                                     };

    AVAssetWriterInput *assetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
                                                                              outputSettings:outputSettings];
    if ([assetWriter canAddInput:assetWriterInput]) {
        [assetWriter addInput:assetWriterInput];
    } else {
        NSLog (@"Can't add asset writer input...");
        completion(nil);
        return;
    }
    assetWriterInput.expectsMediaDataInRealTime = NO;

    [assetWriter startWriting];
    [assetReader startReading];
    
    AVAssetTrack *soundTrack = [songAsset.tracks objectAtIndex:0];
    CMTime startTime = CMTimeMake (0, soundTrack.naturalTimeScale);
    [assetWriter startSessionAtSourceTime: startTime];

    dispatch_queue_t mediaInputQueue = dispatch_queue_create("mediaInputQueue", NULL);
    
    [assetWriterInput requestMediaDataWhenReadyOnQueue:mediaInputQueue usingBlock: ^{
         
         while (assetWriterInput.readyForMoreMediaData) {
             CMSampleBufferRef nextBuffer = [assetReaderOutput copyNextSampleBuffer];
             if (nextBuffer) {
                 // Append buffer
                 [assetWriterInput appendSampleBuffer: nextBuffer];
                 CMSampleBufferInvalidate(nextBuffer);
                 CFRelease(nextBuffer);
             } else {
                 // Done! Finish the writer, then report the result.
                 [assetWriterInput markAsFinished];
                 [assetWriter finishWritingWithCompletionHandler:^{
                     [assetReader cancelReading];
                     // Log the size only after writing has actually finished
                     NSDictionary *outputFileAttributes = [[NSFileManager defaultManager] attributesOfItemAtPath:exportPath error:nil];
                     NSLog (@"Done. File size is %llu", [outputFileAttributes fileSize]);
                     NSDictionary *item = @{@"Title" : [song valueForProperty:MPMediaItemPropertyTitle], @"Url" : [exportURL absoluteString]};
                     completion(item);
                 }];
                 break;
             }
         }
     }];
}
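
A minimal sketch of the hand-off to StreamingKit once the export finishes (`song` and `audioPlayer` are placeholders for an MPMediaItem and an existing STKAudioPlayer instance):

// Export to PCM, then queue the resulting file URL on STKAudioPlayer.
[self exportSongToPCM:song completion:^(NSDictionary *item) {
    if (item != nil) {
        dispatch_async(dispatch_get_main_queue(), ^{
            [audioPlayer play:item[@"Url"]];   // play: accepts a URL string
        });
    }
}];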

@diegostamigni diegostamigni self-assigned this Mar 20, 2019
@diegostamigni diegostamigni added this to the 0.1.31 milestone Mar 20, 2019

10 participants