GPUImageTransformFilter is adding some black frames at the beginning of the video #652

Open
fjvaldera opened this Issue Nov 19, 2012 · 3 comments

@fjvaldera
Contributor

fjvaldera commented Nov 19, 2012

Hello Brad, first of all, your library is awesome and really useful! Thanks for the good work!

I am having some issues with the GPUImageTransformFilter. I am using the following chain:

GPUImageMovie -> GPUImageCropFilter -> GPUImageTransformFilter -> GPUImageAlphaBlendFilter -> GPUImageMovieWriter

I use the crop filter to make the video square, the transform filter to rotate portrait videos so they play in the right orientation in all players, and the alpha blend filter to apply a mask to the video.
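
For reference, this is roughly how I wire that chain up (videoURL and outputURL come from elsewhere; the crop region, mask image name and the 480x480 output size are placeholder values, not my real ones):

// Source movie.
GPUImageMovie *movie = [[GPUImageMovie alloc] initWithURL:videoURL];

// 1. Crop to a centered square region (normalized coordinates).
GPUImageCropFilter *cropFilter = [[GPUImageCropFilter alloc] initWithCropRegion:CGRectMake(0.0, 0.125, 1.0, 0.75)];

// 2. Rotate portrait clips 90º so they are shown upright in every player.
GPUImageTransformFilter *transformFilter = [[GPUImageTransformFilter alloc] init];
[transformFilter setAffineTransform:CGAffineTransformMakeRotation(M_PI_2)];

// 3. Blend a static mask picture over the video.
GPUImagePicture *maskPicture = [[GPUImagePicture alloc] initWithImage:[UIImage imageNamed:@"mask.png"]];
GPUImageAlphaBlendFilter *blendFilter = [[GPUImageAlphaBlendFilter alloc] init];
blendFilter.mix = 1.0;

// 4. Write the result to disk.
GPUImageMovieWriter *movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:outputURL size:CGSizeMake(480.0, 480.0)];

// Wire the chain: movie -> crop -> transform -> blend -> writer,
// with the mask picture feeding the blend filter's second input.
[movie addTarget:cropFilter];
[cropFilter addTarget:transformFilter];
[transformFilter addTarget:blendFilter];
[maskPicture addTarget:blendFilter];
[blendFilter addTarget:movieWriter];

[maskPicture processImage];
[movieWriter startRecording];
[movie startProcessing];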

The problem is that the GPUImageTransformFilter adds some black frames at the beginning of the video, no matter where in the chain I place it. If I remove the filter from the chain, no black frames appear in the resulting video.

Any ideas what the problem could be?

Thanks in advance.

jvaldera.

@fjvaldera
Contributor

fjvaldera commented Nov 19, 2012

I have done further tests, and using only the GPUImageTransformFilter still causes black frames to appear at the beginning.

I'll paste the code that does the processing so the problem can be tracked down more easily:

- (void)processVideoAtURL:(NSURL *)videoURL
{   
    // Original movie Asset
    AVURLAsset *originalAsset = [[AVURLAsset alloc] initWithURL:videoURL options:nil];
    // Movie reader
    GPUImageMovie *movieFile = [[GPUImageMovie alloc] initWithURL:videoURL];
    [movieFile setPlayAtActualSpeed:NO];

// ------------------ TRANSFORM FILTER --------------------
    // Prepare the transformation filter.

    // Check video orientation.
    UIInterfaceOrientation videoOrientation = [self orientationForVideoAsset:originalAsset];

    CGAffineTransform transform;

    if (UIInterfaceOrientationIsPortrait(videoOrientation))
    {
        // Get the transform to rotate the video 90º clockwise.
        transform = CGAffineTransformMakeRotation(M_PI_2);
    }
    else
    {
        // Don't rotate the video
        transform = CGAffineTransformIdentity;
    }

    GPUImageTransformFilter *transformFilter = [[GPUImageTransformFilter alloc] init];
    [transformFilter setAffineTransform:transform];

    // Add transform filter to the chain.
    [movieFile addTarget:transformFilter];

// ---------------- MOVIE WRITER ----------------------
    // Create the movie writer
    NSString *pathToMovie = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Movie.m4v"];
    unlink([pathToMovie UTF8String]); // If a file already exists, AVAssetWriter won't let you record new frames, so delete the old movie
    NSURL *movieURL = [NSURL fileURLWithPath:pathToMovie];
    GPUImageMovieWriter *movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(kVideoWidth, kVideoHeight)];

    // Add the movie writer to the chain
    [transformFilter addTarget:movieWriter];

    // Start the processing    
    [movieWriter startRecording];
    [movieFile startProcessing];

// Completion handling
    void (^writerCompletion)(void) = ^{
        NSLog(@"Completion block");
        [transformFilter removeTarget:movieWriter];
        [movieWriter finishRecording];

        // Create the composition
        AVMutableComposition *composition = [AVMutableComposition composition];
        // Create the audio and video tracks
        AVMutableCompositionTrack *videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
        AVMutableCompositionTrack *audioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];

        NSString *exportPath = [self generateMP4FilePath];
        NSURL *exportUrl = [NSURL fileURLWithPath:exportPath];
        AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:movieURL options:nil]; //[AVURLAsset assetWithURL:url];
        // Get the video track from the new asset
        NSArray *newAssetVideoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
        AVAssetTrack *newAssetVideoTrack = nil;
        if ([newAssetVideoTracks count] > 0)
        {
            newAssetVideoTrack = [newAssetVideoTracks objectAtIndex:0];
        }

        if (newAssetVideoTrack == nil)
        {
            NSLog(@"Error reading the transformed video track");
            return;
        }

        // Insert the tracks in the composition's tracks
        [videoTrack insertTimeRange:newAssetVideoTrack.timeRange ofTrack:newAssetVideoTrack atTime:CMTimeMake(0, 1) error:nil];
        AVAssetTrack *originalAudioTrack = [[originalAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
        [audioTrack insertTimeRange:originalAudioTrack.timeRange ofTrack:originalAudioTrack atTime:CMTimeMake(0, 1) error:nil];

        AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetMediumQuality];
        exportSession.outputURL = exportUrl;
        CMTime start = kCMTimeZero; // a timescale of 0 would make the CMTime invalid
        CMTimeRange range = CMTimeRangeMake(start, [asset duration]);
        exportSession.timeRange = range;
        exportSession.outputFileType = AVFileTypeMPEG4; // or AVFileTypeQuickTimeMovie
        [exportSession exportAsynchronouslyWithCompletionHandler:^{
            switch ([exportSession status])
            {
                case AVAssetExportSessionStatusCompleted:
                {
                    NSLog(@"Export sucess");
                    dispatch_async(dispatch_get_main_queue(), ^{
                        UISaveVideoAtPathToSavedPhotosAlbum(exportPath, nil, nil, nil);
                        NSLog(@"Movie completed");
                    });
                    break;
                }
                case AVAssetExportSessionStatusFailed:
                    NSLog(@"Export failed: %@", [[exportSession error] localizedDescription]);
                    break;
                case AVAssetExportSessionStatusCancelled:
                    NSLog(@"Export canceled");
                    break;
                default:
                    break;
            }
        }];

        NSLog(@"processing finished");
    };

    [movieWriter setCompletionBlock:writerCompletion];
}
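
For completeness, orientationForVideoAsset: is a helper of mine. This is a simplified sketch of what it does, based on the video track's preferredTransform (the real version handles a few more transform combinations):

- (UIInterfaceOrientation)orientationForVideoAsset:(AVAsset *)asset
{
    NSArray *videoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
    if ([videoTracks count] == 0)
    {
        // No video track: treat it as landscape and apply no rotation.
        return UIInterfaceOrientationLandscapeRight;
    }

    CGAffineTransform t = [[videoTracks objectAtIndex:0] preferredTransform];

    // Portrait recordings carry a 90º rotation in their preferred transform,
    // so the diagonal entries are 0 and the off-diagonal entries are ±1.
    if (t.a == 0.0 && t.d == 0.0 && fabs(t.b) == 1.0 && fabs(t.c) == 1.0)
    {
        return UIInterfaceOrientationPortrait;
    }

    return UIInterfaceOrientationLandscapeRight;
}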
@zktc5418

zktc5418 commented Feb 26, 2013

@jvaldera
exportSession.outputFileType = AVFileTypeMPEG4 is not supported on iOS 5.

@fjvaldera
Contributor

fjvaldera commented Feb 27, 2013

@zktc5418

That is not true. I have been using that output file type successfully on iOS 5 for a long time. You can also check at runtime which container types the export session supports; see the sketch below.
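
A quick way to verify this at runtime (just a sketch, not taken from my project; composition stands in for whatever asset you are exporting):

AVAssetExportSession *session = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetMediumQuality];
if ([[session supportedFileTypes] containsObject:AVFileTypeMPEG4])
{
    NSLog(@"AVFileTypeMPEG4 is supported with this preset");
}
else
{
    // Fall back to the QuickTime container if MPEG-4 is not available.
    session.outputFileType = AVFileTypeQuickTimeMovie;
}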

I have found that only one black frame is added to the video, so in the export session I skip the first frame, which solved my problem. I just hope no more black frames appear in the future :D. Concretely, the workaround is to shift the export time range forward by one frame; see the sketch below.
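
Sketch of the workaround (the 30 fps frame duration is an example; my code derives it from the video track's nominalFrameRate):

// Start the export one frame after zero to drop the leading black frame.
CMTime oneFrame = CMTimeMake(1, 30); // assumes a 30 fps clip
CMTime trimmedDuration = CMTimeSubtract([asset duration], oneFrame);
exportSession.timeRange = CMTimeRangeMake(oneFrame, trimmedDuration);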

@tuo referenced this issue Feb 5, 2015

Closed

Fix movie writer #1913
