ofQTKitPlayer and ofQTKitGrabber to compile for 10.7 10.8 SDK #1478

Merged
merged 21 commits on Aug 30, 2012

@obviousjim
Collaborator

This updates openFrameworks to compile for OS X 10.7 and 10.8 SDKs by bringing the ofxQTKit addons into the core.

For video playback, the default is now ofQTKitPlayer instead of ofQuicktimePlayer in all cases.

For video capture, the new ofQTKitGrabber is used on 10.7 and 10.8, while the legacy ofQuicktimeGrabber is used on 10.6, allowing access to the video settings dialog.

The video settings dialog is not available when compiling against 10.7 or 10.8.

Additionally, there is an example, "osxHighPerformanceVideoPlayerExample", that shows how to take advantage of some of the QTKit features (GPU decoding, fast scrubbing, and alpha channel).

@obviousjim obviousjim was assigned Aug 10, 2012
@joshuajnoble
Collaborator

Win7 seems ok so far. Awesome work!

@ofTheo
Collaborator
ofTheo commented Aug 10, 2012

Great stuff James!

Just tested QTKit capture on 10.6 (modifying the check in ofConstants.h).
Capture works well. I like the pop-up warning about the settings window.

I feel like I should still be able to access macam devices with the QTKit grabber though? (PS3 Eye doesn't show up in the device list.)

Is that something we can change to allow access to (if it's available) without breaking things for 10.8 users?

From: http://macdaddyworld.com/2007/07/13/quicktime-72-brings-leopard-functionality/
"* QTKit Capture still works with existing QuickTime component vdig modules (macam still works). In this case, it is actually running the Sequence Grabber below CoreMedia. For the built-in iSight however, the Sequence Grabber is not being used...."

It would also be good if 10.7 users could choose to use the QT7 grabber if they have the 10.6 SDK installed.

@ofTheo
Collaborator
ofTheo commented Aug 10, 2012

whoops ignore the part about macam with QTKit - I just had to replug in the PS3 Eye and it showed up :)

@obviousjim
Collaborator

Cool, glad the PS3 Eye showed up. As for 10.7 users, I imagined just choosing the SDK is what makes the switch, so 10.7 and 10.8 users can get the QT7 grabber if they set their OF project to compile against the 10.6 SDK.

Should this be more explicit?

@damian0815 damian0815 commented on the diff Aug 11, 2012
examples/video/videoGrabberExample/src/testApp.cpp
@@ -49,6 +49,10 @@ void testApp::keyPressed (int key){
// use alt-tab to navigate to the settings
// window. we are working on a fix for this...
+ // Video settings no longer works in 10.7
@damian0815
damian0815 Aug 11, 2012

perhaps this should be a log message?

@obviousjim
obviousjim Aug 11, 2012 collaborator

When you call video settings it actually pops up a dialog box explaining the situation, so I think it will be clear.

@damian0815 damian0815 commented on the diff Aug 11, 2012
libs/openFrameworks/utils/ofConstants.h
@@ -204,8 +204,12 @@ enum ofTargetPlatform{
#define OF_VIDEO_CAPTURE_GSTREAMER
#elif defined(TARGET_OSX)
-
- #define OF_VIDEO_CAPTURE_QUICKTIME
+ //on 10.6 and below we can use the old grabber
+ #ifndef MAC_OS_X_VERSION_10_7
@damian0815
damian0815 Aug 11, 2012

is this futureproof? if MAC_OS_X_VERSION_10_8 is defined, then is MAC_OS_X_VERSION_10_7 defined also?

@obviousjim
obviousjim Aug 11, 2012 collaborator

Yes, it's my understanding that all previous version macros are defined as well as the current one.
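For reference, the SDK-gating check under discussion can be sketched like this. The numeric macro definitions below are illustrative stand-ins for what Apple's AvailabilityMacros.h provides through the chosen SDK (here simulating a compile against the 10.7 SDK); they are not the real headers:

```cpp
#include <cassert>

// Stand-ins for Apple's AvailabilityMacros.h: each SDK defines its own
// version macro *and* all earlier ones, which is why testing for the
// absence of MAC_OS_X_VERSION_10_7 selects the legacy path on 10.6.
#define MAC_OS_X_VERSION_10_6 1060
#define MAC_OS_X_VERSION_10_7 1070  // pretend we compile against the 10.7 SDK

#ifndef MAC_OS_X_VERSION_10_7
    #define OF_VIDEO_CAPTURE_QUICKTIME  // 10.6 SDK: legacy QuickTime grabber
#else
    #define OF_VIDEO_CAPTURE_QTKIT      // 10.7+ SDK: new QTKit grabber
#endif

// Reports which capture backend the preprocessor selected.
bool selectedQTKit() {
#ifdef OF_VIDEO_CAPTURE_QTKIT
    return true;
#else
    return false;
#endif
}
```

With the 10.6 SDK (where MAC_OS_X_VERSION_10_7 is absent), the same check would pick OF_VIDEO_CAPTURE_QUICKTIME instead.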

@damian0815 damian0815 and 1 other commented on an outdated diff Aug 11, 2012
libs/openFrameworks/video/QTKitMovieRenderer.m
+@implementation QTKitMovieRenderer
+@synthesize movieSize;
+@synthesize useTexture;
+@synthesize usePixels;
+@synthesize allowAlpha;
+@synthesize frameCount;
+@synthesize justSetFrame;
+@synthesize synchronousScrub;
+
+- (NSDictionary*) pixelBufferAttributes
+{
+ return [NSDictionary dictionaryWithObjectsAndKeys:
+ //if we have a texture, make the pixel buffer OpenGL compatible
+ [NSNumber numberWithBool:self.useTexture], (NSString*)kCVPixelBufferOpenGLCompatibilityKey,
+ //in general this shouldn't be forced. but in order to ensure we get good pixels use this one
+ [NSNumber numberWithInt: kCVPixelFormatType_32ARGB], (NSString*)kCVPixelBufferPixelFormatTypeKey,
@damian0815
damian0815 Aug 11, 2012

(being picky) I'm assuming you've tested the performance of other formats? In my experience with iOS and kCVPixelFormatType_*, BGRA was significantly faster than all other options, but I am officially ignorant re: OSX.

@obviousjim
obviousjim Aug 11, 2012 collaborator

I can try BGRA again -- but in my testing I remember clearly having to go with ARGB.

I think YUV is actually better, but the manual conversion process is too slow since we aren't uploading directly to a texture but need pixel-level access. This was the best simplicity/efficiency trade-off.

@damian0815 damian0815 commented on an outdated diff Aug 11, 2012
libs/openFrameworks/video/ofQTKitGrabber.mm
+ while ((videoConnection = [videoConnectionEnumerator nextObject])) {
+ if(verbose) ofLogVerbose("Video Input Format: %s\n", [[[videoConnection formatDescription] localizedFormatSummary] cStringUsingEncoding:NSUTF8StringEncoding]);
+ }
+
+ NSEnumerator *audioConnectionEnumerator = [[audioDeviceInput connections] objectEnumerator];
+ QTCaptureConnection *audioConnection;
+ while ((audioConnection = [audioConnectionEnumerator nextObject])) {
+ if(verbose) ofLogVerbose("Audio Input Format: %s\n", [[[audioConnection formatDescription] localizedFormatSummary] cStringUsingEncoding:NSUTF8StringEncoding]);
+ }
+
+ [self startSession];
+ }
+ return self;
+}
+
+- (id) initWithoutPreview:(NSInteger)_videoDeviceID audiodevice:(NSInteger)_audioDeviceID usingAudio:(BOOL)_useAudio
@damian0815
damian0815 Aug 11, 2012

can these init functions be folded together somehow? seems like a lot of duplicated code...

@damian0815 damian0815 and 1 other commented on an outdated diff Aug 11, 2012
libs/openFrameworks/video/ofQTKitGrabber.mm
+
+- (id) initWithWidth:(NSInteger)width
+ height:(NSInteger)height
+ videodevice:(NSInteger)videoDeviceID
+ audiodevice:(NSInteger)audioDeviceID
+ usingAudio:(BOOL)_useAudio;
+
+- (id) initWithoutPreview:(NSInteger)_videoDeviceID
+ audiodevice:(NSInteger)_audioDeviceID
+ usingAudio:(BOOL)_useAudio;
+
+- (void) outputVideoFrame:(CVImageBufferRef)videoFrame
+ withSampleBuffer:(QTSampleBuffer *)sampleBuffer
+ fromConnection:(QTCaptureConnection *)connection;
+
+- (bool) setSelectedVideoDevice:(QTCaptureDevice *)selectedVideoDevice;
@damian0815
damian0815 Aug 11, 2012

the API around choosing devices is unclear. what does it mean for a device to be 'selected'? what is the difference between setVideoDeviceID and setSelectedVideoDevice? where do the IDs for setVideoDeviceID come from? this could really do with inline comments in the header.

@obviousjim
obviousjim Aug 12, 2012 collaborator

setVideoDeviceID is the user facing API while setSelectedVideoDevice is really an internal method, so perhaps i should bring it into the .m file

@damian0815
damian0815 Aug 12, 2012

right... spot the guy who doesn't work with video much (me) :-)

@damian0815 damian0815 and 1 other commented on an outdated diff Aug 11, 2012
libs/openFrameworks/video/ofQTKitGrabber.h
+ vector <string> & listAudioCodecs();
+ void setVideoCodec(string videoCodecIDString);
+ void setAudioCodec(string audioCodecIDString);
+ void setUseAudio(bool bUseAudio);
+ void startRecording(string filePath);
+ void stopRecording();
+ bool isRecording();
+ bool isReady();
+
+ void listDevices(); // would be better if this returned a vector of devices too, but requires updating base class
+ vector <string> & listAudioDevices();
+ vector <string> & listVideoDevices();
+
+ void close();
+
+ void setDeviceID(int videoDeviceID);
@damian0815
damian0815 Aug 11, 2012

where do device id's come from? i am assuming listDevices but this could perhaps do with a comment here.

@obviousjim
obviousjim Aug 12, 2012 collaborator

This is exactly the same way that it worked in the old video grabber - but sure a comment never hurts

@damian0815

mostly looks awesome, just a few ignorant/obsessive-compulsive questions inline. i'd like to see a bunch more comments in the header files also, but that extends back to the ofBaseVideoPlayer/ofBaseVideoGrabber base classes anyway.

@obviousjim
Collaborator

Ok i'll go through and try to clean this up a bit. Thanks for the feedback Damian

@benben
Member
benben commented Aug 12, 2012

I can confirm that it compiles and runs fine on OS X 10.7.4, Xcode 4.3.3, with both the 10.6 and 10.7 SDKs.

@bangnoise bangnoise commented on an outdated diff Aug 13, 2012
libs/openFrameworks/video/QTKitMovieRenderer.m
+}
+
+//writes out the pixels in RGB or RGBA format to outbuf
+- (void) pixels:(unsigned char*) outbuf
+{
+ if(!self.usePixels || _latestPixelFrame == NULL){
+ return;
+ }
+
+ CVPixelBufferLockBaseAddress(_latestPixelFrame, 0);
+ unsigned char* pix = CVPixelBufferGetBaseAddress(_latestPixelFrame);
+
+
+ //NOTE:
+ //CoreVideo works on ARGB, and openFrameworks is RGBA so we need to swizzle the buffer
+ //before we return it to an openFrameworks app.
@bangnoise
bangnoise Aug 13, 2012

consider vImageConvert_ARGB8888toRGB888() and vImagePermuteChannels_ARGB8888() - much faster than doing this manually
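For readers unfamiliar with the vImage calls named above, here is a plain-loop sketch of the channel reorder that vImageConvert_ARGB8888toRGB888() performs. The real Accelerate call is vectorized and much faster; this illustrative helper is not part of the PR:

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>
#include <vector>

// Converts packed ARGB8888 pixels to RGB888 by dropping the alpha byte
// and shifting the color channels down one slot per pixel.
std::vector<uint8_t> argbToRGB888(const uint8_t* argb, size_t pixelCount) {
    std::vector<uint8_t> rgb(pixelCount * 3);
    for (size_t i = 0; i < pixelCount; i++) {
        rgb[i * 3 + 0] = argb[i * 4 + 1]; // R
        rgb[i * 3 + 1] = argb[i * 4 + 2]; // G
        rgb[i * 3 + 2] = argb[i * 4 + 3]; // B
        // argb[i * 4 + 0] is the alpha byte, dropped here
    }
    return rgb;
}
```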

@ofTheo
Collaborator
ofTheo commented Aug 13, 2012

thanks for the tip @bangnoise !!

@bangnoise

On 2nd and 3rd thoughts:

  • if the aim is to provide buffers in a specific channel-order, it is likely faster to request that format directly from QTKit in the visual context
  • if you can do that, it would greatly improve performance if you avoid the copy entirely and wrap the original correctly-ordered CVPixelBuffer in a custom ofPixels subclass
@obviousjim obviousjim A number of video player fixes
-- Fixed crash and removed warning for non-square pixel videos. Still need a better solution
-- updated setLoopState to use enum
-- updated setVolume to new 0072 float standard
-- fixed a crash caused by loading audio files into the video player
abae31f
@obviousjim
Collaborator

I added a few commits based on bugs I was finding while converting my projects over to this player.

I still have to go through and integrate @bangnoise's suggestions. That'll finish this off, I think. But the more testers we have in the meantime, the fewer headaches we'll have when this gets into develop.

re @damiannz's comments, I am going to make an OS X-specific video-capture example that shows how to take advantage of the recording API, which I think was uncommented and causing some confusion.

stay tuned, thanks for the input everyone.

@obviousjim
Collaborator

Ok I added two more commits to best address the issues @bangnoise brought up

  • For RGB video playback we request the 24RGB pixel format and can do a straight memcpy into the ofPixels.
  • For RGBA video playback we use the vImage call to swap the pixels around from ARGB (which is the only thing that CoreVideo supports).
  • For video capture I switched from ARGB to just RGB and can do a straight memcpy into the ofPixels, though I'm testing only on my iSight. @jvcleave would you be able to test this on a Blackmagic card at a high resolution (1920x1080) to make sure RGB is still performant?

The reason I can't see a way to do a custom ofPixels subclass as @bangnoise suggests is because many oF apps do videoGrabber.getPixels(), which returns an unsigned char*. CoreVideo requires calls to lock and unlock base addresses to surround pixel access, and I'm not sure how to reconcile this with the API.
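One hedged sketch of how the lock/unlock requirement could be scoped to pixel access is an RAII guard. PixelBuffer and its methods below are illustrative stand-ins for a CVPixelBufferRef and CVPixelBufferLockBaseAddress / CVPixelBufferUnlockBaseAddress, not real CoreVideo calls, and this is not what the PR implements:

```cpp
#include <cassert>

// Stand-in for a CVPixelBufferRef; the counter lets us verify balance.
struct PixelBuffer {
    int lockCount = 0;
    void lockBaseAddress()   { lockCount++; }
    void unlockBaseAddress() { lockCount--; }
};

// RAII guard: locks the buffer's base address for the lifetime of the
// guard, guaranteeing the matching unlock even on early return.
class ScopedPixelLock {
public:
    explicit ScopedPixelLock(PixelBuffer& buf) : buf_(buf) { buf_.lockBaseAddress(); }
    ~ScopedPixelLock() { buf_.unlockBaseAddress(); }
private:
    PixelBuffer& buf_;
};
```

The catch remains that a raw unsigned char* returned from getPixels() escapes any such scope, which is exactly the reconciliation problem described above.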

But these changes should speed things up, and they definitely simplify the code.

The last thing to change is cleaning up the video grabber code per Damian's suggestions.

@bangnoise

You'll have to check the CV buffer's bytes per row and copy line by line when there's a mismatch with the destination

Collaborator

@bangnoise good call - can that be detected by checking CVPixelBufferGetBytesPerRow(cvFrame) != pixels.getWidth()*3 ?

if (CVPixelBufferGetBytesPerRow(cvFrame) == pixels.getWidth() * 3)
{
    memcpy(pixels.getPixels(), CVPixelBufferGetBaseAddress(cvFrame), pixels.getHeight() * pixels.getWidth() * 3);
}
else
{
    unsigned char *dst = pixels.getPixels();
    size_t dstBytesPerRow = pixels.getWidth() * 3;
    unsigned char *src = (unsigned char *)CVPixelBufferGetBaseAddress(cvFrame);
    size_t srcBytesPerRow = CVPixelBufferGetBytesPerRow(cvFrame);
    size_t copyBytesPerRow = MIN(dstBytesPerRow, srcBytesPerRow); // should always be dstBytesPerRow but be safe
    for (int y = 0; y < pixels.getHeight(); y++)
    {
        memcpy(dst, src, copyBytesPerRow);
        dst += dstBytesPerRow;
        src += srcBytesPerRow;
    }
}
Written in the browser, forgive certain mistakes

Collaborator

cool makes sense.

@bangnoise

you'll need to copy line by line when there's a mismatch between CV and destination bytes-per-row

@bangnoise

logic is bad here - if self.usePixels and self.useTexture are both true you leak and lose the visualContext you just created above

gah ignore me, didn't see the else, sorry!

Collaborator

word - good catch.

Collaborator

Wait - no, I don't think so:

if(usePixels){
    // create QTPixelBufferContextCreate with _visualContext
    if(useTexture){
        // CVOpenGLTextureCacheCreate
    }
} else if(useTexture){ // only using texture
    // create QTOpenGLTextureContextCreate with _visualContext
}

the visual context is in the else

Collaborator

oops ^_^ ignored

@jvcleave
Member

Re: Blackmagic, I only have the BM H.264 Pro recorder which I am having to write an addon for but I do have a Logitech C910 (http://www.amazon.com/Logitech-HD-Pro-Webcam-C910/dp/B003M2YT96) that I can try out Monday

@obviousjim
Collaborator

OK I've cleaned up the Grabber/Recorder based on Damian's comments, and also added an example that shows how to do video recording.

please take a look and let's get this merged soon!

@obviousjim

This is the only part I'm unsure of -- the call to stopRecording() is asynchronous, so we need to wait for this callback before the movie is ready to be loaded into the video player. We could make a custom ofEvent for it, but I tried this for starters

Collaborator

So the message only gets sent when the file is finished? Or is it happening immediately at the moment?

Collaborator

The message gets sent when the video is finished writing to disk and can be loaded by a video player

ie you can't do this:
videoRecorder.stopRecording();
videoPlayer.loadMovie("justRecordedVideo.mov");

see this
7cfb0e4#L2R170

Thinking about it more, I don't think this is the right fix; it needs to be a custom event, in case the app is sending other messages around.

Collaborator

ahh got it.
yeah was also thinking a custom event makes sense.
but nice that you have it working async already.

Collaborator

Should it look like this:

ofEvent<string> videoSaved;

ofAddListener(videoRecorder.videoSaved, this, &testApp::videoSaved);

and the string == the filename.

do we need a way to distinguish which recorder it came from?

we don't really have a choice to do it asynchronously i think, it's how QTKit works...

Collaborator

i mean, other than asynchronously

Collaborator

ofEvent is a template so you could have a custom arg type.
which could store the player ptr, the path etc.

ie: ofEvent videoSavedEvent;

@artutoc any thoughts?

i'd suggest something like this:

typedef struct { bool success; string url; } VideoSavedEventArgs;
ofEvent<VideoSavedEventArgs> savedEvent;

can never have too many args.

i mean, one can never have too many args :-)

Collaborator

typedef struct { bool success; string url; void* sender; } VideoSavedEventArgs;
ofEvent<VideoSavedEventArgs> savedEvent;

add the sender?

Collaborator

sorry - my ie: was meant to actually show the custom type.
I think it was stripped out by GH markdown :)

but yeah something like what @damiannz was suggesting is what I was going for.
not sure if you would also want a pointer to the videoGrabber object as well?

re: sender, this is IMO a weak point in the oF events API. @arturoc can explain it better, but there is also a method that passes the sender on to the listener methods; I think the signature of the listener method has to change to accommodate it, i.e. it's not automatic.
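Pulling the thread's suggestions together, the event being sketched might look like the following. std::function is used here as a stand-in for ofEvent/ofAddListener so the sketch is self-contained; all names and fields are illustrative, not the final OF API:

```cpp
#include <cassert>
#include <functional>
#include <string>
#include <vector>

// Argument bundle carried by the "video saved" event, per the discussion:
// success flag, saved-file path, and a sender pointer.
struct VideoSavedEventArgs {
    bool success;
    std::string path;
    void* sender; // the recorder that fired the event
};

// Minimal multicast event: listeners register callbacks, notify() fires them.
struct VideoSavedEvent {
    std::vector<std::function<void(VideoSavedEventArgs&)>> listeners;
    void add(std::function<void(VideoSavedEventArgs&)> fn) {
        listeners.push_back(std::move(fn));
    }
    void notify(VideoSavedEventArgs& args) {
        for (auto& fn : listeners) fn(args);
    }
};
```

A listener would then load the movie only once the event reports success, instead of calling loadMovie immediately after stopRecording().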

@ofTheo ofTheo commented on the diff Aug 21, 2012
libs/openFrameworks/video/ofQtUtils.cpp
@@ -1,6 +1,6 @@
#include "ofQtUtils.h"
-#if defined (TARGET_WIN32) || defined (TARGET_OSX)
+#if defined (TARGET_WIN32) || (defined TARGET_OSX && !defined(MAC_OS_X_VERSION_10_7))
@ofTheo
ofTheo Aug 21, 2012 collaborator

doesn't this make it hard for someone on 10.7 to use the legacy QT7 player/grabber if they have 10.6 SDK installed?
maybe it would be better to have our own define - OF_ENABLE_LEGACY_QUICKTIME - or something like that?

@obviousjim
obviousjim Aug 21, 2012 collaborator

If they set the Xcode SDK to 10.6 they get the legacy player when they compile; it was designed as a transparent way to switch to the new version going forward.

But I'm happy to do it with OF_ENABLE_LEGACY_QUICKTIME - except that if you are compiling against 10.7 or 10.8, that flag won't work.

@ofTheo
ofTheo Aug 21, 2012 collaborator

ah okay - I didn't realize that MAC_OS_X_VERSION_10_7 isn't defined on 10.7 when running the 10.6 SDK.
nevermind then - what you propose is fine!

@damian0815

how about ofLogError("ofQTKitGrabber") << "Failed to add delegate to capture session: error was " << [[error description] UTF8String];

@damian0815

and again, ofLogError("ofQTKitGrabber") << " Selected audio device ID (" << _audioDeviceID << ") out of range (" << audioDevices.count << ")" sometimes it's out of range because there are 0 devices, sometimes it's out of range because _audioDeviceID was -5138769123 :-)

Collaborator

wait you saw this number show up?

no, i just mean if i'm passing in a variable from outside that i've forgotten to initialize so its value is whatever was in the memory before.

Collaborator

ok well the check is more robust now

@damian0815

again, << [[error description] UTF8String];

@damian0815

if [self stopRecording] has just been called, do we need to wait until the finished event has fired before calling this?

Collaborator

no - because it will either be a) called with a different file name, so the other one can safely finish writing to disk in the background
or
b) the same file name, in which case the videoSaved event will report the proper error (just tested this)

@damian0815

looks great, all my questions are cosmetic except the last one about [self stopRecording] asynchronicity.

@bilderbuchi
Member

guys I just have to say I love this PR discussion! ❤️ It really shows "the way it should be". :-)
btw @ofTheo, you mistyped @arturoc in that comment above. Arturo, when you see this, go here.

@obviousjim
Collaborator

@damiannz have you run the examples and tested the functionality as well?

I think what is important with this is testing on a lot of different hardware and cameras. So please, whoever is following this thread: pull, run, and stress test. I'm doing as much as I can, but I have limited hardware.

@obviousjim obviousjim Added event callback
fixed and added error logging
added more comments to example
a362e26
@obviousjim
Collaborator

phew ok, next one committed. should have addressed all issues from the last comment.

@damiannz @ofTheo please take a look.

@damian0815

sorry, should've caught this before. you need a - (void) dealloc{} on the class that explicitly releases all these retained properties, or you'll leak memory.

Collaborator

oh dear me - yes

@bakercp
Member
bakercp commented Aug 21, 2012

Hi all, just wondering -- perhaps I missed something in my read of the above discussion, but from the original PR post, it seems the goal is for this to be a drop-in replacement for ofVideoPlayer. If so, there are several settings and behaviors that are quite different -- particularly surrounding the synchronous-scrubbing behavior. My use scenario is one where I load video frame pixels into a buffer in a separate thread, then upload them to textures as they become available from the main loop. In the external loop -- something like this:

// this all happens in a thread
        // Load all of the frames in a non-texture
        for(int i = 0; i < fingerMovie.getTotalNumFrames(); i++) {
            fingerMovie.setFrame(i);
            img->setUseTexture(false);
            img->setFromPixels(fingerMovie.getPixelsRef()); 
            imgs.push_back(img);
        }
    }

Then this happens back in the main thread

        for(int i = 0; i < imgs.size(); i++) {
            imgs[i]->setUseTexture(true);
            imgs[i]->update();
        }

Then I use it as needed.

When using the default ofVideoPlayer (not the ofQTKitPlayer type), I don't have access to setSynchronousScrubbing() which I need to execute something like this:

// this all happens in a thread
        // Load all of the frames in a non-texture
        for(int i = 0; i < fingerMovie.getTotalNumFrames(); i++) {
            fingerMovie.setFrame(i);
            fingerMovie.update(); // this is required to fill the pixels needed for getPixelsRef() below
            img->setUseTexture(false);
            img->setFromPixels(fingerMovie.getPixelsRef()); 
            imgs.push_back(img);
        }
    }

A small solution to this problem -- with the goal of making the default settings match other platform ofVideoPlayers might be to set the default member synchronousScrub in QTKitMovieRenderer to YES (in the loadMovie method).

Then, perhaps do something like :

void ofQTKitPlayer::setFrame(int frame) {
    if(moviePlayer == NULL) return;
    NSAutoreleasePool* pool = [[NSAutoreleasePool alloc] init];
    moviePlayer.frame = frame % moviePlayer.frameCount;
    [pool release];
    if(getSynchronousScrubbing()) {
        update();
    }
}

To get that existing behavior back.

Just a note -- the pixels that are uploaded are being interpreted in odd ways here unless you set fingerMovie.setPixelFormat(OF_PIXELS_RGBA); (it's not RGBA).

@obviousjim
Collaborator

Hey Christopher,

Ah, you are right; I meant to default synchronous scrubbing to YES. The purpose of the scrubbing param was that the default replicates the ofQuicktimePlayer behavior, but you can turn it off if you want to. I'll fix that!

I'll check in a moment - but on the existing player, don't you need to call ->update() manually on a video before those pixels become available after setting the frame? Or does it happen right away on setFrame()? I will test and make sure it works the same as it used to.

Regarding the RGBA pixels note, are you seeing it look weird in the video that's displayed?

The idea with setting the pixel format is that QTKit will force that pixel format when opening the video. So if the alpha is not set it will ask QuickTime for the 24RGB format and do a straight copy into ofPixels; otherwise it'll use 32ARGB and switch it to 32RGBA. Let me know if you're seeing weird-looking pixels coming through and how to replicate it.

yay!

@bakercp
Member
bakercp commented Aug 22, 2012

Hi @obviousjim -- here's a mini project to reproduce the RGBA issue referenced above.

Toggle the effect with this line:

    ///////////////////////////////////
    bool bToggleStrangeBehavior = true;
    ///////////////////////////////////

And a screen grab:

[screenshot]

@bakercp
Member
bakercp commented Aug 22, 2012

Also, note in the following section of the mini project

...
        fingerMovie.setFrame(i);
        fingerMovie.update();

        ofImage img;
...

fingerMovie.update() must be called to get any pixels.

@obviousjim
Collaborator

Wow Christopher, you really found a black hole.

So the problem is that when calling frameImageAtTime: (which is how the Synchronous Scrubbing works) the PixelFormat cannot be enforced. Even though the movie was opened with 24RGB, the pixel frames coming back are 32ARGB.

http://lists.apple.com/archives/quicktime-api/2008/Nov/msg00024.html

This seems to suggest that nothing can be done. @bangnoise or @vade may know a trick here; otherwise I'll figure out a way to handle the ARGB -> RGB conversion for the time being.

Honestly I would really like to avoid having frameImageAtTime: there altogether, but it seems that is the only way I've been able to synchronously ensure pixels come back after a call to [movie setTime:].

@bakercp
Member
bakercp commented Aug 22, 2012

A black hole indeed. I was just looking in OpenCV's QTKit code -- particularly at the CvCaptureFile class -- and it looks like they use frameImageAtTime: and a CVPixelBufferRef, but they always read it out as ARGB, then take it to BGRA, then drop the A to get a BGR IplImage, which is returned. So ... not much help there, but it is an interesting (but slow?) strategy.

Any thoughts on the associated update() call to fill the pixels situation?

@vade
vade commented Aug 22, 2012

I'm on vacation and just had a harrowing drive through the mountains at night. Fucking crazy. Anyway, are you using "session mode" options for frameImageAtTime withAttributes? I'll check the code and see if anything obvious pops out at me.

I've never seen the Attributes dictionary when specifying a format not return it.

That said I don't doubt it happens :)



@obviousjim obviousjim updates to address changes from pull request discussion
Synchronous scrubbing now on by default
Synchronous scrubbing of RGB videos now correctly handles ARGB frames with vImagePermuteChannels_ARGB8888
Objective-C grabber now has destructor
changed video event to use abstract ofBaseVideoGrabber type for sender to support other sender types using the same event in the future
5a30c64
@obviousjim
Collaborator

all of this should be addressed

@damiannz I added the destructor
@bakercp now you don't need to call update(); it calls it internally. I confirmed that this is how the old video player used to work. Nice catch. Also, vImageConvert_ARGB8888toRGB888 per @bangnoise's request fit the bill to fix your pixel bug. Again, nice find. Thanks so much for testing this.

@bakercp
Member
bakercp commented Aug 22, 2012

Hi @obviousjim -- I've been working on trying to integrate this with an existing project that tends to use a lot of disk-based random access via .setFrame() when something like ofxVideoBuffer isn't an option. I realize the underlying architecture of QTKit is quite different and that it is far superior at most tasks, but right now calls to .setFrame() are as much as 50 times slower than with the legacy player when used as a drop-in replacement (i.e. no special QTKit high-performance options are applied). Player load and close times are comparable between QTKit and legacy. It may just be a matter of setting different defaults to make them more comparable, or it may be related to all of the extra ARGB swizzlin' associated w/ frameImageAtTime:, but I'm wondering if it goes a bit deeper than that -- perhaps you have some thoughts? You can find my test code here. It's not an extremely well designed unit test or anything, but it should demonstrate the basic random-access speed issue. To use it, just toggle your branch back to the current develop and comment/uncomment lines 22/23 of testApp.cpp.

@obviousjim
Collaborator

Yeah, this is an issue. QTKit just isn't designed for synchronous random access... even for small movies, calls to setFrame() seem to disrupt its flow; on top of that, we have to ensure the pixels are immediately available, which is complicating everything. But there must be some way to fix this so it is at least equivalent to what we had before.

I think dipping down into the Movie object on setFrame to get the pixels, rather than calling frameImage, may be the solution.

@obviousjim
Collaborator

Just out of curiosity: if you turn off synchronous scrubbing, is it still slower? My question is whether it's the frameImage call or just QTKit itself.

@bakercp
Member
bakercp commented Aug 22, 2012

If you set .setSynchronousScrubbing(false), function calls take half the time required by the legacy player. But you get no pixels if you ask for a pixel ref. I believe https://github.com/openframeworks/openFrameworks/pull/1478/files#L13R231 puts a stop to it.

@obviousjim
Collaborator

yeah, because the call to set time says "hey QTKit, I'd like a frame at some point, whenever you feel like it"

The frame does come in, but perhaps a few update cycles later. There needs to be some way other than frameImageAtTime: to get frames synchronously.

@ofTheo
Collaborator
ofTheo commented Aug 25, 2012

@bakercp - is the setFrame call completely random - or is it often used to step through a movie at a custom speed?

on stackoverflow there is a suggestion to call frameImageAtTime and then continue with [movie stepForward]
http://stackoverflow.com/questions/5669350/real-time-way-to-extract-all-video-frames-in-yv12-yuv420-format-from-quicktime

I imagine frameImageAtTime is slow because, with motion-based compression, you have to seek to a reference frame and then process the frames up until the one requested.

I imagine we could do something really clever with setFrame(): if it detects that the frame requested is one more than the current frame, it would use movie stepForward; otherwise it would use frameImageAtTime (slower).

Not a perfect fix - but it could speed things up quite a bit for some situations.
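The "smart" setFrame() dispatch described above can be sketched as a small decision function. The names here are illustrative, not the merged implementation:

```cpp
#include <cassert>

// Which mechanism to use for a requested frame: a cheap [movie stepForward]
// when advancing by exactly one frame, or the slower frameImageAtTime:-style
// seek (which may have to decode forward from a keyframe) for random access.
enum class SeekStrategy { StepForward, SeekToTime };

SeekStrategy chooseSeekStrategy(int currentFrame, int requestedFrame) {
    if (requestedFrame == currentFrame + 1) {
        return SeekStrategy::StepForward; // sequential playback: cheap
    }
    return SeekStrategy::SeekToTime;      // random access: full seek
}
```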

@bakercp
Member
bakercp commented Aug 25, 2012

@ofTheo no -- my use is rarely completely random. I believe you are correct about motion-based compression + random access using frameImageAtTime. I only used that random access as a test. I think that the "smart" step detection using stepForward would be a nice solution. If possible, it would be nice to match the random access of the legacy ofPlayer though. Or perhaps at some time in the future, if people really need true blazing fast random access, they can use some kind of imaginary ofVideoBuffer object.

@obviousjim
Collaborator

Alright - took another stab at synchronous updating.

First I switched everything to an off-thread "frame available" callback so the visual context tells me when it has a frame. Then each call to MovieTask() results in a frame coming back. If synchronous mode is on, the thread safely blocks until the right frame comes in.

It's working in my tests and seems much faster!

Would love for people to throw it into more situations, as again it's a big structural change from before... we'll get there.

@obviousjim
Collaborator

@bakercp I ran your speed test again, and while the legacy player is still faster, they are in the same ballpark.

  • legacy ofQuicktime player with a small video takes an average of about 1500 micros to do a setFrame()
  • the new one takes about 3400 micros

Testing on a large video. (H.264 1080p encoded by a Canon 5D)

  • the legacy player takes about 35,000 micros
  • the new player takes about 55,000 micros, but sometimes as slow as 71,000 (average for 1000 calls)

However, if the new player is opened as texture-only, it is about twice as fast as the legacy player (17,583 micros per access). I know this doesn't help you in your loops, but it is worth mentioning.

To put it in perspective a frame at 30fps is 33,000 micros.
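The frame-budget arithmetic behind that comparison, as a tiny illustrative helper: at 30fps each frame has 1,000,000 / 30 ≈ 33,333 microseconds of budget, so a 55,000-microsecond setFrame() cannot keep up in real time, while 17,583 can:

```cpp
#include <cassert>

// Microseconds of budget available per frame at a given playback rate.
long microsPerFrame(int fps) {
    return 1000000L / fps;
}
```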

It's a bummer that for a certain use case the newer APIs are slower, but I think this difference is in the acceptable range now.

@bakercp
Member
bakercp commented Aug 27, 2012

@obviousjim the random access test is significantly faster. But, I just tried a new test (which includes grabbing the pixels) and it doesn't produce pixels like the legacy player. Here's the test https://gist.github.com/3485582.

One observation - in the QTKit version, the main loop fps gets locked to the video fps (~30fps). In the legacy version, it runs in the 400+fps range for me and shows the pixels that were transferred to an ofImage (and auto-uploaded to its texture).

I haven't dug into your latest updates yet -- so I don't have any suggestions yet. Hoping to check it out a little more tomorrow.

@obviousjim
Collaborator

OK, the reason you aren't getting pixels is that the new player needs an update() call to push the pixels from CoreVideo into ofPixels. This is a bug - so I'll force an update when synchronous mode is on to replicate the old behavior.

The reason it's locking the framerate is because of the call to play() on the video in your test. So the video is trying to play forward at a natural rate, yet keeps getting interrupted by calls to set frame. Take out the call to play() and the framerate jumps up to 200fps, which reflects the difference in the speed tests from yesterday.

It's a balance of expected behavior and performance... I'm up for just calling stop() when that first setFrame() comes through. I can try to pause internally and play again after returning the frame to see if this helps too.

Just for fun I tried texture-only mode, and the framerate is well up to 500fps, so it can be faster, I promise!!

@bakercp
Member
bakercp commented Aug 28, 2012

Just pulled your latest. I'm still working on isolating my thread-timeout issue, but in the meantime I just noticed that the QTKit ofVideoGrabber will not report a device list until after .initGrabber is called. In the legacy player, a list was available pre-initialization.

    cout << "LISTING DEVICES" << endl;
    cout << "------------" << endl;
    vidGrabber.listDevices(); // prints nothing

    vidGrabber.setVerbose(true);
    vidGrabber.initGrabber(camWidth,camHeight);

    cout << "LISTING DEVICES" << endl;
    cout << "------------" << endl;
    vidGrabber.listDevices(); // prints a device

For QTKit, using the above code in setup (in the videoGrabberExample):

LISTING DEVICES
------------
DVFreeThread - CFMachPortCreateWithPort hack = 0x125f4b0, fPowerNotifyPort= 0x125f190
DVFreeThread - CFMachPortCreateWithPort hack = 0x741550, fPowerNotifyPort= 0x7444c0
DVFreeThread - CFMachPortCreateWithPort hack = 0x73f280, fPowerNotifyPort= 0x7449c0
LISTING DEVICES
------------
2012-08-28 00:17:02.396 videoGrabberExampleDebug[74855:707] 0 - FaceTime HD Camera (Built-in)
2012-08-28 00:17:02.397 videoGrabberExampleDebug[74855:707] 

For legacy on the same machine and test app, I get:

LISTING DEVICES
------------
[warning] ofBaseVideoGrabber::setPixelFormat not implemented
DVFreeThread - CFMachPortCreateWithPort hack = 0x1511db0, fPowerNotifyPort= 0x1511b10
[notice] -------------------------------------
DVFreeThread - CFMachPortCreateWithPort hack = 0x121e480, fPowerNotifyPort= 0x121e460
[notice] listing available capture devices
[notice] (unavailable) device[0] DV Video
[notice] (unavailable) device[1] IIDC FireWire Video
[notice] device[2] USB Video Class Video - FaceTime HD Camera (Built-in)
[notice] -------------------------------------
DVFreeThread - CFMachPortCreateWithPort hack = 0x727160, fPowerNotifyPort= 0x7275d0
[notice] ofQuickTimeGrabber: attempting to open device[2] USB Video Class Video   -   FaceTime HD Camera (Built-in)
[notice] ofQuickTimeGrabber: device opened successfully
[warning] ofQuickTimeGrabber: no camera settings to load
[notice] end setup ofQuickTimeGrabber
[notice] -------------------------------------

LISTING DEVICES
------------
[notice] -------------------------------------
DVFreeThread - CFMachPortCreateWithPort hack = 0x121fda0, fPowerNotifyPort= 0x121fa60
[notice] listing available capture devices
[notice] (unavailable) device[0] DV Video
[notice] (unavailable) device[1] IIDC FireWire Video
[notice] device[2] USB Video Class Video - FaceTime HD Camera (Built-in)
[notice] -------------------------------------

Also, QTKit doesn't list unavailable devices the way the legacy player does. I'd count that as a feature.

More on the player timeout issue soon.

@obviousjim obviousjim updated time value to reflect current frame rather than movie internal time -- makes this more accurate for syncing effects
b59e98e
@kylemcdonald

PRs don't have labels, but I feel like it's fair to consider this one 'critical' for 0072.

@ofTheo
Collaborator
ofTheo commented Aug 30, 2012

yup - is this good to merge @obviousjim ?

@obviousjim
Collaborator

It's working really stable for me - it hasn't been as thoroughly tested as I'd like on different platforms, but it has been running stable for me and Christopher, aside from a few small non-critical things to address (mentioned above).

I am comfortable merging - there may be some hot fixes needed as it hits a lot of different codecs, platforms, etc.

@ofTheo ofTheo merged commit 660291c into openframeworks:develop Aug 30, 2012
@ofTheo
Collaborator
ofTheo commented Aug 30, 2012

merged!
time to test!!!!

:)

@bakercp
Member
bakercp commented Aug 30, 2012

Congrats all! Indeed, now time to test. :)



@ofTheo
Collaborator
ofTheo commented Aug 30, 2012

@obviousjim - it seems this PR mildly broke video on Windows.

I think it was something to do with setPixelFormat, which ofQTPlayer didn't implement. Adding setPixelFormat and getPixelFormat to ofQTPlayer fixes this - but it might be an issue for Linux users too.

Maybe @bilderbuchi @arturoc could see if the videoPlayerExample looks weird on Linux? ( bytes per pixel looked messed up ).

Have a fix for Win - just going to do a bit more testing.

@ofTheo ofTheo commented on the diff Aug 30, 2012
libs/openFrameworks/video/ofVideoPlayer.cpp
@@ -11,6 +11,7 @@
//---------------------------------------------------------------------------
void ofVideoPlayer::setPlayer(ofPtr<ofBaseVideoPlayer> newPlayer){
player = newPlayer;
+ internalPixelFormat = player->getPixelFormat();
}
@ofTheo
ofTheo Aug 30, 2012 collaborator

this was what broke windows.
getPixelFormat was not implemented by ofQuickTimePlayer so internalPixelFormat was being set to junk.

simple fix is to make ofBaseVideoPlayer return OF_PIXELS_RGB unless overridden.

@ofTheo ofTheo added a commit that referenced this pull request Aug 30, 2012
@ofTheo ofTheo Fix for #1478 which started using getPixelFormat(). Default format was set to OF_PIXELS_MONO - changed default to OF_PIXELS_RGB. We should still implement setPixelFormat and getPixelFormat in the non-base players / grabbers.
af4bc00
@bilderbuchi
Member

Just tested the videoPlayerExample: video playback colours look correct (the right video shows an ascii-shader-like version). Example interaction behaviour is weird - frame selection by arrows works only erratically; the end of the movie seems to confuse the app, which becomes unresponsive; the automatic reset to frame 0 is irregular as to when it happens, etc.

@obviousjim
Collaborator

On Linux or OS X?

@bilderbuchi
Member

Sorry, on Linux, as per Theo's request above.

@bilderbuchi
Member

OK, so I checked the videoPlayerExample's behaviour at 0071 release point, and also immediately before this PR got merged. Both show behaviour identical to each other, but different to current develop (i.e. with this PR included).

Before merge: frame selection with the arrows works fine. Playback with the mouse works fine at positive speeds, not at all at negative speeds. At negative speeds it resets to frame 0 after some random time (often with an intermediate position at frame ~50). When the end of the video is reached, after a while and wiggling the mouse a bit, it gets reset to frame 0.

After merge:
frame selection by arrows works only erratically; the end of the movie seems to confuse the app, which becomes unresponsive; negative speed doesn't play; the automatic reset to frame 0 is irregular as to when it happens, etc.

It's a bit difficult to describe precisely. :-( Part of the problems at least could be from the example's logic itself; I haven't looked at that.
All this on Ubuntu 12.04 32 bit.

@ofTheo
Collaborator
ofTheo commented Aug 31, 2012

huh - that is weird @bilderbuchi, as I just looked through the diff and I don't see anything that might cause changes in behavior on Linux.

The only thing that might have an effect was the setPixelFormat / getPixelFormat stuff - which should be fixed.

I'll check on Windows and see what the 0071 vs develop behavior is in the videoPlayerExample.

@bilderbuchi
Member

Yes, James was similarly surprised (and I agree). I say let's get some other Linux users' opinions/experiences; maybe it is just my machine or whatever.

@underdoeg underdoeg commented on the diff Aug 31, 2012
libs/openFrameworks/video/ofVideoPlayer.cpp
@@ -21,6 +22,9 @@ void ofVideoPlayer::setPlayer(ofPtr<ofBaseVideoPlayer> newPlayer){
//--------------------------------------------------------------------
void ofVideoPlayer::setPixelFormat(ofPixelFormat pixelFormat) {
internalPixelFormat = pixelFormat;
+ if( player != NULL ){
+ player->setPixelFormat(internalPixelFormat);
+ }
@underdoeg
underdoeg Aug 31, 2012 collaborator

These lines broke Linux. Video still plays, but frame stepping, for example, is broken because of the pixel format.

@obviousjim
obviousjim Aug 31, 2012 collaborator

We can take this out, at the risk of the expected pixelFormat behavior not working.

Perhaps it should just throw an error saying that you aren't allowed to change the pixel format after the video is already set up?

@underdoeg
underdoeg Aug 31, 2012 collaborator

It seems to be only a problem on Linux, so if worst comes to worst we could make it conditional for now... What exactly happens on OS X if you load a video and then set the format? What are the reasons to do this? Or better, how should it behave?

@ofTheo
ofTheo Aug 31, 2012 collaborator

We should def make it not set the pixel format after the video is loaded.
I'll try that for now and then let's see if that fixes things.

@obviousjim
obviousjim Aug 31, 2012 collaborator

Perhaps it should force the internalPixelFormat back to whatever the video reports, and ofLogError()?

@bilderbuchi
bilderbuchi Aug 31, 2012 openFrameworks member

forgive my superficial reading of the code, but can't we just add an additional if !isLoaded() clause inside the first if, and setPixelFormat(), and else ofLogWarning() (or Error) that the pixel format has not been set because the video has already been loaded?

@ofTheo
ofTheo Aug 31, 2012 collaborator

yup - I just did that, but it's unearthing some more potential bugs.
This is like a gift that keeps on giving ;)

@obviousjim
obviousjim Aug 31, 2012 collaborator

^_^ sorry guys! lmk if i can help at all @ofTheo

@underdoeg
Collaborator

I just tried the example as well on Ubuntu 12.04 64-bit and I have the same behaviour as @bilderbuchi. But it looks like this line is the culprit: https://github.com/openframeworks/openFrameworks/pull/1478/files#L25L23

@bilderbuchi
Member

thanks for the repro Philip!

@ofTheo
Collaborator
ofTheo commented Aug 31, 2012

thanks guys - crazy that broke so much :)
I'm going to try and fix this across all the players as it's something we should do anyway ( implement set / get pixel formats ).

Might need some help with testing!

@ofTheo
Collaborator
ofTheo commented Aug 31, 2012

@obviousjim - was the intention that 10.6 users also use QTKitPlayer ?
I noticed that ofQuickTimePlayer was removed from the xcode project.

@obviousjim
Collaborator

Yes, that was the intention. There weren't any missing legacy features in the player, which is why we only decided to keep the 10.6 QuickTime grabber.

However, thinking about it now, for safety in the transition a 10.6 compile falling back to ofQuicktimePlayer wouldn't be a bad idea!

@ofTheo
Collaborator
ofTheo commented Aug 31, 2012

Hmm. I'm getting this error when trying to play this movie: http://sourceforge.net/projects/casparcg/files/Sample_Videos/PAL_SD_16.9_Anamorphic_Progressive_QT_PNG_None_100_48kHz1.mov/download

2012-08-31 10:58:58.810 videoPlayerExampleDebug[37875:a503] *** CVOpenGLTextureCache: Forced to manually upload an IOSurface backed pixel buffer because it uses a non-native pixel format. Break on CVOpenGLTextureCache_IOSurfaceNotInNativePixelFormat() to debug.

Plays without error on QT7 player.

@obviousjim
Collaborator

Do you get that error and no pixels?

@ofTheo
Collaborator
ofTheo commented Aug 31, 2012

@obviousjim hard to tell - I think maybe no pixels. But it is weird because it implies it's using non-OF textures. Or is that just how CV works? ( ps - this was with the regular video player example ).

@ofTheo
Collaborator
ofTheo commented Aug 31, 2012

@obviousjim another question

should the osxHighPerformanceVideoPlayerExample flicker black frames every now and again? Is that the asynchronous issue you mentioned?

@obviousjim
Collaborator

re 1) I'll test the video on my end; it's most likely an error from forcing the RGB pixel format - maybe it doesn't work with that codec?

re 2) There shouldn't be any flickering - what mode are you opening the video in? Is it the same video?

@ofTheo
Collaborator
ofTheo commented Sep 1, 2012

re: 2) just running the stock osxHighPerformanceVideoPlayerExample, no modifications. I see black frames; it seems that I see the black frame briefly and then the texture comes in. Especially easy to see in frame-by-frame mode.

Could be a 10.6.8 thing?

@obviousjim
Collaborator

Which mode are you in? I can't remember the stock one, probably texture-only?

It could be a frame-by-frame thing. Does changing Synchronous Scrub on or off affect the behavior?
