I am a developer interested in getting video to work. What are the technical hurdles that need to be jumped here? In an offline thread comex implied that this would need a heavy RE effort. I am busy looking at the Flash code, but thought an online discussion of the issues would help get me (and others) up to speed.
On the Android side, OMX* in stagefright is barely documented. On the iOS side, there is some codec/acceleration stuff that's completely undocumented, except that apparently old 4.0 SDK betas included headers by mistake. As far as I know, video support is a matter of tying those together.
So I presume the error "Could not find _ZN7android11MediaBufferC1Ej" would have something to do with this?
Yeah, there's that but also virtual methods etc.
Hmm, I got distracted here looking at Gnash again. I actually got a YouTube video to play now, but this is with SW decode. Where does Frash get its good perf? Is it actually using HW-accelerated SVG stuff, or is all the Flash rendered in SW?
In software (this is not my choice; the Android plugin doesn't use the canvas API for regular drawing). Of course, it will make a much greater difference when it comes to video. Can I see your code? Or are you referring to playing video with Gnash?
Sorry, that is in Gnash. When I saw your IOSurface work, I saw a new avenue to possibly use Gnash, and because it was close, I thought I would try to get it a bit further. But perf is a problem, just using SW decode for movies. This should be fast enough at 320x240, so I don't know what the bottleneck is here.
Hmm, looks like I need to start using the ARM NEON instructions. Luckily there is this effort: http://code.google.com/p/ffmpeg4iphone, which seems to get some good perf... let's see if I can leverage that.
I don't know if this is useful, but I wondered if there is a way to have a look at how the Flash Packager for iPhone handles video decode in apps.
For Instruments, is it possible to use it without the correctly signed entitlements? I have not found anyone who has done this. Otherwise I would have to convert to using Xcode.
OK, I made some interesting progress/research. I compiled the Android source into frash (at least what it was looking for), so I have eliminated all the "could not find" messages. Now the Adobe lib is hitting the "query codec paths" stage. The one bigger problem is that the Android source uses Binder (derived from OpenBinder) to link the different modules together, e.g.:
sp<IServiceManager> sm = defaultServiceManager();
sp<IBinder> binder = sm->getService(String16("media.player"));
sp<IMediaPlayerService> service = interface_cast<IMediaPlayerService>(binder);
Now I guess we can bypass this altogether once we know which interface classes to use. The next step then is to plug in the codecs from an OpenMAX implementation. Bellagio provides a reference implementation that uses ffmpeg for software codecs (ideally we would use actual HW codecs, but for now SW is OK...).
Question: some YouTube videos do work, but the video is only visible in the top half of the screen; do you know why?
YouTube may be setting noScale or doing other strange things with its own dimensions at runtime (which is why it's difficult to embed a YouTube video in another SWF).
Fixing NOSCALE support may resolve the YouTube position problem: https://github.com/comex/frash/issues#issue/14
Of course, that's just a guess.