Transcoding, modifying frames and streaming over RTP #108
Another minor related question:
1), 2) No idea really. Ask the VLC developers. I have no use cases for, and therefore limited experience with, direct rendering myself, so I can't offer much more help.
Thanks for the reply. Regarding (4): thanks for explaining the MediaListPlayer trick. It works perfectly.
Hi,
I was thinking about something similar recently: it might be interesting to write a VLC plugin that created a JVM and ran some arbitrary Java code to process audio or video. I say "interesting", but it never got interesting enough for me to have a go at it myself.
BTW, this is a question I posted on the VLC forum: It also contains more information about my real use case.
I would skip the "create a JVM" part and just let VLCj create the process, load the plugin and hook a special kind of "Player" to it, similar to DirectPlayer. I'm using VLCj for augmented reality on h264 webcam streams with GPU processing, where I, for example, count the number of people; currently I produce an MJPEG stream from Java, which is then transcoded back to h264 (otherwise it is far too much data). This consumes far too many resources (all streams are HD quality), and you need a server-class machine just for the transcoding part. If I could build something like romix proposed in (2), maybe I could move transcoding to the same machine as well, because I could leave out the JPEG encoding step (which is poor because it affects quality).
Interesting, but moot. It's out of my scope even though I'd like to see it. |
Hi, it's very similar to this
That clip is very impressive. |
This is really a native issue, so I'm closing the issue. |
I'm thinking more and more about giving the plugin solution a try. Does anybody have an idea whether audio would get out of sync with the video stream if, for example, from time to time a single video frame takes significantly longer to "process"? What I would like to achieve is an h264 stream with dynamically blended-in information: a logo, FPS, etc. I would like to produce the blend-in graphics using Java, merge each video frame with the Java-produced bitmap, and let VLC transcode it back to h264. But how do I deal with audio? If the processing inside Java takes a little longer than it should, will audio get out of sync?
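The "merge a video frame with a Java-produced bitmap" step above can be sketched with plain AWT. This is only an illustration under assumed conditions: the frame and overlay contents, dimensions, and method names are stand-ins, not anything from vlcj or the plugin being discussed.

```java
import java.awt.AlphaComposite;
import java.awt.Color;
import java.awt.Font;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;

public class OverlayBlender {
    // Blend a Java-produced overlay (logo, FPS counter, etc.) onto a video
    // frame image, with a configurable opacity.
    static BufferedImage blend(BufferedImage frame, BufferedImage overlay, float opacity) {
        Graphics2D g = frame.createGraphics();
        g.setComposite(AlphaComposite.getInstance(AlphaComposite.SRC_OVER, opacity));
        g.drawImage(overlay, 0, 0, null);
        g.dispose();
        return frame;
    }

    public static void main(String[] args) {
        BufferedImage frame = new BufferedImage(64, 64, BufferedImage.TYPE_INT_RGB);
        // Example overlay: an FPS read-out rendered by Java.
        BufferedImage overlay = new BufferedImage(64, 64, BufferedImage.TYPE_INT_ARGB);
        Graphics2D g = overlay.createGraphics();
        g.setColor(Color.WHITE);
        g.setFont(new Font(Font.MONOSPACED, Font.PLAIN, 12));
        g.drawString("25 fps", 4, 16);
        g.dispose();
        blend(frame, overlay, 0.8f);
    }
}
```

In a real pipeline, the frame image would be filled from the decoded video buffer before blending and copied back afterwards; only the overlay rendering runs entirely in Java.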
This is probably the wrong place for that sort of question. But going on past experience you'll also struggle to get a useful answer at the videolan forums. |
Just to round this off to a final state: there is a similar plugin for VLC which can do what we need. After an hour or two spent debugging, I finally got it to work, and there is one little problem with it which is, in my opinion, easily fixable. It does not give you the internal VLC buffer; it copies the data from the internal buffer into your own buffer using memcpy. I can't say whether this is essential for it to work properly, but what I can see is that after your callback has finished, the original buffer (not the buffer you can modify) is passed further along the chain. So if we could modify "smem" so that it (to start with, no optimization) copies your buffer over the original frame, I think we could get much closer to what we want to achieve. Right now I have asked at http://forum.videolan.org/viewtopic.php?f=32&t=105142&p=356202&hilit=smem#p356202 and if I am not successful there, I will try to clone and adapt "smem".
Any progress on this? |
I'm not sure who your question is directed at, but I am not doing, or planning to do, anything on this.
Yes and no. I got your "read media" pipeline working in gStreamer. It needed some fiddling with queues and encoder parameters, but now it works like a charm. I'm able to modify frames and stream 1080p to justin.tv or VLC. The plus point is that I didn't need to refresh my C skills at all (well, actually yes, to understand some gStreamer examples). If you plan to implement it as well, remember to turn off B-frames for the h264 encoder to start with, or use the zerolatency tuning option (easy in gStreamer, not sure about VLC). You will make your life much easier without B-frames (hint: out-of-order processing).
@holtakj Cool! Do you plan to contribute the code? |
Maybe this discussion could go to a more appropriate place, it's no longer vlcj related is it? |
@caprica Agreed, it's no longer vlcj related, so this will be my last post in this issue. @romix The soonest realistic open-sourcing date would be spring 2013. It's not that much code/work, but I have it tightly coupled and I can't open-source the whole project right now, as it's an unfinished proof of concept with unclear licensing issues. I need to strip this functionality out into a separate module, and currently I have no plan for how to do that properly. It's not that hard to implement; it took me about a day or two to get it working properly. I can help you develop your own solution by giving you tips and hints. Let me know how to contact you; my mail is visible on GitHub.
Hi @holtakj, please give us more details about your implementation. I would like to create an open source library to do exactly this kind of work (modify media frames and stream the result in real time over RTP).
@holtakj - 👍 here for open sourcing since I'm looking at the same problem. |
Hi,
I'd like to be able to take the source media file, modify each frame based on certain conditions, transcode it into H264 and then stream it over RTP.
Without frame transformations, it works just fine with a HeadlessMediaPlayer.
But since I need to modify frames, I thought I needed to use a DirectMediaPlayer.
I created one and provided my custom RenderCallback to it (the callback would convert the VLC memory buffer into a Java Image, process it using AWT classes, and store it back into the VLC memory buffer).
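As a minimal sketch of that conversion step, assuming an RV32/ARGB pixel format and using only plain AWT (the class name, method name, and the example red-square modification are made up for illustration; in vlcj the pixels would come from and go back to the native buffer handed to the RenderCallback):

```java
import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;
import java.awt.image.DataBufferInt;

public class FrameProcessor {
    // Process one ARGB frame: copy the pixels into a BufferedImage, draw on
    // it with AWT, then copy the modified pixels back out.
    static int[] process(int[] pixels, int width, int height) {
        BufferedImage image = new BufferedImage(width, height, BufferedImage.TYPE_INT_ARGB);
        int[] backing = ((DataBufferInt) image.getRaster().getDataBuffer()).getData();
        System.arraycopy(pixels, 0, backing, 0, pixels.length);

        Graphics2D g = image.createGraphics();
        g.setColor(Color.RED);
        g.fillRect(0, 0, 8, 8);   // example modification: red square, top-left
        g.dispose();

        // Write the result back to the caller's buffer (in vlcj, the native
        // buffer supplied to the RenderCallback).
        System.arraycopy(backing, 0, pixels, 0, pixels.length);
        return pixels;
    }

    public static void main(String[] args) {
        int[] frame = new int[16 * 16];
        process(frame, 16, 16);
        System.out.println(Integer.toHexString(frame[0])); // pixel (0,0) is now opaque red
    }
}
```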
But it does not work as I expected. If I use these options (the same ones I used before with the headless player):
:sout=#transcode{fps=25,vcodec=h264,vb=0,acodec=none,scale=1.0}:rtp{dst=127.0.0.1, port=33377, mux=ts}
I see that my media is being transcoded and streamed accordingly, but my callback is not invoked at all!
If I do not add those options, my callback is called, but no transcoding is done and nothing is streamed over RTP.
Questions:
1. Is it a bug that the callback is not invoked when streaming?
2. Should it be possible to have a pipeline like this?
   - read media
   - modify each frame (e.g. draw something on top of the original frame content)
   - transcode the frame (e.g. into H264)
   - stream over RTP
3. Or maybe I'm doing something wrong?
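For what it's worth, the :sout chain quoted above can be assembled programmatically, which avoids stray spaces or typos in the option string (depending on how the chain is parsed, embedded spaces may or may not be tolerated). The helper name and parameter values are just examples taken from this issue:

```java
public class StreamOptions {
    // Build the transcode-and-RTP :sout chain from its parts.
    static String sout(int fps, String vcodec, String host, int port) {
        return String.format(
            ":sout=#transcode{fps=%d,vcodec=%s,vb=0,acodec=none,scale=1.0}" +
            ":rtp{dst=%s,port=%d,mux=ts}",
            fps, vcodec, host, port);
    }

    public static void main(String[] args) {
        // With vlcj this string would be passed as a media option, e.g.
        // mediaPlayer.playMedia("input.mp4", sout(25, "h264", "127.0.0.1", 33377));
        System.out.println(sout(25, "h264", "127.0.0.1", 33377));
    }
}
```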