Hi,
You cannot do this using a single pipe. You need to pass both streams to the libavcodec libraries as separate (synchronized) streams. You can then encode them and mux them into an FLV container to pass to the RTMP server.
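For reference, if the goal is simply "IP camera in, RTMP out", the ffmpeg CLI can often do the whole pipeline itself: it pulls the RTSP session (which carries both elementary streams), transcodes as needed, and muxes into FLV for RTMP. This is a sketch, not a drop-in command; the camera and server URLs are placeholders, and the flags assume an H.264 camera feed:

```shell
# Hedged sketch: let ffmpeg pull the RTSP feed directly (audio + video),
# copy the H.264 video as-is, transcode audio to AAC (FLV/RTMP does not
# carry most raw camera audio codecs), and mux into FLV for the RTMP server.
# rtsp://camera/stream and rtmp://server/live/key are placeholder URLs.
ffmpeg -rtsp_transport tcp -i rtsp://camera/stream \
       -c:v copy \
       -c:a aac -b:a 128k \
       -f flv rtmp://server/live/key
```

If the camera's video is not already H.264, replace `-c:v copy` with `-c:v libx264`. Because ffmpeg receives both streams inside one RTSP session, this sidesteps the problem of synchronizing two separate pipes from openRTSP.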
Marcin

On 2014-10-13 18:29, Muhammad Ali wrote:
My objective is to use OpenRTSP to receive audio + video stream from IP camera and pass it on to FFMPEG which can then stream it to an RTMP server.

I've been using a pipe to send OpenRTSP's stdout to ffmpeg (as a pipe input source), but that was only a video stream (OpenRTSP's -v flag). Now the requirement has come to stream both the audio and video streams. Of course I tried replacing -v with -4, and it failed, since those are two separate streams rather than a single elementary stream. Am I correct?

So what would be the correct way to achieve my objective? I am a developer myself and not shy about coding, but I prefer to use something that already exists (if there is anything).

--
Muhammad Ali
And Or Logic
www.andorlogic.com


_______________________________________________
live-devel mailing list
live-devel@lists.live555.com
http://lists.live555.com/mailman/listinfo/live-devel

