Hello,



I am using the FFmpeg libavformat library to generate a WebM file from RTP video 
and audio streams that I receive on my server. The streams arrive already 
encoded: the video is VP8 and the audio is Opus.



I have RTP timestamps for both streams, and NTP timestamps for the 
corresponding RTP packets from the RTCP sender reports. How do I set the PTS 
and DTS of each AVPacket from these values? I parse the video byte stream to 
get a byte array per frame, and I have the timestamps mentioned above for each 
frame. So, while constructing the AVPacket, how should I set the timestamps so 
that audio and video stay properly in sync in the final output file? A 
simplified sketch of what I have so far is below.
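
For reference, this is roughly how I am writing a video packet at the moment. 
The frame_data/rtp_ts/first_rtp_ts parameters stand in for my own RTP 
depacketizer state, I am assuming the 90 kHz VP8 RTP clock (the audio path 
would be analogous with the 48 kHz Opus clock), and error handling is trimmed:

#include <stdint.h>
#include <libavformat/avformat.h>
#include <libavutil/mathematics.h>

/* Simplified sketch: write one already-parsed VP8 frame to the WebM muxer.
 * frame_data, frame_size, rtp_ts and first_rtp_ts come from my own RTP
 * parsing code; 90000 Hz is the VP8 RTP clock rate. */
static int write_video_frame(AVFormatContext *oc, AVStream *video_st,
                             uint8_t *frame_data, int frame_size,
                             uint32_t rtp_ts, uint32_t first_rtp_ts)
{
    AVPacket *pkt = av_packet_alloc();
    if (!pkt)
        return AVERROR(ENOMEM);

    pkt->data         = frame_data;
    pkt->size         = frame_size;
    pkt->stream_index = video_st->index;

    /* Offset so the first frame starts at 0, then convert from the
     * 90 kHz RTP clock into the time base the WebM muxer chose for
     * this stream after avformat_write_header(). */
    pkt->pts = av_rescale_q(rtp_ts - first_rtp_ts,
                            (AVRational){1, 90000},
                            video_st->time_base);
    pkt->dts = pkt->pts;   /* my VP8 stream has no B-frames, so DTS == PTS */

    int ret = av_interleaved_write_frame(oc, pkt);
    av_packet_free(&pkt);
    return ret;
}

In particular, I am not sure whether offsetting each stream by its own first 
RTP timestamp like this is enough, or whether I need to map the RTP timestamps 
to NTP wallclock time via the RTCP sender reports so that the audio and video 
offsets line up.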





Regards,

Ravi Kiran B S






