Re: [Live-devel] Regarding the h264 video stream's rtp packet timestamp

2013-10-29 Thread Tony
I did as you said, but VLC always says the input buffer is empty. At 2013-10-26 12:05:12, "Ross Finlayson" wrote: nextTask() = envir().taskScheduler().scheduleDelayedTask(2, (TaskFunc*)FramedSource::afterGetting, this); [...] nextTask() = envir().taskScheduler().scheduleDelayed

Re: [Live-devel] Regarding the h264 video stream's rtp packet timestamp

2013-10-25 Thread Ross Finlayson
> nextTask() = envir().taskScheduler().scheduleDelayedTask(2, (TaskFunc*)FramedSource::afterGetting, this); [...] > nextTask() = envir().taskScheduler().scheduleDelayedTask(8000, (TaskFunc*)FramedSource::afterGetting, this); This is wrong. Once you've delivered a frame of dat
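
The rule Ross states can be sketched with stand-in types (no live555 headers; `DemoFrameSource` and its members are hypothetical names, not live555's): a source delivers at most one frame per `doGetNextFrame()` call and signals completion exactly once per delivered frame, rather than scheduling `afterGetting` with an arbitrary fixed delay such as 2 or 8000 microseconds.

```cpp
#include <queue>
#include <utility>

// Stand-in sketch of a FramedSource subclass (hypothetical names).
// The point: completion is signalled once per delivered frame, driven by
// data availability -- never by a made-up fixed delay.
class DemoFrameSource {
public:
    explicit DemoFrameSource(std::queue<int> frames)
        : fFrameQueue(std::move(frames)) {}

    // Analogue of doGetNextFrame(): copy one frame downstream if available.
    bool doGetNextFrame() {
        if (fFrameQueue.empty()) return false; // no data yet: just wait
        fLastFrame = fFrameQueue.front();
        fFrameQueue.pop();
        afterGetting();                        // exactly once per frame
        return true;
    }

    int deliveredCount() const { return fDelivered; }

private:
    void afterGetting() { ++fDelivered; }      // stands in for FramedSource::afterGetting(this)

    std::queue<int> fFrameQueue;
    int fLastFrame = 0;
    int fDelivered = 0;
};
```

In the real library the delivery would be triggered by the event loop when the device has data, but the one-completion-per-frame invariant is the same.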

Re: [Live-devel] Regarding the h264 video stream's rtp packet timestamp

2013-10-25 Thread Tony
But if I use VLC to play it, the audio's PTS is out of range and the frames are dropped; it works with ffplay. I don't know where it's wrong. Could you help me check where it's wrong? This problem is driving me crazy. Here is my code. The audio is G.711; each buffer is 160 bytes. void AudioFrameSource::

Re: [Live-devel] Regarding the h264 video stream's rtp packet timestamp

2013-10-24 Thread Ross Finlayson
> I saw the FAQ on the live555 website. > It said that live555 syncs the audio and video via RTCP SR packets. > So should I create an RTCP instance for each RTP source explicitly? No, because (assuming that you are controlling the streaming using RTSP) this is done implicitly. (In the RTSP server, this

Re: [Live-devel] Regarding the h264 video stream's rtp packet timestamp

2013-10-24 Thread Tony
I saw the FAQ on the live555 website. It says that live555 syncs the audio and video via RTCP SR packets. So should I create an RTCP instance for each RTP source explicitly? At 2013-10-24 05:48:40, "Ross Finlayson" wrote: So how should I set the duration so that the audio and video are in sync? Yo

Re: [Live-devel] Regarding the h264 video stream's rtp packet timestamp

2013-10-23 Thread Ross Finlayson
> So how should I set the duration so that the audio and video are in sync? You don't. The way you ensure that your audio and video streams are in sync is by having your server give each (audio and video) frame accurate presentation times - i.e., in the setting of "fPresentationTime" by the (au
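
One way to get the accurate per-frame presentation times Ross asks for is to derive them all from a single wall-clock base. The helper below is hypothetical (not part of live555), but it uses `struct timeval`, which is how live555 represents `fPresentationTime`:

```cpp
#include <sys/time.h>

// Hypothetical helper: compute the n-th frame's presentation time from one
// shared wall-clock base. Audio and video sources that derive their times
// from the same base stay in sync without hand-tuning any delays.
struct timeval presentationTimeFor(struct timeval base,
                                   long frameIndex,
                                   long frameDurationUs) {
    long long us = (long long)base.tv_usec
                 + (long long)frameIndex * frameDurationUs;
    struct timeval pts;
    pts.tv_sec  = base.tv_sec + (long)(us / 1000000);
    pts.tv_usec = (long)(us % 1000000);
    return pts;
}
```

The base would typically be captured once with `gettimeofday()` when streaming starts; the microsecond carry into `tv_sec` keeps `tv_usec` in range.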

Re: [Live-devel] Regarding the h264 video stream's rtp packet timestamp

2013-10-22 Thread Tony
Thanks for your answer. There is another question about scheduleDelayedTask(duration, x, x): how should I set the duration so that the audio and video are in sync? Currently each of my audio frames is 2 ms, and the video frame rate is 25 fps. Now I set the audio's next getframe time to 2 ms, vid
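
For reference, the duration arithmetic for the rates mentioned in this thread works out as below (plain C++, no live555). One caveat, flagged as an observation rather than something from the thread: G.711 runs at 8000 one-byte samples per second, so a 160-byte buffer (as in the later message) spans 20 ms, not 2 ms, which may be worth re-checking in the sender.

```cpp
// 25 fps video -> 1000000 / 25 = 40000 microseconds per frame.
long videoFrameDurationUs(long fps) {
    return 1000000L / fps;
}

// G.711: 8000 one-byte samples per second, so a 160-byte buffer
// spans 160 / 8000 s = 20000 microseconds (20 ms).
long g711FrameDurationUs(long bytesPerFrame) {
    return bytesPerFrame * 1000000L / 8000L;
}
```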

Re: [Live-devel] Regarding the h264 video stream's rtp packet timestamp

2013-10-20 Thread Ross Finlayson
> In the video frame source's getNextFrame, if the buffer is a NAL unit, not a complete frame, should fPresentationTime and fDurationInMicroseconds only be set when the buffer is the last NAL unit of the current frame? > Is it right? Not quite. "fPresentationTime" should be set for every
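
A sketch of the answer above (the helper names are hypothetical): every NAL unit gets the frame's presentation time, so all NAL units of one access unit share the same `fPresentationTime`. Putting `fDurationInMicroseconds` only on the frame's last NAL unit is an assumption of this sketch, not something stated in the reply.

```cpp
#include <vector>

struct NaluTiming {
    long long ptsUs;     // identical for every NAL unit of the frame
    unsigned durationUs; // nonzero only on the last NAL unit (assumed)
};

// Hypothetical helper: stamp each NAL unit of one access unit.
std::vector<NaluTiming> timeNalUnits(long long framePtsUs,
                                     unsigned frameDurationUs,
                                     int naluCount) {
    std::vector<NaluTiming> timings;
    for (int i = 0; i < naluCount; ++i) {
        bool isLast = (i == naluCount - 1);
        timings.push_back({framePtsUs, isLast ? frameDurationUs : 0u});
    }
    return timings;
}
```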

[Live-devel] Regarding the h264 video stream's rtp packet timestamp

2013-10-18 Thread 梦幻工作室
Hi, in the video frame source's getNextFrame, if the buffer is a NAL unit, not a complete frame, should fPresentationTime and fDurationInMicroseconds only be set when the buffer is the last NAL unit of the current frame? Is that right?