But if I use VLC to play it, the audio's PTS is out of range and the audio is dropped.
It works with ffplay, though.
I don't know where the problem is; could you help me find it? It is driving me crazy.
Here is my code.
The audio is G.711; each buffer is 160 bytes.
void AudioFrameSource::doGetNextFrame()
{
    unsigned acquiredFrameSize = 0;
    if (m_session != NULL)
    {
        m_session->GetNextAudioFrame((char*)fTo, fMaxSize, &acquiredFrameSize, &fNumTruncatedBytes);
        if (acquiredFrameSize != 0)
        {
            if (_isFirst)
            {
                // First frame: take the initial timestamp from the session.
                m_session->GetTimeScale(&_timeval);
                _isFirst = false;
            }
            else
            {
                // Each 160-byte G.711 buffer is 20 ms of audio.
                _timeval.tv_usec += 20000;
                if (_timeval.tv_usec >= 1000000)
                {
                    _timeval.tv_sec++;
                    _timeval.tv_usec -= 1000000;
                }
            }
            fFrameSize = acquiredFrameSize;
            fPresentationTime = _timeval;
            // fDurationInMicroseconds = 20000;
        }
    }

    nextTask() = envir().taskScheduler().scheduleDelayedTask(
        20000, (TaskFunc*)FramedSource::afterGetting, this);
}
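
For comparison, here is a minimal sketch (not my actual code; the helper name "computePresentationTime" is hypothetical) of the wall-clock-anchored timestamping the live555 FAQ describes, where fPresentationTime is aligned with gettimeofday() so that RTCP sender reports can map it onto NTP time:

#include <sys/time.h>

// Sketch only: anchor the first frame's presentation time to the wall
// clock, then advance 20 ms per 160-byte G.711 buffer.
void AudioFrameSource::computePresentationTime()
{
    if (_isFirst)
    {
        gettimeofday(&_timeval, NULL);  // wall-clock anchor for RTCP mapping
        _isFirst = false;
    }
    else
    {
        _timeval.tv_usec += 20000;      // 160 bytes of G.711 == 20 ms
        if (_timeval.tv_usec >= 1000000)
        {
            _timeval.tv_sec++;
            _timeval.tv_usec -= 1000000;
        }
    }
    fPresentationTime = _timeval;
    fDurationInMicroseconds = 20000;    // duration of this frame
}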


Here is the video's doGetNextFrame(). Each buffer is one H.264 NALU, and the stream's average frame rate is 25 fps.
void VideoFrameSource::doGetNextFrame()
{
    unsigned int framesize = 0;
    if (m_session != NULL)
    {
        bool lastnalu = false;
        m_session->GetNextVideoFrame(_firstframe, lastnalu, (char*)fTo, fMaxSize, &framesize, &fNumTruncatedBytes);
        fFrameSize = framesize;
        if (framesize != 0)
        {
            if (_firstframe)
            {
                // First NALU: take the initial timestamp from the session.
                m_session->GetTimeScale(&_timescale);
                _firstframe = false;
            }
            else if (lastnalu)
            {
                // After the last NALU of an access unit, advance by one
                // frame period (40 ms at 25 fps).
                _timescale.tv_usec += 40000;
                if (_timescale.tv_usec >= 1000000)
                {
                    _timescale.tv_sec++;
                    _timescale.tv_usec -= 1000000;
                }
            }
            fPresentationTime = _timescale;
            // fDurationInMicroseconds = 40000;
        }
    }

    nextTask() = envir().taskScheduler().scheduleDelayedTask(
        8000, (TaskFunc*)FramedSource::afterGetting, this);
}
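
The analogous sketch for the video side, under the same wall-clock assumption (again a hypothetical helper, not my code); here every NALU of an access unit is stamped with the same presentation time, and the 40 ms advance happens only after the unit's last NALU:

#include <sys/time.h>

// Sketch only: all NALUs of one access unit share a presentation time;
// advance by one frame period (40 ms at 25 fps) after the last NALU.
void VideoFrameSource::computePresentationTime(bool lastNALU)
{
    if (_firstframe)
    {
        gettimeofday(&_timescale, NULL); // wall-clock anchor for RTCP mapping
        _firstframe = false;
    }

    fPresentationTime = _timescale;      // stamp this NALU

    if (lastNALU)                        // set up the next frame's timestamp
    {
        _timescale.tv_usec += 40000;
        if (_timescale.tv_usec >= 1000000)
        {
            _timescale.tv_sec++;
            _timescale.tv_usec -= 1000000;
        }
    }
}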



At 2013-10-24 16:35:56, "Ross Finlayson" <finlay...@live555.com> wrote:

I saw the FAQ on the live555 website.
It says that live555 synchronizes audio and video using RTCP SR packets.
So should I create an RTCP instance for each RTP source explicitly?



No, because (assuming that you are controlling the streaming using RTSP) this 
is done implicitly.


(In the RTSP server, this is done when the stream starts playing; in the RTSP 
client, it is done in the implementation of the "initiate()" function.)



Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
