Hi Ross,

1. I do know the difference between presentation time and RTP timestamp; I will be more precise next time. 2. My encoder generates NAL units that begin with the four-byte start code.

My first NAL unit, as shown by a bitstream analyzer (note the leading 00 00 00 01 start code):

!! Found NAL at offset 4 (0x0004), size 18 (0x0012)
   00 00 00 01 67 42 80 1E 95 A0 50 7C 84 00 00 0F

==================== NAL ====================
 forbidden_zero_bit : 0
 nal_ref_idc : 3
 nal_unit_type : 7 ( Sequence parameter set )
======= SPS =======
 profile_idc : 66
 constraint_set0_flag : 1
 ...

So these are not 'discrete' NAL units - they still include the start code.
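If I switch to a H264VideoStreamDiscreteFramer as you suggest, I understand I would first have to strip that start code myself. A minimal sketch (it would go in deliverFrame(), right after newFrameDataStart / newFrameSize are set):

  // Sketch: skip a leading 4-byte Annex-B start code, because a
  // H264VideoStreamDiscreteFramer must be fed NAL units WITHOUT start codes.
  static unsigned char const startCode[4] = { 0x00, 0x00, 0x00, 0x01 };
  if (newFrameSize >= 4 && memcmp(newFrameDataStart, startCode, 4) == 0) {
    newFrameDataStart += 4;
    newFrameSize -= 4;
  }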

3. My implementation is based on testOnDemandRTSPServer. I added my subclasses following the advice found here:
http://www.live555.com/liveMedia/faq.html#liveInput-unicast

I created a class H264VideoMemBuffServerMediaSubsession that inherits from H264VideoFileServerMediaSubsession:

FramedSource* H264VideoMemBuffServerMediaSubsession::createNewStreamSource(unsigned /*clientSessionId*/, unsigned& estBitrate) {
  estBitrate = 750; // kbps, estimate

  // Create a source that delivers NAL units from my in-memory buffer:
  DeviceParameters params;
  DeviceSource* deviceSource = DeviceSource::createNew(envir(), params);
  if (deviceSource == NULL) return NULL;

  return H264VideoStreamFramer::createNew(envir(), deviceSource);
}
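I did not override createNewRTPSink(); if I read the library sources correctly, the implementation inherited from H264VideoFileServerMediaSubsession is essentially:

RTPSink* H264VideoFileServerMediaSubsession
::createNewRTPSink(Groupsock* rtpGroupsock,
                   unsigned char rtpPayloadTypeIfDynamic,
                   FramedSource* /*inputSource*/) {
  // Create a standard H.264 RTP sink for this subsession:
  return H264VideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic);
}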

My change in testOnDemandRTSPServer.cpp was to replace:

  sms->addSubsession(H264VideoFileServerMediaSubsession::createNew(*env, inputFileName, reuseFirstSource));

with:

  sms->addSubsession(H264VideoMemBuffServerMediaSubsession::createNew(*env, inputFileName, reuseFirstSource));

My DeviceSource.cpp implementation:

void DeviceSource::doGetNextFrame() {
  // usleep(110000); // live camera simulation (see point 4 below)
  deliverFrame();
}

void DeviceSource::deliverFrame() {
  if (!isCurrentlyAwaitingData()) return; // the sink is not ready for data yet

  // Update the pointer to the new NAL unit and its size:
  // newFrameSize      =
  // newFrameDataStart =

  // Deliver at most fMaxSize bytes; anything beyond that is reported as truncated.
  if (newFrameSize > fMaxSize) {
    fFrameSize = fMaxSize;
    fNumTruncatedBytes = newFrameSize - fMaxSize;
  } else {
    fFrameSize = newFrameSize;
  }

  gettimeofday(&fPresentationTime, NULL); // If you have a more accurate time - e.g., from an encoder - then use that instead.

  printf("Sending %u bytes", fFrameSize);
  if (fNumTruncatedBytes > 0)
    printf(" with truncated %u bytes", fNumTruncatedBytes);
  printf("\tPresentation time: %u.%06u\n",
         (unsigned)fPresentationTime.tv_sec, (unsigned)fPresentationTime.tv_usec);

  memmove(fTo, newFrameDataStart, fFrameSize);
  FramedSource::afterGetting(this);
}
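(When the encoder thread has a new NAL unit ready, I intend to signal the event loop using the event-trigger mechanism from the DeviceSource.cpp template, roughly like this, with ourScheduler/ourDevice still to be filled in:)

// From the DeviceSource.cpp template: called by the encoder thread when a new
// NAL unit is available.  triggerEvent() is the only LIVE555 call that is safe
// to make from a separate thread.
void signalNewFrameData() {
  TaskScheduler* ourScheduler = NULL; // %%% the server's task scheduler
  DeviceSource* ourDevice = NULL;     // %%% the DeviceSource created above
  if (ourScheduler != NULL) { // sanity check
    ourScheduler->triggerEvent(DeviceSource::eventTriggerId, ourDevice);
  }
}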




4. I did one test. I had gathered about 100 NAL units in a memory buffer and passed them on without any delay. I got 25 fps in my video player, and testRTSPClient printed presentation times that tracked real time; there was no big difference between the presentation time printed on the server and on the client.
In the next test I added one line to the doGetNextFrame() function:
usleep(110000);
This line simulates waiting for the next frame from a live camera (a non-blocking alternative is sketched after the logs below). In this case the problem with the presentation time appeared again.

START: Wall clock: 14:24:00
    Server:
        Sending 2993 bytes      Presentation time: 1459773496.311242
Client:
        Received 2989 bytes.    Presentation time: 1459773496.411603

Presentation time difference at the beginning of the transmission: ~100 ms

STOP: Wall clock: 14:25:40, 100 seconds elapsed:
    Server:
        Sending 10577 bytes     Presentation time: 1459773595.847020
    Client:
        Received 10573 bytes.   Presentation time: 1459773532.290209

Presentation time difference at the end of the transmission: ~63 seconds
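Since usleep() blocks the single-threaded LIVE555 event loop, a non-blocking way to simulate the camera delay might be to re-schedule delivery via the task scheduler. Just a sketch - frameReady() and doGetNextFrame0() are hypothetical names of mine:

void DeviceSource::doGetNextFrame() {
  if (!frameReady()) { // hypothetical check: has the encoder produced a new NAL unit yet?
    // Poll again in ~110 ms via the scheduler instead of blocking with usleep(110000):
    nextTask() = envir().taskScheduler().scheduleDelayedTask(110000,
                   (TaskFunc*)doGetNextFrame0, this);
    return;
  }
  deliverFrame();
}

// Static member (would be declared in DeviceSource.hh):
void DeviceSource::doGetNextFrame0(void* clientData) {
  ((DeviceSource*)clientData)->doGetNextFrame();
}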

5. You asked about truncation. Sometimes fMaxSize in DeviceSource is too small and I have to truncate the data; I cannot pass more than fMaxSize bytes, and that value is not under my control.
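From the FAQ I understand that fMaxSize ultimately comes from the sink's output buffer, so raising OutPacketBuffer::maxSize before the RTP sinks are created should avoid the truncation (the 300000 below is just a guess at my largest NAL unit):

  // In the server's main(), before any ServerMediaSubsession/RTPSink is created:
  OutPacketBuffer::maxSize = 300000; // bytes; must be >= the largest NAL unit delivered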

Thanks for any hint.

Paweł
On 01/04/16 17:55, Ross Finlayson wrote:
I have compared logs from my rtsp server ( runned on embedded device) and rtsp 
client ( runned on PC) and I found difference in timestamps.
You mean “presentation time”, not “timestamp”.  (Once again, you do not need to 
concern yourself with RTP timestamps - at all.  Please don’t mention them 
again; that inevitably turns the discussion into a ‘rat hole’, and wastes my 
time.)


On the server has elapsed 4.7 s
During which time you transmit 46 NAL units.  Which implies a frame rate of 
(46-1)/4.7 =~ 9.5 frames per second.  That seems about right.

On the testRTSPClient has elapsed 1.6 s.
More accurately, the computed “presentation times” span 1.6s.  (I’m sure that 
the actual elapsed time is about 4.7s.)

These symptoms suggest that your server is not computing presentation times 
properly.

But you haven’t said anything about specifically how you generate H.264 NAL 
units, and feed them to your “H264VideoRTPSink”.  I suspect that this is your 
real problem.

If you are generating ‘discrete’ H.264 NAL units - i.e., one at a time - then 
you must feed them into a “H264VideoStreamDiscreteFramer”, NOT a 
“H264VideoStreamFramer”.  Also, the H.264 NAL units that you feed into a 
“H264VideoStreamDiscreteFramer” MUST NOT begin with a 4-byte 0x00 0x00 0x00 
0x01 ‘start code’.

(You use a “H264VideoStreamFramer” only if your video comes from an 
unstructured H.264 byte stream - e.g., from a file.  It parses the byte stream 
and computes its own “fPresentationTime” and “fDurationInMicroseconds”.  
Therefore, it’s not what you want.)

Please send us your implementation of the “createNewStreamSource()” and 
“createNewRTPSink” virtual functions.  This may tell us what’s wrong.

Also:
PIPE:LOS: Sending 4137 bytes with truncated 7685 bytes  Presentation time: 
567993728.084010
What is this ‘truncation’?  This is something that you will need to fix.  Your 
“H264VideoStreamDiscreteFramer” object must be fed a complete, untruncated 
H.264 NAL unit.


Ross Finlayson
Live Networks, Inc.
http://www.live555.com/



--
Paweł Domagalski
Software Engineer
Mobica Ltd.
www.mobica.com

