I believe I have found a solution. I implemented a strategy where the read side of my circular buffer blocks until a new frame arrives, similar to a blocking socket read. With that change everything works, and I no longer need to worry about those variables, just as expected from a DeviceSource. A sketch of the idea follows.
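To illustrate, here is a minimal sketch of such a blocking read from a shared-memory circular buffer, using a process-shared POSIX semaphore to count available frames. The layout, names, and sizes (SharedRing, kSlots, and so on) are assumptions made up for the example, not taken from my actual code:

#include <semaphore.h>
#include <cstdint>
#include <cstring>

// Hypothetical layout of the ring buffer placed in shared memory.
// Whichever process creates the segment must call, exactly once,
// sem_init(&ring->framesAvailable, /*pshared=*/1, /*value=*/0).
struct SharedRing {
  sem_t framesAvailable;            // counts frames written but not yet read
  uint32_t readIdx;                 // next slot the reader will consume
  uint32_t writeIdx;                // next slot the writer will fill
  static const uint32_t kSlots = 16;
  static const uint32_t kSlotSize = 512 * 1024;
  uint32_t sizes[kSlots];           // encoded size stored in each slot
  uint8_t data[kSlots][kSlotSize];  // the encoded frames themselves
};

// Writer side (capture process): copy the frame in, then post.
void writeFrame(SharedRing* ring, const uint8_t* src, size_t len) {
  uint32_t i = ring->writeIdx % SharedRing::kSlots;
  if (len > SharedRing::kSlotSize) len = SharedRing::kSlotSize;
  std::memcpy(ring->data[i], src, len);
  ring->sizes[i] = static_cast<uint32_t>(len);
  ring->writeIdx++;
  sem_post(&ring->framesAvailable);  // wake the reader if it is blocked
}

// Reader side (RTSP server process): sleeps until a frame exists,
// just like a read() on a blocking socket.
size_t readFrame(SharedRing* ring, uint8_t* dst, size_t maxLen) {
  sem_wait(&ring->framesAvailable);  // block until the writer posts
  uint32_t i = ring->readIdx % SharedRing::kSlots;
  size_t n = ring->sizes[i];
  if (n > maxLen) n = maxLen;        // clip to the caller's buffer
  std::memcpy(dst, ring->data[i], n);
  ring->readIdx++;
  return n;
}

sem_wait() suspends the reading process until the capture process calls sem_post(), which is what makes the read behave like a blocking socket: frames are consumed at the rate they are actually produced.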
Best regards,
Flavio

On Thu, 18 Apr 2024 at 16:07, Flavio Alves <flavio.al...@vitalintelligencedata.com> wrote:

> Hello,
>
> I'm working on a live streaming service using Live555 on an Nvidia Jetson Nano board.
>
> I'm capturing frames from a USB webcam and encoding them with Nvidia's hardware encoder. Live555 is then responsible for streaming this capture over RTSP.
>
> Nvidia's encoder API uses several threads, and I was unable to make it work inside a single application, so I implemented two applications: the RTSP server and the capture application.
>
> They communicate through shared memory on Linux. I implemented a circular buffer in this shared memory to hold the encoded frames, which the RTSP server application then reads.
>
> I created custom classes for the media subsession (derived from OnDemandServerMediaSubsession) and for the device source (derived from FramedSource).
>
> The software is almost working. I had issues with adding SPS and PPS and with timestamps on the encoded frames, but those now seem to be fine.
>
> What is happening now is that the application seems to be ignoring the fDurationInMicroseconds and/or fPresentationTime values, and the frame rate is not respected when streaming: the video plays back much too fast.
>
> I would like to ask for advice on how to properly address this problem.
>
> Best regards,
>
> Flavio
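Postscript for anyone who finds this thread with the same symptom: below is a minimal sketch of the delivery path in a FramedSource subclass, loosely following the DeviceSource template that ships with live555, showing where fPresentationTime and fDurationInMicroseconds get set. The class name MyDeviceSource, the readEncodedFrame() helper, and the fixed 30 fps duration are assumptions for the example only:

#include "FramedSource.hh"
#include <sys/time.h>

class MyDeviceSource : public FramedSource {
public:
  static MyDeviceSource* createNew(UsageEnvironment& env) {
    return new MyDeviceSource(env);
  }

protected:
  MyDeviceSource(UsageEnvironment& env) : FramedSource(env) {}

private:
  virtual void doGetNextFrame() {
    // Hypothetical blocking helper (see the ring-buffer sketch above):
    // returns the frame's full size and copies at most fMaxSize bytes
    // into fTo. It does not return until a frame has actually arrived.
    unsigned newFrameSize = readEncodedFrame(fTo, fMaxSize);

    if (newFrameSize > fMaxSize) {
      fFrameSize = fMaxSize;
      fNumTruncatedBytes = newFrameSize - fMaxSize;
    } else {
      fFrameSize = newFrameSize;
      fNumTruncatedBytes = 0;
    }

    gettimeofday(&fPresentationTime, NULL);  // wall-clock capture time
    fDurationInMicroseconds = 1000000 / 30;  // assuming a 30 fps source

    // Hand the completed frame to the downstream object.
    FramedSource::afterGetting(this);
  }

  unsigned readEncodedFrame(unsigned char* dst, unsigned maxSize);  // elided
};

With a blocking readEncodedFrame() in place, doGetNextFrame() only completes when a real frame has arrived, so delivery is paced by the capture process itself; that is what fixed the too-fast playback for me.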