Thanks for the quick reply! Yes, I'm still developing an RTSP server. Below is my pipeline.
H.264 Baseline profile, level 4.0 source (I and P frames only) -> ffmpeg (2.5) mpegts muxer -> custom FramedSource (based on DeviceSource) that takes 1316-byte (188 * 7) output chunks from the mpegts muxer and writes all *chunks* for each H.264 frame to `fTo`, setting fPresentationTime (via gettimeofday) per H.264 frame, not per chunk.

I'm not quite sure how to set `fDurationInMicroseconds`. I tried setting it per chunk and got the same behavior as outlined in the previous email.

Some background on my sources: the H.264 encoder is commercial but not widely used, and ffmpeg (libavformat) muxes the H.264 data directly into an mpegts bytestream without modification. If I dump the H.264 data directly to a file, it works. If I use `testH264VideoStreamer` with the H.264 data, it works. If I dump the mpegts data directly to a file, it also works, BUT when I use `testOnDemandRTSPServer` with that mpegts file, there is a large 4-5 second startup delay when a client connects, on BOTH the client (openRTSP) and the server. I traced this to MPEG2TransportStreamFramer. Since I use my own FramedSource instead of MPEG2TransportStreamFramer, I now see the delay only in `openRTSP`. I assume something is incorrect in my mpegts bitstream; I just haven't been able to figure out what. I can upload an mpegts file straight from my muxer if that would be helpful.

(Here "works" means it plays in all common players and shows no warnings/errors in the tools.)

Thanks again Ross. I'm probably missing something simple. Advice is much appreciated.

-ac

On Thu, Jun 4, 2015 at 10:29 AM, Ross Finlayson <finlay...@live555.com> wrote:
> I have an MPEGTS muxer (ffmpeg) that spits out 1316 (188 * 7) byte chunks
> of valid data at a time. In my FramedSource subclass, I can write this to a
> file via fwrite and have a perfectly playable mpegts file. I've been
> sending data to `fTo` in 1316 byte chunks (as hinted in our previous
> conversation). The library gives no error or warning, and it looks like the
> data is getting sent out correctly, judging by I/O rates in Wireshark. My
> total rate is only about ~2 Mbit/s, but `openRTSP` only writes its output
> file at ~160 Kbit/s.
>
> I am curious what the proper "rate" is to feed `fTo` for my MPEGTS
> FramedSource. I figured <= MTU-sized data would be best, but it seems maybe
> something else. Ideas?
>
>
> Can you describe some more how your system is structured? (I assume,
> based on your previous message, that you're still developing an RTSP server.)
>
> In particular, does your "createNewStreamSource()" implementation (in your
> "OnDemandServerMediaSubsession" subclass) create only an instance of your
> "FramedSource" subclass, or does it also feed this to a
> "MPEG2TransportStreamFramer"? (If your "FramedSource" subclass is getting
> data from a live input source (e.g., from an encoder), rather than from a
> pre-recorded file, then you shouldn't need to feed it to a
> "MPEG2TransportStreamFramer"; however, in that case it's important that
> your "FramedSource" subclass set "fPresentationTime" and
> "fDurationInMicroseconds" correctly, so that the downstream object (a
> "SimpleRTPSink" in this case) will request data at the correct rate.)
>
>
> Ross Finlayson
> Live Networks, Inc.
> http://www.live555.com/
>
>
> _______________________________________________
> live-devel mailing list
> live-devel@lists.live555.com
> http://lists.live555.com/mailman/listinfo/live-devel
>