I'm using an Elgato DVB-T USB tuner (EyeTV for DTT) together with the liveMedia library to stream an MPEG2-TS stream containing H.264 video and AAC audio (LATM encapsulation) via RTSP over my LAN. I've based my streaming server on the testMPEG2TransportStreamer sample.
Does the unchanged "testMPEG2TransportStreamer" demo application work correctly when you stream a Transport Stream *file* that you previously recorded from your capture device? It should, but if, for some reason, it doesn't, then do not proceed to step 2 :-)
The EyeTV plugin SDK provides a callback that is invoked when roughly 100 packets have arrived; it sends the raw TS data through a pipe to the server code, which runs in another thread. I've modified the code to read from the other end of the pipe I created. I've also removed the MPEG2TransportStreamFramer from the chain, as I presumed the packet stream was already in that format.
No; the "MPEG2TransportStreamFramer" also has the job of (1) accumulating the incoming Transport Stream packets into groups of 7, so that they end up getting packed efficiently into outgoing RTP packets, and (2) computing the presentation time and duration of each Transport Stream packet, so that the outgoing RTP packets get transmitted with appropriate spacing between them. You should continue to use the "MPEG2TransportStreamFramer" object.
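For reference, here is a minimal sketch of such a chain, with the framer kept in place. It assumes one possible way of reading the pipe: wrapping the pipe's read end with fdopen() and feeding it to a "ByteStreamFileSource". The descriptor name "pipeReadFd" and the multicast address/port/TTL are just placeholders (the sink setup follows the "testMPEG2TransportStreamer" demo), and the exact Groupsock/address setup may differ between library versions:

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"
#include "GroupsockHelper.hh"
#include <cstdio>

// Called when the source reaches end-of-stream; a real server would
// clean up or restart here.
static void afterPlaying(void* /*clientData*/) {}

// "pipeReadFd" is the (hypothetical) read end of the pipe that the
// EyeTV callback thread writes the raw 188-byte TS packets into.
void streamTransportStreamFromPipe(int pipeReadFd) {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  // Wrap the pipe's read end in a FILE* so a ByteStreamFileSource can read it.
  FILE* pipeFile = fdopen(pipeReadFd, "rb");
  FramedSource* rawSource = ByteStreamFileSource::createNew(*env, pipeFile);

  // Keep the framer in the chain: it groups Transport Stream packets
  // (7 per outgoing RTP packet) and computes presentation times and
  // durations so the RTP packets are sent with appropriate spacing.
  FramedSource* tsSource = MPEG2TransportStreamFramer::createNew(*env, rawSource);

  // Destination address, port and TTL are placeholders, as in the demo.
  struct in_addr destinationAddress;
  destinationAddress.s_addr = our_inet_addr("239.255.42.42");
  const Port rtpPort(1234);
  const unsigned char ttl = 7;
  Groupsock rtpGroupsock(*env, destinationAddress, rtpPort, ttl);

  // Raw MPEG-2 Transport Stream over RTP: payload type 33, 90 kHz clock.
  RTPSink* videoSink = SimpleRTPSink::createNew(*env, &rtpGroupsock,
                                                33, 90000, "video", "MP2T",
                                                1, True, False);

  videoSink->startPlaying(*tsSource, afterPlaying, videoSink);
  env->taskScheduler().doEventLoop(); // does not return
}

This is only a sketch under those assumptions, not a tested replacement for your EyeTV glue code; the point is simply that the data read from the pipe should pass through the "MPEG2TransportStreamFramer" before reaching the "SimpleRTPSink".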
--
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/