Re: [Live-devel] RTCP and synchronization

2007-07-11 Thread Julian Lamberty
Ok, but I still don't get how the final recipient learns the *original* presentation time (the one that corresponds to the audio) if I generate a new one in the transcoder... Is this done via RTCP? If yes, could you briefly explain it, please? Julian

[Live-devel] High peaks in processing times

2007-07-11 Thread Julian Lamberty
Hi! I've developed a class to transcode video streams. I measure the time between "doGetNextFrame()" and "afterGetting(this)" with the gettimeofday() function. The results show that it takes on average 40 ms to complete one frame (matching the frame rate of 25 Hz). But irregularly one frame

Re: [Live-devel] RTCP and synchronization

2007-07-11 Thread Julian Lamberty
> Because they're not useful to you. > > I'll say this yet again (and hopefully for the last time): Our > RTP/RTCP implementation automatically computes synchronized > presentation times using RTP timestamps and RTCP reports. Code that > receives a RTP stream (using our library) should never h

[Live-devel] RTCP and synchronization

2007-07-04 Thread Julian Lamberty
Hi! As RTCP SR packets carry NTP and RTP timestamps, I would like to know how they are used to sync two streams. I found out that the RTP timestamp in the RTCP SR corresponds to the time value that the NTP timestamp indicates. But why can't I find the RTP timestamp in the packets of my videostrea

Re: [Live-devel] presentationTime and B-Frames

2007-07-02 Thread Julian Lamberty
Ross Finlayson schrieb: >> How is the timeval structure mapped onto the 32bit presentation time in >> the RTP packet, what interval do I have to add to fPresentationTime to >> make RTP timestamp increase by 3600 (i.e. one frame duration @25Hz)? >> > > Receiving code should not need to care abo

Re: [Live-devel] presentationTime and B-Frames

2007-07-02 Thread Julian Lamberty
> Note that In your original MPEG-2 stream, with B-frames, the frames > will be in decoding order, which is different from the display order, > and thus different from the order of frames in the resulting MPEG-4 > stream (because that doesn't have B frames). In particular, the > presentation

[Live-devel] RTCP question

2007-06-29 Thread Julian Lamberty
I've set up an RTCPInstance for my Sink exactly as shown in the testprogs, but there are no RTCP packets sent. Also the vobStreamer just sends one single RTCP packet. Why is that? Shouldn't there be more RTCP traffic? Julian

[Live-devel] presentationTime and B-Frames

2007-06-29 Thread Julian Lamberty
Hi! I'm transcoding MPEG-2 (with B-Frames) into MPEG-4 (without B-Frames, just I and P). Currently I'm just transcoding video, but later the transcoded streams should be able to be synchronised with the source's audio stream: > Audio |

Re: [Live-devel] FramedFilter Performance & Questions

2007-06-27 Thread Julian Lamberty
Another interesting effect is that the average total processing time for one frame (measured from the start of doGetNextFrame() until after afterGetting(this)) is always close to 40ms, no matter how long my transcoder needs to process one frame (assuming that this time stays below ~35ms, which

Re: [Live-devel] FramedFilter Performance & Questions

2007-06-27 Thread Julian Lamberty
No. Assuming that your "afterGettingFrame()" function was passed as a parameter to "getNextFrame()", then it will be called by "FramedSource::afterGetting()" (which your "doGetNextFrame()" implementation should have called once it completed delivery of incoming data). So is my code correc

Re: [Live-devel] FramedFilter Performance & Questions

2007-06-27 Thread Julian Lamberty
Am I right assuming that my afterGettingFrame() function is called by the TaskScheduler? Sometimes it happens that afterGettingFrame() is called before the first call has completed. Could anyone explain why sometimes I have a delay of >10ms between two calls to afterGettingFrame()?

[Live-devel] FramedFilter Performance & Questions

2007-06-25 Thread Julian Lamberty
OK, I've investigated the problem further by adding timestamps at the beginning and the end of doGetNextFrame(), afterGettingFrame() and afterGettingFrame1(). As you can see from the code snippet in this thread, every time my transcoder needs more data to complete the frame it calls fInputSource-

Re: [Live-devel] FramedFilter Performance & Questions

2007-06-24 Thread Julian Lamberty
If your "Transcoder" filter delivers discrete MPEG-4 frames, and sets "fPresentationTime" and "fDurationInMicroseconds" properly, then you don't need to insert a "MPEG4VideoStreamDiscreteFramer". It does, but if I leave out the framer, the sink doesn't even call doGetNextFrame() of my clas

[Live-devel] FramedFilter Performance & Questions

2007-06-23 Thread Julian Lamberty
Hi! I've implemented a FramedFilter subclass that transcodes video from MPEG-2 to MPEG-4 using the libavcodec library. For this I use a live-"chain" that looks like: MPEG1or2VideoRTPSource -> Transcoder (my class) -> MPEG4VideoStreamDiscreteFramer -> MPEG4ESVideoRTPSink The Transcoder's

Re: [Live-devel] PES packet sizes

2007-06-19 Thread Julian Lamberty
Hi Christian ;) I am having a few problems while passing PES stripped from DVB-T MPEG-TS to the MPEGDemux class. PES coming from DVB-T are indeed very peculiar. First of all, they can be very big. Some channels had PESs up to 150KB... so first question: is this correct? Can PES really be unbo

[Live-devel] vobStreamer questions

2007-06-15 Thread Julian Lamberty
Hi! I would like to know how vobStreamer, respectively MPEG1or2VideoRTPSource, encapsulates the video stream from a DVD into the RTP packets. Does it extract the elementary streams from the program stream and pack the elementary streams according to RFC 2250? Why does Wireshark identify the RTP

Re: [Live-devel] doGetNextFrame()

2007-06-09 Thread Julian Lamberty
Note that the error about "attempting to read more than once" occurs if you call "getNextFrame()" on the *same* object a second time, before the 'after getting' function from the first call has been invoked. I.e., it's OK to do: source->getNextFrame(..., afterGettingFunc, ...);

[Live-devel] doGetNextFrame()

2007-06-08 Thread Julian Lamberty
> Are you sure that your "Transcoder" code is always correctly setting "fFrameSize" before calling "afterGetting(this)"? OK, I found the problem I think, but now I need support: I made a call to afterGetting(this) even if there was no encoded frame available. I changed that now, but there is a

[Live-devel] afterGetting(this)

2007-06-08 Thread Julian Lamberty
Hi! I would like to know how I can 1. Request more data from a source (MPEG1or2VideoRTPSource) 2. Tell a sink (MPEG4VideoStreamDiscreteFramer) that data is completely delivered to the buffer (this should be afterGetting(this), right?) Right now I have a code structure like: void Transcoder::doGe

[Live-devel] doGetNextFrame()

2007-06-08 Thread Julian Lamberty
Hi! I've added some stdout messages to MPEG4VideoStreamDiscreteFramer and I can see that there are many calls to doGetNextFrame() and afterGettingFrame1() even if my source did not deliver one frame: MPEG4VideoStreamDiscreteFramer::doGetNextFrame() passed MPEG4VideoStreamDiscreteFramer::after

[Live-devel] Problem Streaming MPEG4ES from buffer

2007-06-06 Thread Julian Lamberty
Thanks for your reply, I changed my code. Nevertheless, enc_bytes never exceeded fMaxSize in my program before, and thus the problem still exists. VLC, which I use to play the stream, reports loads of errors regarding late pictures and damaged headers... :( Any more ideas? Thank you! Julian

[Live-devel] Problem Streaming MPEG4ES from buffer

2007-06-06 Thread Julian Lamberty
Hi Severin! How did you do that, I'm trying exactly the same. But when I use MPEG4VideoStreamFramer instead of the discrete one I get errors that say: StreamParser::afterGettingBytes() warning: read 23329 bytes; expected no more than 10026 MultiFramedRTPSink::afterGettingFrame1(): The input f

[Live-devel] Problem Streaming MPEG4ES from buffer

2007-06-06 Thread Julian Lamberty
If I dump the stream at the receiver with openRTSP I also get a corrupted stream, so it's not a problem with the player I'm using (VLC with live555 support).

[Live-devel] Problem Streaming MPEG4ES from buffer

2007-06-06 Thread Julian Lamberty
Hi! I have ffmpeg code that writes MPEG4 frames into a buffer. I want to stream these frames over RTP/RTSP. At the moment I pass the buffer to an "MPEG4VideoStreamDiscreteFramer" that sends it to an "MPEG4ESVideoRTPSink". But the stream I receive is totally corrupted. Using wireshark I can see

[Live-devel] MPEG4VideoStreamDiscreteFramer

2007-06-04 Thread Julian Lamberty
Hi! I have an encoder that writes MPEG4 frames to a buffer which I want to stream out with MPEG4ESVideoRTPSink. I'm using the MPEG4VideoStreamDiscreteFramer class in between. The problem is that the stream is corrupted, but if I write the buffer to a file like "test.m4e" it can be played cor

[Live-devel] FramedFilter questions

2007-06-03 Thread Julian Lamberty
As you already guessed, I did that ;) I took MP3ADUTranscoder as an example, where presentation time and duration are just passed through. I don't get why my code does not work. When I write one complete frame to fTo, MPEG4VideoStreamDiscreteFramer should be able to pass it correctly to MPEG4

[Live-devel] FramedFilter questions

2007-06-02 Thread Julian Lamberty
Hi! I have some questions related to the FramedFilter class. I subclassed it to transcode an MPEG2 stream to MPEG4. So the structure of my program looks like: MPEG1or2VideoRTPSource -> Transcoder (my class) -> MPEG4VideoStreamDiscreteFramer -> MPEG4ESVideoRTPSink My Transcoder class reads dat

[Live-devel] UDP checksum wrong

2007-06-01 Thread Julian Lamberty
Sorry, my fault ;) You should not run wireshark on the same computer that sends the packets...

[Live-devel] UDP checksum wrong

2007-06-01 Thread Julian Lamberty
Hi! Since my transcoder now transcodes I have a new problem: I deliver complete MPEG4 Frames to MPEG4VideoStreamDiscreteFramer followed by MPEG4ESVideoRTPSink. But all the packets sent have a wrong UDP checksum. Wireshark reports that a lot of subsequent UDP packets have the SAME checksum. For

[Live-devel] StreamParser::afterGettingBytes() warning

2007-05-30 Thread Julian Lamberty
OK, I found the solution to *this* problem ;) I'm now using MPEG4VideoStreamDiscreteFramer and get no more errors reported. Thank you anyway! smime.p7s Description: S/MIME Cryptographic Signature ___ live-devel mailing list live-devel@lists.live555.c

[Live-devel] StreamParser::afterGettingBytes() warning

2007-05-30 Thread Julian Lamberty
Humm, actually it should not:

    if (enc_bytes > fMaxSize) {
      fFrameSize = fMaxSize;
      fNumTruncatedBytes = enc_bytes - fMaxSize;
    } else {
      fFrameSize = enc_bytes;
      fNumTruncatedBytes = 0;
    }

[Live-devel] StreamParser::afterGettingBytes() warning

2007-05-30 Thread Julian Lamberty
Hi! When I try to stream out MPEG4 content using MPEG4VideoStreamFramer and MPEG4ESVideoRTPSink I frequently get errors that say: StreamParser::afterGettingBytes() warning: read X bytes; expected no more than Y (where X is fFrameSize). What am I doing wrong?

[Live-devel] Read multiple frames?

2007-05-28 Thread Julian Lamberty
Therefore, if you are feeding input from a "MPEG1or2VideoRTPSource" into a decoder, and your decoder is not smart enough to decode one slice at a time, then you must aggregate the input data into complete video frames before feeding them to your decoder. Can this be done with live555 stuff? J

[Live-devel] Read multiple frames?

2007-05-27 Thread Julian Lamberty
Hi! I've subclassed FramedFilter to transcode a MPEG2 Stream (from vobStreamer) to MPEG4. My "live-chain" is: MPEG1or2VideoRTPSource -> Transcoder -> FileSink I use the following code: void Transcoder::doGetNextFrame() { fInputSource->getNextFrame(fOrg, 4096, afterGettingFrame, this, han