Re: [Live-devel] Framed Source Issue

2009-03-30 Thread Ross Finlayson
However, audio still gets called much more often than it needs to, as per the fDurationInMicroseconds parameter. Remember, You Have Complete Source Code. I have told you *repeatedly* what you need to look at to figure out why your problem is happening. Why do you keep ignoring my advice?? …
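[Context for the duration question in this thread: the rate at which a sink pulls from a FramedSource is paced by fDurationInMicroseconds. A minimal sketch of the arithmetic a source subclass would use to compute that value; the function name and the sample values (1152 samples/frame, 44.1 kHz, typical of MPEG audio) are illustrative assumptions, not taken from the thread:]

```cpp
// Illustrative helper, not part of LIVE555: compute the duration of one
// audio frame in microseconds, i.e. the value a FramedSource subclass
// would assign to fDurationInMicroseconds before calling afterGetting().
unsigned frameDurationMicroseconds(unsigned samplesPerFrame, unsigned samplingHz) {
    // Round to the nearest microsecond to avoid systematic drift.
    return (unsigned)((samplesPerFrame * 1000000.0) / samplingHz + 0.5);
}
```

If this value is left at 0 (or set too small), the sink asks for the next frame immediately, which would explain the source being called far more often than real time requires.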

Re: [Live-devel] Framed Source Issue

2009-03-30 Thread Mukherjee, Debargha
[mailto:live-devel-boun...@ns.live555.com] On Behalf Of Ross Finlayson > Sent: Friday, March 27, 2009 3:41 PM > To: LIVE555 Streaming Media - development & use > Subject: Re: [Live-devel] Framed Source Issue > On Mar 27, 2009, at 3:22 PM, "Mukherjee, Debargha" wrote: …

Re: [Live-devel] Framed Source Issue

2009-03-27 Thread Ross Finlayson
On Mar 27, 2009, at 3:22 PM, "Mukherjee, Debargha" wrote: "Thanks. How about the MPEG4 video? I am currently encoding video frames into MPEG4 and then using the MPEG4VideoStreamFramer class before feeding into MPEG4ESVideoRTPSink. Is that correct?" No. Because your input source …

Re: [Live-devel] Framed Source Issue

2009-03-27 Thread Mukherjee, Debargha
[mailto:live-devel-boun...@ns.live555.com] On Behalf Of Ross Finlayson > Sent: Friday, March 27, 2009 2:42 PM > To: LIVE555 Streaming Media - development & use > Subject: Re: [Live-devel] Framed Source Issue > On Mar 27, 2009, at 9:28 AM, "Mukherjee, Debargha" wrote: …

Re: [Live-devel] Framed Source Issue

2009-03-27 Thread Ross Finlayson
On Mar 27, 2009, at 9:28 AM, "Mukherjee, Debargha" wrote: Hi Ross, I am still somewhat confused. The parameter fDurationInMicroseconds is being set correctly by me in the deliverFrame() function of my AudioEncSource class before the call to FramedSource::afterGetting(this). Could you point…

Re: [Live-devel] Framed Source Issue

2009-03-27 Thread Mukherjee, Debargha
[mailto:live-devel-boun...@ns.live555.com] On Behalf Of Ross Finlayson > Sent: Thursday, March 26, 2009 6:00 PM > To: LIVE555 Streaming Media - development & use > Subject: Re: [Live-devel] Framed Source Issue > Also it seems to me that the MPEG1or2AudioStreamFramer…

Re: [Live-devel] Framed Source Issue

2009-03-26 Thread Ross Finlayson
"Also it seems to me that the MPEG1or2AudioStreamFramer class does not use the fDurationInMicroseconds parameter at all." Yes it does: it *sets* this parameter (see line 144). Once again, look at the parameters that are passed to each call to "void MultiFramedRTPSink::afterGettingFrame()". …

Re: [Live-devel] Framed Source Issue

2009-03-26 Thread Mukherjee, Debargha
-----Original Message----- > From: live-devel-boun...@ns.live555.com [mailto:live-devel-boun...@ns.live555.com] On Behalf Of Ross Finlayson > Sent: Thursday, March 19, 2009 4:58 PM > To: LIVE555 Streaming Media - development & use > Subject: Re: [Live-devel] Framed Source Issue > …

Re: [Live-devel] Framed Source Issue

2009-03-19 Thread Ross Finlayson
"First, is it possible to explain using TaskScheduler::scheduleDelayedTask(), as you suggested, a little better?" In your implementation of "doGetNextFrame()", if no input data is currently available to be delivered to the downstream object, then you could just call "TaskScheduler::scheduleDelayedTask()"…

Re: [Live-devel] Framed Source Issue

2009-03-18 Thread Ross Finlayson
I am having an issue on the streaming-out side. The audio and video encoders read raw data from shared buffers using two derived FramedSource classes modeled after DeviceSource.cpp. The deliverFrame() function in these derived classes reads raw audio or video from the respective shared buffer, encodes…
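[For readers following this thread: a DeviceSource-style deliverFrame() typically copies into the sink's buffer, truncating if needed, and sets the frame bookkeeping before afterGetting(). The member names below (fTo, fMaxSize, fFrameSize, fNumTruncatedBytes, fDurationInMicroseconds) match LIVE555's FramedSource, but the enclosing struct is a standalone mock, not the real class, so the logic can run without the library:]

```cpp
#include <cstring>

// Standalone mock of the delivery step in a DeviceSource-style FramedSource.
struct MockFramedSource {
    unsigned char* fTo;               // sink's destination buffer
    unsigned fMaxSize;                // capacity of that buffer
    unsigned fFrameSize;              // bytes actually delivered
    unsigned fNumTruncatedBytes;      // bytes dropped if the frame was too big
    unsigned fDurationInMicroseconds; // paces how soon the sink asks again

    void deliverFrame(const unsigned char* data, unsigned dataSize,
                      unsigned durationUs) {
        if (dataSize > fMaxSize) {
            // Frame exceeds the sink's buffer: truncate, and record how much.
            fFrameSize = fMaxSize;
            fNumTruncatedBytes = dataSize - fMaxSize;
        } else {
            fFrameSize = dataSize;
            fNumTruncatedBytes = 0;
        }
        std::memmove(fTo, data, fFrameSize);
        // Set the duration *before* completing delivery; in the real class
        // the next line would be FramedSource::afterGetting(this);
        fDurationInMicroseconds = durationUs;
    }
};
```

Setting fDurationInMicroseconds at this point, before afterGetting(), is exactly what the rest of the thread turns on: if it is skipped, the sink re-polls immediately.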