Hi Ross,
Having explored the issue further, I have narrowed down the problem, but not
quite solved it yet. Any hints or help would be appreciated.
I am using MP2 audio encoding, for which the compressed frame size is supposed to
be 576 bytes (sampling rate is 32 kHz, single channel). However, oc
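For reference, 576 bytes is what one would expect for a 128 kbps Layer II stream at 32 kHz; the bitrate is an assumption here, since it is not stated above. A minimal sketch of the arithmetic:

// MPEG-1 Layer II carries 1152 PCM samples per frame, so the compressed frame
// size (ignoring the padding bit) is 144 * bitrate / samplingRate bytes.
// The 128000 bps bitrate below is an assumption; it is not stated above.
#include <cstdio>

int main() {
    const unsigned bitrate      = 128000; // bps (assumed)
    const unsigned samplingRate = 32000;  // Hz, single channel
    std::printf("frame size: %u bytes\n", 144 * bitrate / samplingRate);      // 576
    std::printf("frame duration: %u us\n", 1152 * 1000000 / samplingRate);    // 36000
    return 0;
}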
Hi,
I am receiving uncompressed audio and video from a live DirectShow source,
encoding them with ffmpeg, and streaming them out using an RTSP server based on
classes derived from LIVE555's OnDemandServerMediaSubsession. Then I play the feed
on a remote machine using VLC. The problem is that th
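For anyone skimming these results: the usual shape of such a server-side class is a small OnDemandServerMediaSubsession subclass that overrides the two factory methods. The sketch below uses "AudioEncSource" as a placeholder for a FramedSource subclass wrapping the ffmpeg encoder; it is an illustration, not the code from this thread.

#include "liveMedia.hh"

class AudioEncSource; // placeholder: your FramedSource subclass around the encoder

class LiveAudioSubsession : public OnDemandServerMediaSubsession {
public:
    LiveAudioSubsession(UsageEnvironment& env)
        : OnDemandServerMediaSubsession(env, True /*reuseFirstSource*/) {}

protected:
    virtual FramedSource* createNewStreamSource(unsigned /*clientSessionId*/,
                                                unsigned& estBitrate) {
        estBitrate = 128; // kbps, used for RTCP bandwidth estimation
        // Replace with the creation of your encoder-wrapping source, e.g.:
        return AudioEncSource::createNew(envir()); // placeholder call
    }

    virtual RTPSink* createNewRTPSink(Groupsock* rtpGroupsock,
                                      unsigned char /*rtpPayloadTypeIfDynamic*/,
                                      FramedSource* /*inputSource*/) {
        // MPEG audio uses static RTP payload type 14, handled by this sink.
        return MPEG1or2AudioRTPSink::createNew(envir(), rtpGroupsock);
    }
};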
Hi Jeremy,
I am currently trying to attack a similar problem with the ffmpeg encoder
(which I believe also uses x264). While I am not an expert on LIVE555 either, from
reading the archives this is what I have understood so far, and this is what I
would try next. You need to send NAL units to your
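The truncated advice above is, as far as I can tell, about delivering one NAL unit at a time with the Annex B start codes removed. A rough sketch of that splitting step, offered only as an illustration of the idea, not a definitive recipe:

// Splits an Annex B buffer (as produced by x264/ffmpeg) into individual NAL
// units with the 3- or 4-byte start codes removed.
#include <cstdint>
#include <cstddef>
#include <vector>

struct NalUnit { const uint8_t* data; size_t size; };

static std::vector<NalUnit> splitAnnexB(const uint8_t* buf, size_t len) {
    std::vector<NalUnit> nals;
    size_t i = 0, start = 0;
    bool inNal = false;
    while (i + 3 <= len) {
        if (buf[i] == 0 && buf[i+1] == 0 &&
            (buf[i+2] == 1 || (i + 4 <= len && buf[i+2] == 0 && buf[i+3] == 1))) {
            size_t scLen = (buf[i+2] == 1) ? 3 : 4; // 3- or 4-byte start code
            if (inNal) nals.push_back({buf + start, i - start});
            i += scLen;
            start = i;
            inNal = true;
        } else {
            ++i;
        }
    }
    if (inNal && start < len) nals.push_back({buf + start, len - start});
    return nals;
}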
@ns.live555.com] On Behalf Of Ross Finlayson
> Sent: Friday, March 27, 2009 3:41 PM
> To: LIVE555 Streaming Media - development & use
> Subject: Re: [Live-devel] Framed Source Issue
>
> On Mar 27, 2009, at 3:22 PM, "Mukherjee, Debargha"
> > wrote:
>
>
evel-boun...@ns.live555.com] On Behalf Of Ross Finlayson
> Sent: Friday, March 27, 2009 2:42 PM
> To: LIVE555 Streaming Media - development & use
> Subject: Re: [Live-devel] Framed Source Issue
>
> On Mar 27, 2009, at 9:28 AM, "Mukherjee, Debargha"
> > wrote:
>
>
Hi Ross,
I am still somewhat confused.
The parameter fDurationInMicroseconds is being set correctly by me in the
deliverFrame() function of my AudioEncSource class, before the call to
FramedSource::afterGetting(this). Could you point me to where in your code it
is actually used to decide when t
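For readers landing here from a search, the pattern expected of such a FramedSource subclass appears to be roughly the following; apart from the f... members inherited from FramedSource and afterGetting(), the names (fEncoder, fEncodedBuf, encodeNextFrame) are placeholders, not code from this thread.

// Sketch of a deliverFrame() for a FramedSource subclass such as the
// AudioEncSource mentioned above (assumes "FramedSource.hh", <string.h> and
// <sys/time.h> are included). fTo, fMaxSize, fFrameSize, fNumTruncatedBytes,
// fPresentationTime and fDurationInMicroseconds are inherited from FramedSource.
void AudioEncSource::deliverFrame() {
    if (!isCurrentlyAwaitingData()) return; // the sink has not asked for data yet

    unsigned encodedSize = fEncoder->encodeNextFrame(fEncodedBuf); // placeholder call

    if (encodedSize > fMaxSize) {
        fNumTruncatedBytes = encodedSize - fMaxSize;
        fFrameSize = fMaxSize;
    } else {
        fNumTruncatedBytes = 0;
        fFrameSize = encodedSize;
    }
    memmove(fTo, fEncodedBuf, fFrameSize);

    // One MPEG audio frame = 1152 samples => 36000 us at 32 kHz.
    fDurationInMicroseconds = 1152 * 1000000 / 32000;
    gettimeofday(&fPresentationTime, NULL);

    // Must be the last statement: it may cause this source to be re-invoked.
    FramedSource::afterGetting(this);
}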
Hi Ross,
Thanks for the pointers, but I am still struggling with the issue of the audio
source being called much more often than it needs to be.
Please find below my derived audio encoding class implementation.
My constructor takes in a structure with encoding parameters, along with two
pointers to extern
Hi,
What is the best way to check, from an RTSP/RTP client using the LIVE555
libraries, that the server has dropped the connection, or that one or more of
the RTP streams have died?
Thanks,
Debargha.
**
Debargha Mukherjee, Ph.D.
Senior Research Scientist
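Two client-side checks, modeled on what openRTSP itself does, are an RTCP "BYE" handler and a periodic look at the per-subsession packet counters; a hedged sketch of both:

#include "liveMedia.hh"

// 1) Register an RTCP "BYE" handler, which fires when the server tears the
//    session down cleanly.
void subsessionByeHandler(void* clientData) {
    MediaSubsession* subsession = (MediaSubsession*)clientData;
    // The server sent RTCP "BYE" for this subsession: tear down / reconnect here.
}

void watchSubsession(MediaSubsession* subsession) {
    if (subsession->rtcpInstance() != NULL) {
        subsession->rtcpInstance()->setByeHandler(subsessionByeHandler, subsession);
    }
}

// 2) Call this periodically (e.g. via scheduleDelayedTask()); if the packet
//    count stops increasing, the stream has most likely died.
//    "prevPacketCount" would be stored per subsession in a real client.
Boolean streamLooksAlive(MediaSubsession* subsession, unsigned& prevPacketCount) {
    RTPSource* src = subsession->rtpSource();
    if (src == NULL) return False;
    unsigned cur = src->receptionStatsDB().totNumPacketsReceived();
    Boolean alive = (cur != prevPacketCount);
    prevPacketCount = cur;
    return alive;
}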
Thanks Ross for the pointers.
First, could you explain how to use TaskScheduler::scheduleDelayedTask(), as
you suggested, in a little more detail?
Second, I also suspect that the freezing issue has to do with the timestamps
and the duration.
I am setting the duration in microsecs as 26122 (for 44.1
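In case it helps later readers: 26122 us does match one 1152-sample MPEG audio frame at 44.1 kHz (1152/44100 s is about 26122 us), and the scheduleDelayedTask() idiom for pacing a live encoder looks roughly like the sketch below. The class and member names are placeholders, not LIVE555 classes.

// The point of the sketch is only the scheduleDelayedTask() call, which asks
// the scheduler to call us back after one audio frame's duration instead of
// delivering data as fast as the sink requests it.
void AudioEncSource::doGetNextFrame() {
    // 1152 samples per MPEG audio frame; 1152 * 1000000 / 44100 = 26122 us.
    unsigned uSecsPerFrame = (unsigned)(1152 * 1000000.0 / 44100 + 0.5);
    nextTask() = envir().taskScheduler().scheduleDelayedTask(
        uSecsPerFrame, deliverFrame0, this);
}

// A static member with the TaskFunc signature, used as the callback above.
void AudioEncSource::deliverFrame0(void* clientData) {
    ((AudioEncSource*)clientData)->deliverFrame(); // fills fTo, then afterGetting()
}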
Hi,
I am developing a multi-threaded application using the LIVE555 libraries that
receives audio and video from several Axis IP cameras concurrently as RTSP
clients, decodes them to raw data, processes the data, creates a single
outbound audio/video stream, re-encodes it, and streams it out to a remote receiver a
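One well-documented constraint for such a design is that LIVE555 objects must not be shared across threads: each thread needs its own TaskScheduler and UsageEnvironment. A minimal per-thread setup sketch (the camera and sink handling itself is omitted):

#include "BasicUsageEnvironment.hh"
#include "liveMedia.hh"

struct CameraThreadArgs {
    const char* rtspURL;     // the camera's RTSP URL (application-specific)
    char volatile stopFlag;  // set to non-zero from another thread to stop the loop
};

void* cameraThread(void* p) {  // pthread-style entry point
    CameraThreadArgs* args = (CameraThreadArgs*)p;

    TaskScheduler* scheduler = BasicTaskScheduler::createNew();
    UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

    // ... create an RTSPClient for args->rtspURL and its sinks here ...

    env->taskScheduler().doEventLoop(&args->stopFlag); // returns when stopFlag != 0

    env->reclaim();
    delete scheduler;
    return NULL;
}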
Hi,
I am experimenting with receiving streamed data from Axis 207W cameras using (a
modified version of) the openRTSP utility, and am confronted with the following
issue. When I use the -q option to receive a QuickTime file, I can later use
the ffmpeg avcodec libraries to read packets and decod
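For reference, reading packets back out of the .mov file that openRTSP's -q option writes can be done with libavformat alone; the sketch below uses the current API (avformat_open_input and friends), which differs from the 2009-era av_open_input_file() interface this thread would have used.

// Reads packets back from the file written by "openRTSP -q". Very old ffmpeg
// builds instead need av_register_all() and av_open_input_file(), which this
// sketch does not cover.
extern "C" {
#include <libavformat/avformat.h>
}

int dumpPackets(const char* path) {
    AVFormatContext* fmt = NULL;
    if (avformat_open_input(&fmt, path, NULL, NULL) < 0) return -1;
    if (avformat_find_stream_info(fmt, NULL) < 0) {
        avformat_close_input(&fmt);
        return -1;
    }

    AVPacket* pkt = av_packet_alloc();
    while (av_read_frame(fmt, pkt) >= 0) {
        // pkt->stream_index says whether this is the audio or the video track;
        // pkt->data / pkt->size would be handed to the matching decoder.
        av_packet_unref(pkt);
    }
    av_packet_free(&pkt);
    avformat_close_input(&fmt);
    return 0;
}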