Hi Ross,
I want to modify the live555 code so that it works as an RTSP proxy. Please guide me on what I would need to do. Do we already have proxy support? I think we don't.
Many Thanks,
Nishesh
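
For reference, newer live555 releases include a ProxyServerMediaSession class (and a "live555ProxyServer" demo application built on it); if the version in use provides it, a front-end proxy can be set up roughly as sketched below. The port number, back-end URL and stream name are placeholders.

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

int main() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  // Front-end RTSP server that clients such as VLC connect to
  // (554 may need privileges; 8554 is a common unprivileged alternative):
  RTSPServer* rtspServer = RTSPServer::createNew(*env, 554);
  if (rtspServer == NULL) {
    *env << "Failed to create RTSP server: " << env->getResultMsg() << "\n";
    return 1;
  }

  // One proxy session per back-end stream; "rtsp://backend/stream" is a placeholder URL:
  ServerMediaSession* sms = ProxyServerMediaSession::createNew(
      *env, rtspServer, "rtsp://backend/stream", "proxied");
  rtspServer->addServerMediaSession(sms);

  char* url = rtspServer->rtspURL(sms);
  *env << "Play the proxied stream at: " << url << "\n";
  delete[] url;

  env->taskScheduler().doEventLoop(); // does not return
  return 0;
}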
Hi,
I am using live555 with a set-top box (STB), but the STB is not playing audio. I am using libfaac for audio and MPEG-4 for video. What I want to confirm is: can the client request a codec format it understands from the server?
I read the RTSP RFC and found that ANNOUNCE can be used from the client to desir
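
For context, in standard RTSP the client normally learns which codecs the server offers from the SDP returned by DESCRIBE, rather than requesting one via ANNOUNCE. A rough sketch of fetching that SDP with live555's asynchronous RTSPClient follows; it assumes a live555 version that provides sendDescribeCommand(), and the URL and application name are placeholders.

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

static char eventLoopWatchVariable = 0;

// Called when the DESCRIBE response arrives; the SDP lists the codecs the
// server offers (e.g. "m=audio ..." / "m=video ..." lines with payload formats).
static void afterDESCRIBE(RTSPClient* client, int resultCode, char* resultString) {
  if (resultCode == 0) {
    client->envir() << "SDP description:\n" << resultString << "\n";
  } else {
    client->envir() << "DESCRIBE failed: " << resultString << "\n";
  }
  delete[] resultString;
  eventLoopWatchVariable = 1; // stop the event loop
}

int main() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  // "rtsp://192.168.1.10/test" is only a placeholder URL:
  RTSPClient* client = RTSPClient::createNew(*env, "rtsp://192.168.1.10/test", 1, "codecProbe");
  client->sendDescribeCommand(afterDESCRIBE);

  env->taskScheduler().doEventLoop(&eventLoopWatchVariable);
  return 0; // teardown/cleanup omitted in this sketch
}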
Hi,
For my project I need to use live555 as an RTSP proxy. The requirement is to take an RTSP URL from a client (e.g. VLC) at live555 and pass that URL to the application which I want to build on top of live555. This application will proxy the RTSP URL (rtsp://ip/NA123) to the RTSP server and on sessio
Hi,
The following is the backtrace from a breakpoint in gdb when we receive DESCRIBE from the client.
I put a breakpoint in doGetNextFrame(). In the normal scenario (working case) this hits only when we receive the PLAY command, but here it is getting hit when we receive DESCRIBE, and the response sent to DESCRIBE is either "
Hi,
I am getting the following response when trying to fetch the RTSP URL (rtsp://ip/obama):
Response to DESCRIBE: RTSP/1.0 404 File Not Found, Or In Incorrect Format\r\n
I have created the RTSP server, called
rtspServer->addServerMediaSession(sms);
and invoked announceStream()...
I have been debugging this for the last few days but iss
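
In case it helps anyone hitting the same 404: the server looks a stream up by the name given to ServerMediaSession::createNew(), so that name has to match the suffix of the URL the client requests ("obama" for rtsp://ip/obama). A rough sketch of a matching setup; the port, input file name and description strings are placeholders.

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

int main() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  RTSPServer* rtspServer = RTSPServer::createNew(*env, 8554);
  if (rtspServer == NULL) {
    *env << "Failed to create RTSP server: " << env->getResultMsg() << "\n";
    return 1;
  }

  // The stream name ("obama") must match the suffix of the URL the client asks for:
  ServerMediaSession* sms = ServerMediaSession::createNew(
      *env, "obama", "obama", "Session streamed by my server");
  // "test.m4e" is a placeholder MPEG-4 elementary-stream file name:
  sms->addSubsession(MPEG4VideoFileServerMediaSubsession::createNew(*env, "test.m4e", False));
  rtspServer->addServerMediaSession(sms);

  char* url = rtspServer->rtspURL(sms);
  *env << "Stream is available at: " << url << "\n";
  delete[] url;

  env->taskScheduler().doEventLoop(); // does not return
  return 0;
}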
Hi Ross,
I am reading A/V from a live source through a socket in 2 different threads. In my application I am using a class derived from FramedSource, and in doGetNextFrame() I read from cyclic buffers and set fPresentationTime etc. I wonder how I can use hasBeenSynchronizedUsingRTCP and make sure the
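
For reference, RTCP-based synchronisation at the receiver (what RTPSource::hasBeenSynchronizedUsingRTCP() reports) depends on the server stamping each frame's fPresentationTime with wall-clock time, since the RTCP SRs map RTP timestamps to that clock. A hedged sketch of a FramedSource subclass doing this; the class name and the cyclic-buffer helper myCyclicBufferRead() are hypothetical placeholders for the poster's own code.

#include "FramedSource.hh"
#include <sys/time.h>

// Hypothetical helper standing in for the poster's own cyclic-buffer code;
// returns the number of bytes of one encoded frame copied into 'dst', or 0 if empty.
extern unsigned myCyclicBufferRead(unsigned char* dst, unsigned maxSize);

class BufferedLiveSource: public FramedSource {
public:
  static BufferedLiveSource* createNew(UsageEnvironment& env) {
    return new BufferedLiveSource(env);
  }

protected:
  BufferedLiveSource(UsageEnvironment& env): FramedSource(env) {}

private:
  static void retryLater(void* clientData) {
    ((BufferedLiveSource*)clientData)->doGetNextFrame();
  }

  virtual void doGetNextFrame() {
    fFrameSize = myCyclicBufferRead(fTo, fMaxSize);
    if (fFrameSize == 0) {
      // Nothing buffered yet; poll again shortly rather than delivering an empty frame.
      envir().taskScheduler().scheduleDelayedTask(10000/*us*/, (TaskFunc*)retryLater, this);
      return;
    }

    // Stamp the frame with wall-clock time.  The RTCP SRs sent by the server
    // carry the RTP-timestamp-to-wall-clock mapping derived from this value,
    // which is what lets the receiver's RTPSource report
    // hasBeenSynchronizedUsingRTCP() and keep audio and video aligned.
    gettimeofday(&fPresentationTime, NULL);

    // Hand the frame to the downstream object (normally the RTPSink):
    FramedSource::afterGetting(this);
  }
};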
Hi,
I am working on an existing application which uses live555 and reads raw audio and video from a socket in 2 different threads. It then encodes the audio and video and writes them into cyclic buffers in the class derived from FramedSource. I noticed that sending RTCP SR was disabled; I enabled it, and after that
Hi,
I am using MPEG4VideoFileServerMediaSubsession in my application, which is derived from FileServerMediaSubsession, which in turn is derived from OnDemandServerMediaSubsession.
Does that mean I don't have to create an RTCPInstance explicitly? Yet I do not see any SR (sender report) packet going fr
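
As far as I can tell, OnDemandServerMediaSubsession (and therefore MPEG4VideoFileServerMediaSubsession) creates the per-stream RTCPInstance itself once a client sets up and plays the stream, so no explicit creation should be needed there. Explicit RTCPInstance::createNew() appears in the non-on-demand test programs; a sketch in that style follows, with the multicast address, ports, payload type and bandwidth figure all placeholders.

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"
#include "GroupsockHelper.hh"
#include <unistd.h>

int main() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  // Multicast destination and ports are placeholders:
  struct in_addr destinationAddress;
  destinationAddress.s_addr = chooseRandomIPv4SSMAddress(*env);
  const Port rtpPort(18888);
  const Port rtcpPort(18889);
  const unsigned char ttl = 255;
  Groupsock rtpGroupsock(*env, destinationAddress, rtpPort, ttl);
  Groupsock rtcpGroupsock(*env, destinationAddress, rtcpPort, ttl);

  RTPSink* videoSink = MPEG4ESVideoRTPSink::createNew(*env, &rtpGroupsock, 96);

  // The RTCPInstance is what emits Sender Reports (SRs):
  const unsigned estimatedSessionBandwidth = 500; // kbps; used to size RTCP traffic
  unsigned char CNAME[101];
  gethostname((char*)CNAME, 100);
  CNAME[100] = '\0';
  RTCPInstance::createNew(*env, &rtcpGroupsock, estimatedSessionBandwidth, CNAME,
                          videoSink, NULL /* we are a server, not a receiver */);
  // SRs start flowing automatically once the sink is playing; no extra call is needed.

  // ... start videoSink playing from your source here, then run the event loop ...
  env->taskScheduler().doEventLoop(); // does not return
  return 0;
}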
Hi all,
I am using live555 to stream out encoded A/V and playing it through a VLC client using an RTSP URL. The problem is that audio and video are not in sync; audio is lagging slightly behind video. When I run Ethereal on the client side (VLC side), I see only RTCP RR packets and no SR packets. I am jus