I am planning to use live555 to stream video and audio to Flash Media
Server. I am not sure if that is supported in live555. Can anyone please
give me some direction on whether that is possible using live555 and how
we can do that? I have used live555 to stream using the RTSP protocol
but am not sure how can
Hi,
I am using live555 to create a unicast streaming server. I am using VLC
as a client to test the server. I observed that after a few hours VLC
thinks the connection is dropped and hence stops rendering video and
audio. I debugged it and found that the streaming server is sending SR
but is not gett
Hi,
I read that live555 is LGPL-licensed, but the build for live555 creates
static libraries. If I am correct, then the LGPL dictates that the
application links to it dynamically. Is there any plan to change the
build type to dynamic for live555, or do I have to create a dynamic
library myself, which w
Hi,
I implemented a device source with Ross's help and made it work. Now I
am getting nice video, but my audio is choppy and not good. So my
question is: what is the right way of timestamping audio and video? I
read that if your source is live then there is no need to set
fFrameDurationInMicroSeco
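For reference, a minimal sketch of the timestamping usually recommended
for a live source; MyDeviceSource and deliverFrame() below are
placeholder names, not code from this thread. The idea is to stamp both
the audio and the video frames with the wall-clock capture time, taken
from the same clock so that RTCP can synchronize the two streams, and to
leave fDurationInMicroseconds at 0 for a live source so the downstream
sink asks for the next frame right away.

// Sketch only; placeholder class standing in for a live device source.
#include "FramedSource.hh"
#include <GroupsockHelper.hh> // for gettimeofday()

class MyDeviceSource: public FramedSource {
public:
  MyDeviceSource(UsageEnvironment& env): FramedSource(env) {}
private:
  virtual void doGetNextFrame() { deliverFrame(); }
  void deliverFrame();
};

void MyDeviceSource::deliverFrame() {
  if (!isCurrentlyAwaitingData()) return;  // downstream sink not ready yet

  // ... copy one encoded frame into fTo and set fFrameSize (<= fMaxSize) ...

  // Wall-clock capture time; use the SAME clock in the audio source and
  // the video source so the client can line the two streams up via RTCP:
  gettimeofday(&fPresentationTime, NULL);
  // Live source: no fixed frame duration, let the sink poll immediately:
  fDurationInMicroseconds = 0;

  FramedSource::afterGetting(this);        // hand the frame to the sink
}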
Hi Ross,
I tried the latest live555, but I still see the same issue.
==
v=0
o=- 1383204769034542 1 IN IP4 192.168.1.104
s=Session streamed by "Ekomsys"
i=Streaming channel using Ekomsys streaming server
t=0 0
a=tool:LIVE555 Streaming Media v2013.10.03
a=type:broadca
Hi,
I found that when I was checking my stream using ffmpeg, it displays the
channels as 1 and the audio codec as (null). I am using AC3RTPSink in
live555. I tried to debug the live555 source and found that fNumChannels
is not set while creating the object (calling the base class Audi
Hi,
I tried to use openRTSP.exe on the output with the -F option to save the
audio and video, and I found that the audio is getting saved just fine,
though I was not able to play back the video in the saved file; but at
least I know that my audio is being streamed. But I am not sure why VLC
is not renderin
Hi,
Thanks for the guidance; I tried it. Now I can see the packets being
transmitted in testRTSPClient and also in VLC. VLC shows packets being
decoded for both audio and video. But still, somehow the video runs
smoothly while the audio just comes and goes. It's like something is
choking the audio. It
Thank you Ross, that solved my issue with audio packets being properly
transmitted. I am still facing an issue with audio on the receiving
side. When I try with VLC it gives audio for half a second and then
stops. If I stop and start VLC again, it again gives audio for half a
second and stops. Then I
Any input on this?
On 2013-10-23 11:49, ssi...@neurosoft.in wrote:
Thank you Ross for the clarification, it's clearer now. Now I am facing
an issue: I have a separate thread that pushes audio packets for my
device source to stream. I trigger an event each time I push a packet to
that queue. I noticed th
Thank you Ross for the clarification, it's clearer now. Now I am facing
an issue: I have a separate thread that pushes audio packets for my
device source to stream. I trigger an event each time I push a packet to
that queue. I noticed that on VLC my audio comes for about a second and
then stops. When I
Hi,
I am confused as to how the event mechanism works in live555. I have a
source that is fed with video and audio frames, and I want to trigger
doGetNextFrame() of my custom DeviceSource so that those frames are
streamed using live555. For this I am using
m_eventID = envir().taskScheduler().cr
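For context, a sketch of the event-trigger pattern used by the bundled
liveMedia/DeviceSource.cpp; MyDeviceSource, fEventTriggerId,
signalNewFrame() and deliverFrame() are placeholder names. The trigger
is created once in the live555 (event-loop) thread, triggerEvent() is
the one call that is intended to be made from another thread, and the
handler itself then runs inside the event loop.

// Sketch only, modeled on liveMedia/DeviceSource.cpp; placeholder names.
#include "FramedSource.hh"

class MyDeviceSource: public FramedSource {
public:
  static MyDeviceSource* createNew(UsageEnvironment& env) {
    return new MyDeviceSource(env);
  }

  // Called from the capture thread after a frame has been queued:
  void signalNewFrame() {
    // triggerEvent() is the only live555 call meant to be made from a
    // thread other than the one running the event loop:
    envir().taskScheduler().triggerEvent(fEventTriggerId, this);
  }

protected:
  MyDeviceSource(UsageEnvironment& env): FramedSource(env) {
    // Register the handler once, in the live555 (event-loop) thread:
    fEventTriggerId =
      envir().taskScheduler().createEventTrigger(deliverFrame0);
  }
  virtual ~MyDeviceSource() {
    envir().taskScheduler().deleteEventTrigger(fEventTriggerId);
  }

private:
  virtual void doGetNextFrame() { deliverFrame(); }

  static void deliverFrame0(void* clientData) {
    // Runs inside the event loop, so it may safely touch live555 state:
    ((MyDeviceSource*)clientData)->deliverFrame();
  }

  void deliverFrame() {
    if (!isCurrentlyAwaitingData()) return;  // sink has not asked yet
    // ... dequeue a frame, copy it into fTo, set fFrameSize and
    // fPresentationTime, then call FramedSource::afterGetting(this) ...
  }

  EventTriggerId fEventTriggerId;
};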
Another thing I noticed is that VLC shows a lower bitrate. When I use
the test programs it shows > 1000, and with my program it shows around
100-200, which could be a source of the problem. Which factor determines
the bitrate in live555? In one of your posts you said it's entirely
'fPresentationTime'. I am
Thank you Ross for all the help. I finally managed to get it working,
and now I have both audio and video being streamed. I don't know why,
but somehow the refresh rate (as seen in VLC) is not good. My video
frames come in incomplete, it takes time for VLC to paint the whole
frame, and the audio stops aft
I tried the unicast server based on the onDemand sample, but my function
"getAuxSDPLine()" never returns. When I debugged it, I found that the
function "checkForAuxSDPLine1()" calls "fDummyRTPSink->auxSDPLine()",
which is the function implemented in MPEG4RTPSink, which in turn tries
to get the po
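For context, here is roughly the pattern that live555's own
MPEG4VideoFileServerMediaSubsession uses for this; the class name
MyMPEG4Subsession and the members fAuxSDPLine, fDoneFlag and
fDummyRTPSink are illustrative assumptions, not the code from this
thread. The point to note is that the dummy sink's auxSDPLine() stays
NULL until it has parsed a frame containing the MPEG-4 config (VOL
header); if the source never delivers such a frame, the polling below
never finishes, so getAuxSDPLine() never returns, which matches the
symptom described above.

// Sketch modeled on MPEG4VideoFileServerMediaSubsession; the two required
// factory overrides, createNewStreamSource() and createNewRTPSink(), are
// omitted here (see the subsession sketch after the next message).
#include "OnDemandServerMediaSubsession.hh"
#include "strDup.hh" // live555's strDup()

class MyMPEG4Subsession: public OnDemandServerMediaSubsession {
public:
  void checkForAuxSDPLine1();              // polled until the line is known
  void afterPlayingDummy1() { fDoneFlag = ~0; }

protected:
  MyMPEG4Subsession(UsageEnvironment& env)
    : OnDemandServerMediaSubsession(env, True),
      fAuxSDPLine(NULL), fDoneFlag(0), fDummyRTPSink(NULL) {}

  virtual char const* getAuxSDPLine(RTPSink* rtpSink,
                                    FramedSource* inputSource);

private:
  char* fAuxSDPLine;        // the "a=fmtp:..." SDP line, once known
  char fDoneFlag;           // breaks out of doEventLoop() below when set
  RTPSink* fDummyRTPSink;   // sink used only to extract the config
};

static void afterPlayingDummy(void* clientData) {
  ((MyMPEG4Subsession*)clientData)->afterPlayingDummy1();
}

static void checkForAuxSDPLine(void* clientData) {
  ((MyMPEG4Subsession*)clientData)->checkForAuxSDPLine1();
}

void MyMPEG4Subsession::checkForAuxSDPLine1() {
  char const* dasl;
  if (fAuxSDPLine != NULL) {
    fDoneFlag = ~0;                        // already have the line
  } else if (fDummyRTPSink != NULL &&
             (dasl = fDummyRTPSink->auxSDPLine()) != NULL) {
    fAuxSDPLine = strDup(dasl);            // the sink has seen a VOL header
    fDummyRTPSink = NULL;
    fDoneFlag = ~0;
  } else {
    // Config not seen yet; poll again in 100 ms:
    envir().taskScheduler().scheduleDelayedTask(
        100000, (TaskFunc*)checkForAuxSDPLine, this);
  }
}

char const* MyMPEG4Subsession::getAuxSDPLine(RTPSink* rtpSink,
                                             FramedSource* inputSource) {
  if (fAuxSDPLine != NULL) return fAuxSDPLine;

  if (fDummyRTPSink == NULL) {
    fDummyRTPSink = rtpSink;
    // Start "playing" so the sink reads frames and extracts the config:
    fDummyRTPSink->startPlaying(*inputSource, afterPlayingDummy, this);
    checkForAuxSDPLine(this);
  }
  envir().taskScheduler().doEventLoop(&fDoneFlag); // returns when flag set
  return fAuxSDPLine;
}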
Thanks for the input, and I really appreciate your help. If I subclass
ServerMediaSubsession then I need to implement other functions too. Is
there another class which will do most of the work for me? I noticed
that OnDemandServerMediaSubsession is being used for some subsession
implementations, but is it only fo
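A minimal sketch of what such a subclass can look like:
OnDemandServerMediaSubsession implements the RTSP/SDP plumbing and
leaves only two factory methods to override. MyVideoSubsession and
MyDeviceSource are placeholder names, and the choice of an MPEG-4 framer
and sink is an assumption based on the rest of this thread.

// Sketch only; placeholder names, MPEG-4 chain assumed.
#include "OnDemandServerMediaSubsession.hh"
#include "MPEG4VideoStreamDiscreteFramer.hh"
#include "MPEG4ESVideoRTPSink.hh"
#include "MyDeviceSource.hh" // placeholder: your FramedSource subclass

class MyVideoSubsession: public OnDemandServerMediaSubsession {
public:
  static MyVideoSubsession* createNew(UsageEnvironment& env) {
    return new MyVideoSubsession(env);
  }

protected:
  MyVideoSubsession(UsageEnvironment& env)
    : OnDemandServerMediaSubsession(env, True /*reuse first source*/) {}

  // Build the chain: device source -> discrete framer (one frame per call):
  virtual FramedSource* createNewStreamSource(unsigned /*clientSessionId*/,
                                              unsigned& estBitrate) {
    estBitrate = 1000; // estimated bitrate in kbps; adjust for your encoder
    FramedSource* src = MyDeviceSource::createNew(envir());
    return MPEG4VideoStreamDiscreteFramer::createNew(envir(), src);
  }

  // Create the RTP sink that packetizes the framer's output:
  virtual RTPSink* createNewRTPSink(Groupsock* rtpGroupsock,
                                    unsigned char rtpPayloadTypeIfDynamic,
                                    FramedSource* /*inputSource*/) {
    return MPEG4ESVideoRTPSink::createNew(envir(), rtpGroupsock,
                                          rtpPayloadTypeIfDynamic);
  }
};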
I tried sending this extra data as the first packet and also tried
sending it with each packet, but with no luck. I debugged it and found
that when I send this packet, my MPEG4 discrete framer found all the
information it needed (I debugged the function analyzeVOLHeader()), but
still my testRTSPClient does not show a
Thank you for the clarification. I will try to send this extra data as
the first packet. I have another point of confusion: if I send this
packet as the first packet when there was no client attached, and later
on a client attaches, then how will that client get this packet, since
it will be lost? Right now
Hi,
I spent almost the last 3 days banging my head against RFCs and other
documents. Here is what I want to achieve: I have raw BGRA frames and I
want to stream them using live555. This is what I am doing:
I created a subclass of FramedSource which is responsible for encoding
the raw BGRA frames into MP
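For what it is worth, a sketch of the delivery side of such a
FramedSource subclass, following the pattern of the bundled
liveMedia/DeviceSource.cpp. The class name BGRAEncoderSource and the
getNextEncodedFrame() helper are placeholders for the poster's encoder
and frame queue; only the fTo/fMaxSize/fFrameSize/fNumTruncatedBytes
bookkeeping and the afterGetting() call are live555's documented
contract.

// Sketch only; placeholder class and helper names.
#include "FramedSource.hh"
#include <GroupsockHelper.hh> // for gettimeofday()
#include <string.h>

class BGRAEncoderSource: public FramedSource {
public:
  static BGRAEncoderSource* createNew(UsageEnvironment& env) {
    return new BGRAEncoderSource(env);
  }

protected:
  BGRAEncoderSource(UsageEnvironment& env): FramedSource(env) {}

private:
  // Called by the downstream object (framer/sink) when it wants a frame:
  virtual void doGetNextFrame() { deliverFrame(); }

  void deliverFrame() {
    if (!isCurrentlyAwaitingData()) return;  // downstream not ready yet

    // Placeholder: fetch the next frame that has already been encoded from
    // BGRA elsewhere; if none is queued, return now and deliver it later
    // (e.g. from an event trigger) instead of blocking the event loop.
    unsigned char* frame = NULL; unsigned frameSize = 0;
    if (!getNextEncodedFrame(frame, frameSize)) return;

    // Respect the buffer live555 handed us; report any truncation:
    if (frameSize > fMaxSize) {
      fNumTruncatedBytes = frameSize - fMaxSize;
      fFrameSize = fMaxSize;
    } else {
      fNumTruncatedBytes = 0;
      fFrameSize = frameSize;
    }
    memcpy(fTo, frame, fFrameSize);

    gettimeofday(&fPresentationTime, NULL);  // live source: wall-clock stamp
    fDurationInMicroseconds = 0;

    FramedSource::afterGetting(this);        // hand the frame downstream
  }

  // Hypothetical helper standing in for the encoder/queue:
  bool getNextEncodedFrame(unsigned char*& frame, unsigned& size) {
    frame = NULL; size = 0; return false;    // replace with a real dequeue
  }
};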