Re: [Live-devel] HTTP Live Streaming

2013-10-24 Thread Ross Finlayson
> I've found a reference on what the proper behavior would be. In Unix Network Programming, Volume 1: The Sockets Networking API (3rd Edition), it is noted that a socket will be ready for writing if the write half of the connection is closed, but a write will generate a SIGPIPE. So a chec…

Re: [Live-devel] True push DeviceSource

2013-10-24 Thread Ross Finlayson
On Oct 24, 2013, at 11:35 AM, ssi...@neurosoft.in wrote: > Any input on this? Because of this violation of basic email netiquette - i.e., posting the same question to the mailing list multiple times - future postings from you will be moderated. Please don't do this again, otherwise you'll be

Re: [Live-devel] HTTP Live Streaming

2013-10-24 Thread Pak Man Chan
On Thu, Oct 24, 2013 at 9:15 PM, Ross Finlayson wrote: > Thanks for the clarifications. I've looked further and found that send() is returning -1, with errno being set to EPIPE. The problem is that "select" in BasicTaskScheduler::SingleStep is still indicating a writable socket even when it is di…

Re: [Live-devel] FrameSource:Getnextframe error while streaming PCMframes

2013-10-24 Thread Ross Finlayson
I think your problem is here: void triggerLive555Scheduler(void) { scheduler->triggerEvent(WAVSource::s_frameReceivedTrigger, sessionState.source); } The problem is the second parameter to "triggerEvent()": it needs to be a pointer to a "WAVSource" object. If you are streami…

Re: [Live-devel] True push DeviceSource

2013-10-24 Thread Ross Finlayson
> Thank you, Ross, for the clarification; it's clearer now. I am now facing an issue: I have a separate thread that pushes audio packets for my device source to stream. I trigger an event each time I push a packet to that queue. I noticed that on VLC my audio comes for about a second and then stops.

Re: [Live-devel] True push DeviceSource

2013-10-24 Thread ssingh
Any input on this? On 2013-10-23 11:49, ssi...@neurosoft.in wrote: Thank you, Ross, for the clarification; it's clearer now. I am now facing an issue: I have a separate thread that pushes audio packets for my device source to stream. I trigger an event each time I push a packet to that queue. I noticed th…

[Live-devel] Changing from Multicast to Unicast

2013-10-24 Thread Piers Hawksley
Hi Ross, Using the following code I can stop a server media session and restart it (with different parameters such as multicast address & port). I can also change from unicast to multicast and back. However, when I change from multicast to unicast, the multicast stream continues until I request…

Re: [Live-devel] Proxy Server (Multiple Client / Recorded File Issue)

2013-10-24 Thread Bob Bischan
Fair enough :-) I will contact you outside the dev-list to explore options. Thanks, Bob On Oct 24, 2013 8:58 AM, "Ross Finlayson" wrote: > The recording files for N > 1 clients work fine with every desktop player I use (VLC, Totem, QuickTime, ffplay, MPlayer, etc.), however the file no lon…

Re: [Live-devel] FrameSource:Getnextframe error while streaming PCMframes

2013-10-24 Thread Krishna
Hi Ross, I have attached: 1. my device source file Wavsource.cpp; 2. WaveStreamer.cpp (took a reference from testWavAudioStreamer.cpp), where I have a thread to read the samples, plus code for initialization and starting the session. Regards From: Ross Finlayson Sent: Thursday, October 24, 20…

Re: [Live-devel] Proxy Server (Multiple Client / Recorded File Issue)

2013-10-24 Thread Ross Finlayson
> The recording files for N > 1 clients work fine with every desktop player I use (VLC, Totem, QuickTime, ffplay, MPlayer, etc.), however the file no longer validates as HTML5 video. Why don't you try to find out why that is? In any case, this does not appear to be a problem that I can spen…

Re: [Live-devel] Proxy Server (Multiple Client / Recorded File Issue)

2013-10-24 Thread Bob Bischan
Ross, Thanks for your patience and time responding to this issue. I do understand that many of my questions and inquiries are probably outside the scope of this development list. With that said, I will be brief in my comments. With your input and running through numerous permutations from an im…

Re: [Live-devel] HTTP Live Streaming

2013-10-24 Thread Ross Finlayson
> I missed the check on "fLimitNumBytesToStream". In this case, should "fLimitNumBytesToStream" be initialized to False? You're right - this is a bug. I've just installed a new version (2013.10.24) of the code that fixes this. Thanks again for the report. > Thanks for the clarifications,…

Re: [Live-devel] FrameSource:Getnextframe error while streaming PCMframes

2013-10-24 Thread Ross Finlayson
> I found the problem: uLawFromPCMAudioSource's afterGettingFrame is not getting called when I use the DeviceSource-based design and triggering concept. I.e., if I am calling FramedSource::afterGetting(this) in doGetNextFrame itself, it is calling the afterGettingFrame function in uLawFromPC…

Re: [Live-devel] FrameSource:Getnextframe error while streaming PCMframes

2013-10-24 Thread Krishna
Hi Ross, I found the problem: uLawFromPCMAudioSource's afterGettingFrame is not getting called when I use the DeviceSource-based design and triggering concept. I.e., if I am calling FramedSource::afterGetting(this) in doGetNextFrame itself, it is calling the afterGettingFrame function in uLawFromPC…

Re: [Live-devel] HTTP Live Streaming

2013-10-24 Thread Pak Man Chan
On Thu, Oct 24, 2013 at 2:51 PM, Ross Finlayson wrote: > 1. Replies to HTTP GET requests are sometimes truncated. As an example, curl http://serverip/somets.ts will sometimes result in only part of the playlist. > I've traced this to fNumBytesToStream not being initialized when create…

Re: [Live-devel] Regarding the h264 video stream's rtp packet timestamp

2013-10-24 Thread Ross Finlayson
> I saw the FAQ on the live555 website. > It said live555 syncs the audio and video via RTCP's SR packets. > So should I create an RTCP instance for each RTP source explicitly? No, because (assuming that you are controlling the streaming using RTSP) this is done implicitly. (In the RTSP server, this…

Re: [Live-devel] Regarding the h264 video stream's rtp packet timestamp

2013-10-24 Thread Tony
I saw the FAQ on the live555 website. It said live555 syncs the audio and video via RTCP's SR packets. So should I create an RTCP instance for each RTP source explicitly? At 2013-10-24 05:48:40, "Ross Finlayson" wrote: So how should I set the duration so that the audio and video will be in sync? Yo…