Re: [Live-devel] live555 rtsp server supports Live Media

2008-04-04 Thread Ross Finlayson
> Can we use the RTSP server of Live555 to transfer live media? Do you mean "serve streams from a live input source" (e.g., a camera connected to a video encoder)? If so, the answer is yes. Although our "LIVE555 Media Server" product currently only serves files, it is possible to modify our demo

[Live-devel] live555 rtsp server supports Live Media

2008-04-04 Thread rajesh
Hi Ross, can we use the RTSP server of Live555 to transfer live media? Thanks in advance. Thanks and regards, Rajesh Kumar, Sr. Software Engineer, R & D - Network Group, www.imimobile.com

Re: [Live-devel] presentation times

2008-04-04 Thread Ross Finlayson
> I am receiving a stream from QuickTime Broadcaster via Darwin Streaming Server; where do the presentation times I get in my MediaSink originate from? I think you mean "RTPSource" (subclass), not "MediaSink". The presentation times returned by the "RTPSource" object originated in the server (

Re: [Live-devel] Implementing FramedSource

2008-04-04 Thread Ross Finlayson
> I'm trying to implement a FramedSource which will get the next frames from a Queue. From my digging so far, "doGetNextFrame()" should be blocking until the next frame is available. No, it should *not* be blocking until the next frame is available, because if you block, you'll be starving out any ot

[Live-devel] presentation times

2008-04-04 Thread Lodewijk Loos
Hi, I am receiving a stream from QuickTime Broadcaster via Darwin Streaming Server; where do the presentation times I get in my MediaSink originate from? My guess is that DSS adapts the presentation times of frames from QTB. I am asking this because I would like to be able to calculate the

Re: [Live-devel] SIP and RTP with network cam

2008-04-04 Thread Wouter Bin
We already have a SIP server. We have to make a client which calls another client with SIP. Then the client has to make a connection with the other client and send an audio/video RTP stream. The video stream gets its data from a network camera (Motion JPEG), and the audio stream has to get its d

[Live-devel] Implementing FramedSource

2008-04-04 Thread Ken Seo
Hi, I'm trying to implement a FramedSource which will get the next frames from a Queue. From my digging so far, "doGetNextFrame()" should be blocking until the next frame is available, so my current implementation looks like: void QueueSource::doGetNextFrame() { //CAutoLock cAutolock(&m_Lock)

Re: [Live-devel] [Question] Streaming transmission rate of liveMedia Server

2008-04-04 Thread Cristiano Belloni
Ross Finlayson wrote: >> I'm analyzing Live555MediaServer to find out how the server decides its transmission rate! > It's determined by the "fDurationInMicroseconds" parameter set for the source object that feeds into each "RTPSink" (subclass). I.e

Re: [Live-devel] [Question] Streaming transmission rate of liveMedia Server

2008-04-04 Thread Ross Finlayson
>> I'm analyzing Live555MediaServer to find out how the server decides its transmission rate! > It's determined by the "fDurationInMicroseconds" parameter set for the source object that feeds into each "RTPSink" (subclass). I.e., it's the data sources that determine the

Re: [Live-devel] [Question] Streaming transmission rate of liveMedia Server

2008-04-04 Thread Cristiano Belloni
Ross Finlayson wrote: >> I'm analyzing Live555MediaServer to find out how the server decides its transmission rate! > It's determined by the "fDurationInMicroseconds" parameter set for the source object that feeds into each "RTPSink" (subclass). I.e., it's the data sources that d

Re: [Live-devel] Audio and Video Streaming

2008-04-04 Thread Ross Finlayson
> I have created an instance of ServerMediaSession, and added one audio and one video session to it using addSubsession. The PTS for the video is well taken care of by the encoder, but for the audio, since it's a WAV file, I am not sure if there is a PTS in the audio frame. Is there anything else I am

Re: [Live-devel] Audio and Video Streaming

2008-04-04 Thread Robert Howard
I have created an instance of ServerMediaSession, and added one audio and one video session to it using addSubsession. The PTS for the video is well taken care of by the encoder, but for the audio, since it's a WAV file, I am not sure if there is a PTS in the audio frame. Is there anything else I am missing?

Re: [Live-devel] Many streams to one RTP port

2008-04-04 Thread Ross Finlayson
> I am working on a server which should receive many RTP streams on the same port. The cameras may be distributed everywhere, but they send the stream back to the same port of the same server. Could liveMedia be used to implement such a server? Not at present. We currently have no mecha

Re: [Live-devel] Audio and Video Streaming

2008-04-04 Thread Ross Finlayson
> I have two streams - one for audio and one for video. Is there anything I have to do for synchronisation? Currently when I stream both of them, both audio and video appear with a lot of jerks. Any suggestion? 1/ Make sure that the presentation times for each frame in the source streams a

[Live-devel] Many streams to one RTP port

2008-04-04 Thread lcj.liu
Hello, I am working on a server which should receive many RTP streams on the same port. The cameras may be distributed everywhere, but they send the stream back to the same port of the same server. Could liveMedia be used to implement such a server? Thanks