Re: [Live-devel] Live stream of Axis camera

2008-05-16 Thread Manuel Carrizo
On Tue, May 6, 2008 at 6:36 PM, Ross Finlayson <[EMAIL PROTECTED]> wrote: > Because you are streaming from a live source, you may instead be able > to use a simple "FramedSource" subclass (that you would write) that just > delivers one AAC frame at a time (and sets "fPresentationTime" > approp

[Live-devel] I'm interested in getting some problems with VLC (using live555 libraries) fixed ...

2008-05-16 Thread Lewis G. Pringle, Jr.
Folks: I would like to talk to someone about some problems I’m having getting VLC (which uses your library) to talk with Microsoft’s Windows Media Server (specifically – end of stream detection, for starters). Lewis.

Re: [Live-devel] streaming and receiving with circular buffers

2008-05-16 Thread kevin fesselier
Ross Finlayson wrote: What I'm trying to achieve now is to read the data to be sent from a buffer (server part), and to write it into a buffer (client part). These buffers must be circular buffers, as is usual on set-top boxes. So, my question is: what is the simplest way to achieve this (I

Re: [Live-devel] streaming and receiving with circular buffers

2008-05-16 Thread Ross Finlayson
What I'm trying to achieve now is to read the data to be sent from a buffer (server part), and to write it into a buffer (client part). These buffers must be circular buffers, as is usual on set-top boxes. So, my question is: what is the simplest way to achieve this (I don't want to use some pip

Re: [Live-devel] Live stream from network camera to streaming server object

2008-05-16 Thread Ross Finlayson
OK, I'm starting to understand your idea. What do you mean by using "RTPSink" objects directly from the corresponding "RTPSource"? - Use the client code to open the RTSP stream - Create each RTPSink - For each RTPSink that you just created, call sink->startPlaying(rtpSource, ...); - Call "doEvent

[Live-devel] streaming and receiving with circular buffers

2008-05-16 Thread kevin fesselier
Hello, I'm a computer science student and a beginner with the live555 library. I have been writing some simple test programs to implement simple use cases such as streaming MP3 or MPEG2TS. I managed to stream data from a disk on the server, thanks to a ByteStreamFileSource and a SimpleRTPSi

Re: [Live-devel] Live stream from network camera to streaming server object

2008-05-16 Thread Vadim
OK, I'm starting to understand your idea. What do you mean by using "RTPSink" objects directly from the corresponding "RTPSource"? Using the openRTSP sample, I ran into trouble getting the RTPSink from the RTSPClient, Media..., ... flow. This is the only point stopping me from checking your previous proposal. Thanks,

Re: [Live-devel] Live stream from network camera to streaming server object

2008-05-16 Thread Ross Finlayson
A correction to my earlier message. Because your server will be streaming the relayed stream via multicast rather than via unicast, you don't need to write any "ServerMediaSession" subclasses - instead, just continue to use a "PassiveServerMediaSubsession". I think all you need to do is play

Re: [Live-devel] Live stream from network camera to streaming server object

2008-05-16 Thread Ross Finlayson
The application is a multithreaded process. Please read the FAQ entry on threads. You should be able to do what you want using a single-threaded process, with a single event loop. Are there any other ways to do it using the LM infrastructure? Yes, but not without writing more code (se

Re: [Live-devel] Live stream from network camera to streaming server object

2008-05-16 Thread Vadim
Thanks for your response. Piping is not an option, because the audio stream will be relayed too. I already checked the solution you proposed, and it seems like a "hacked" way to do it. The application is a multithreaded process. Are there any other ways to do it using the LM infrastructu