Re: [Live-devel] Live555 Streaming from a live source

2010-07-27 Thread Ross Finlayson
how is the presentationTime of two streams synchronised? Please read the FAQ! I have to synchronise the MPEG-4 ES and a WAV file. I am able to send the two streams together by creating a single ServerMediaSession and adding two separate ServerMediaSubsessions, but they are not synchronised. I
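
For context, the FAQ's answer comes down to presentation times: each subsession's source must stamp fPresentationTime with an accurate wall-clock time (e.g. from gettimeofday()), so that the RTCP sender reports let the receiving client line the audio and video streams up. Below is a minimal sketch of the relevant fragment, assuming a hypothetical FramedSource subclass named MyLiveSource with a deliverFrame() routine; both the video and the audio source would do the same thing.

  #include <sys/time.h>
  #include "FramedSource.hh"

  void MyLiveSource::deliverFrame() {
    if (!isCurrentlyAwaitingData()) return; // the downstream object hasn't asked for data yet

    // ... copy one encoded frame into fTo, set fFrameSize (and
    //     fNumTruncatedBytes if the frame had to be truncated) ...

    // Stamp the frame with the wall-clock capture time, not a frame counter,
    // so the audio and video subsessions share a common clock:
    gettimeofday(&fPresentationTime, NULL);

    FramedSource::afterGetting(this); // hand the frame to the downstream object
  }

Note that the actual synchronisation happens on the client, and only once the first RTCP sender reports for both streams have arrived.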

Re: [Live-devel] Live555 Streaming from a live source

2010-07-26 Thread Nisha Singh
how is the presentationTime of two streams synchronised? I have to synchronise the MPEG-4 ES and a WAV file. I am able to send the two streams together by creating a single ServerMediaSession and adding two separate ServerMediaSubsessions, but they are not synchronised. In the case of MPEG-4 ES video,

Re: [Live-devel] Live555 Streaming from a live source

2010-07-25 Thread Ross Finlayson
We successfully combined the two streams into one stream and it works great. Good. As you figured out, you can do this just by creating a single "ServerMediaSession" object, and adding two separate "ServerMediaSubsessions" to it. The audio and video are then available at the same URL. As it seem
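
For readers finding this thread later, here is a sketch of that setup. The two subsession class names are placeholders for your own OnDemandServerMediaSubsession subclasses, and the port number is just an example:

  #include "liveMedia.hh"
  #include "BasicUsageEnvironment.hh"

  int main() {
    TaskScheduler* scheduler = BasicTaskScheduler::createNew();
    UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);
    RTSPServer* rtspServer = RTSPServer::createNew(*env, 8554);

    // One ServerMediaSession (i.e. one RTSP URL), two subsessions (one per medium):
    ServerMediaSession* sms = ServerMediaSession::createNew(*env, "live",
        "live stream", "combined audio+video");
    sms->addSubsession(MyVideoSubsession::createNew(*env, True /*reuseFirstSource*/));
    sms->addSubsession(MyAudioSubsession::createNew(*env, True /*reuseFirstSource*/));
    rtspServer->addServerMediaSession(sms);

    env->taskScheduler().doEventLoop(); // does not return
    return 0;
  }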

Re: [Live-devel] Live555 Streaming from a live source

2010-07-22 Thread Sagi Ben Moshe
Hi Ross, We have implemented a stream for AAC audio and it works great; we have also implemented a stream for H.264 and it also works great. We would like

Re: [Live-devel] Live555 Streaming from a live source

2010-07-21 Thread Sagi Ben Moshe
You can stream AAC audio using a "MPEG4GenericRTPSink", created with appropriate parameters to specify AAC audio. (Note, for example, how "ADTSAudioFileServerMediaSubsession

Re: [Live-devel] Live555 Streaming from a live source

2010-07-14 Thread Ross Finlayson
We are checking for audio stream support with Live555 and we would like to know if we can stream the following codecs, AAC-LC and/or AAC-HE, through the library. Yes, you can do so using a "MPEG4GenericRTPSink", created with appropriate parameters to specify AAC audio. (Note, for example, how "A
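
A hedged sketch of what that looks like in a ServerMediaSubsession subclass, modelled on the library's ADTSAudioFileServerMediaSubsession; the subsession class name and the sample values are assumptions, and the sampling rate, channel count and 'config' string must match your own AAC encoder's output:

  RTPSink* MyAACAudioSubsession::createNewRTPSink(Groupsock* rtpGroupsock,
                                                  unsigned char rtpPayloadTypeIfDynamic,
                                                  FramedSource* /*inputSource*/) {
    return MPEG4GenericRTPSink::createNew(envir(), rtpGroupsock,
                                          rtpPayloadTypeIfDynamic,
                                          44100,      // RTP timestamp frequency = sampling rate
                                          "audio",    // SDP media type
                                          "AAC-hbr",  // MPEG-4 generic mode for AAC
                                          "1210",     // AudioSpecificConfig (here: AAC-LC, 44.1 kHz, stereo)
                                          2);         // number of channels
  }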

Re: [Live-devel] Live555 Streaming from a live source

2010-07-13 Thread Sagi Ben Moshe
>For the next stage we would like to use the H.264 codec, so I think we should write our own H264VideoStreamDiscreteFramer, is that correct?

Re: [Live-devel] Live555 Streaming from a live source

2010-07-11 Thread Ross Finlayson
For the next stage we would like to use the H.264 codec, so I think we should write our own H264VideoStreamDiscreteFramer, is that correct? Yes, you need to write your own subclass of "H264VideoStreamFramer"; see http://www.live555.com/liveMedia/faq.html#h264-streaming -- Ross Finlayson Live Network
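
The referenced FAQ entry describes how that framer subclass should behave. In later versions of the library a ready-made "H264VideoStreamDiscreteFramer" is provided for sources that deliver one NAL unit at a time (without start codes); a hedged sketch of an on-demand subsession using it, with placeholder class names, might look like this:

  FramedSource* MyH264Subsession::createNewStreamSource(unsigned /*clientSessionId*/,
                                                        unsigned& estBitrate) {
    estBitrate = 2000; // kbps; a rough estimate is fine
    // "MyH264LiveSource" stands in for your own FramedSource subclass that
    // delivers one complete NAL unit per getNextFrame() call:
    FramedSource* src = MyH264LiveSource::createNew(envir());
    return H264VideoStreamDiscreteFramer::createNew(envir(), src);
  }

  RTPSink* MyH264Subsession::createNewRTPSink(Groupsock* rtpGroupsock,
                                              unsigned char rtpPayloadTypeIfDynamic,
                                              FramedSource* /*inputSource*/) {
    return H264VideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic);
  }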

Re: [Live-devel] Live555 Streaming from a live source

2010-07-11 Thread Sagi Ben Moshe
Hi Ross, OK, we used the StreamParser class and probably this is what causes the problem we have. This is our Device class: class CapDeviceSource: public FramedSource { We are trying to stream MPEG-4 (later on we will

Re: [Live-devel] Live555 Streaming from a live source

2010-07-11 Thread Ross Finlayson
We are trying to stream MPEG-4 (later on we will move to H.264). What is the best class to derive from, instead of FramedSource, in order to use a DiscreteFramer downstream object? Provided that your source object delivers one frame at a time, you should be able to feed it directly into a "MPEG4VideoStreamDiscreteFramer
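
The MPEG-4 case follows the same pattern as the H.264 sketch above, just with the MPEG-4 discrete framer. Assuming the thread's CapDeviceSource delivers one complete video frame (VOP) per call, the wiring in an on-demand subsession could look like this (the subsession class name and the createNew() factory method on CapDeviceSource are assumptions):

  FramedSource* MyMPEG4Subsession::createNewStreamSource(unsigned /*clientSessionId*/,
                                                         unsigned& estBitrate) {
    estBitrate = 1000; // kbps, rough estimate
    CapDeviceSource* dev = CapDeviceSource::createNew(envir());
    return MPEG4VideoStreamDiscreteFramer::createNew(envir(), dev);
  }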

Re: [Live-devel] Live555 Streaming from a live source

2010-07-10 Thread Sagi Ben Moshe
>I am not sure I understand your last statement "make sure that your downstream object always has enough buffer space to avoid truncation - i.e., so that fMaxSize is always >= fFrameSize". How can I ensure this? The Live555 library requests 150,000 bytes exactly.

Re: [Live-devel] Live555 Streaming from a live source

2010-07-09 Thread Ross Finlayson
I am not sure I understand your last statement "make sure that your downstream object always has enough buffer space to avoid truncation - i.e., so that fMaxSize is always >= fFrameSize". How can I ensure this? The Live555 library requests 150,000 bytes exactly. This is true only for the "StreamParser
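
In other words, the 150,000-byte request comes from the StreamParser-based framers; once you feed a "discrete" framer directly, fMaxSize is simply however much buffer space the downstream object currently offers. The usual way to honour it, as in the liveMedia/DeviceSource.cpp template (the variable names below are the template's placeholders), is this fragment of the delivery routine:

  // Never copy more than fMaxSize bytes into fTo; report anything that didn't fit:
  if (newFrameSize > fMaxSize) {
    fFrameSize = fMaxSize;
    fNumTruncatedBytes = newFrameSize - fMaxSize;
  } else {
    fFrameSize = newFrameSize;
  }
  memmove(fTo, newFrameDataStart, fFrameSize);

If frames are regularly larger than fMaxSize, the buffer that the sink offers can be enlarged by setting OutPacketBuffer::maxSize (e.g. to 300000, an example value) before the RTPSink is created.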

Re: [Live-devel] Live555 Streaming from a live source

2010-07-08 Thread Sagi Ben Moshe
>We are trying to stream from a live source with Live555.

Re: [Live-devel] Live555 Streaming from a live source

2010-07-08 Thread Ross Finlayson
We are trying to stream from a live source with Live555. We have implemented our own DeviceSource class. In this class we implement doGetNextFrame in the following (logical) way; we have removed all the unnecessary implementation details so you can see the idea. If no frame is available, do the following: ne

[Live-devel] Live555 Streaming from a live source

2010-07-08 Thread Sagi Ben Moshe
Hi, We are trying to stream from a live source with Live555. We have implemented our own DeviceSource class. In this class we implement doGetNextFrame in the following (logical) way; we have removed all the unnecessary implementation details so you can see the idea. If no frame is available, do the foll
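
A hedged sketch of that logic, following the liveMedia/DeviceSource.cpp template that ships with recent versions of the library; the frame-queue check and the encoder callback are placeholders, and the key points are the event trigger and calling FramedSource::afterGetting() exactly once per delivered frame:

  #include "FramedSource.hh"

  class CapDeviceSource: public FramedSource {
  public:
    static CapDeviceSource* createNew(UsageEnvironment& env) {
      return new CapDeviceSource(env);
    }

    // Called from the capture/encoder thread whenever a new frame is ready:
    void signalNewFrame() {
      envir().taskScheduler().triggerEvent(fEventTriggerId, this);
    }

  protected:
    CapDeviceSource(UsageEnvironment& env): FramedSource(env) {
      fEventTriggerId = envir().taskScheduler().createEventTrigger(deliverFrame0);
    }

  private:
    virtual void doGetNextFrame() {
      // If a frame is already queued, deliver it now; otherwise deliverFrame()
      // runs later, when signalNewFrame() fires the event trigger.
      if (frameIsAvailable()) deliverFrame();
    }

    static void deliverFrame0(void* clientData) {
      ((CapDeviceSource*)clientData)->deliverFrame();
    }

    void deliverFrame() {
      if (!isCurrentlyAwaitingData()) return; // the sink hasn't asked for data yet

      // ... copy the queued frame into fTo (truncating to fMaxSize as shown
      //     above), set fFrameSize, fNumTruncatedBytes and fPresentationTime ...

      FramedSource::afterGetting(this); // must be the last thing done here
    }

    Boolean frameIsAvailable(); // placeholder: checks your own frame queue

    EventTriggerId fEventTriggerId;
  };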