How is the presentationTime of two streams synchronised?
Please read the FAQ!
I have to synchronise an MPEG-4 ES and a WAV file. I am able to
send the two streams together by creating a single ServerMediaSession
and adding two separate ServerMediaSubsessions, but they are not
synchronised.
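For context: RTP inter-stream synchronisation is done via RTCP, and on the sending side it works only if both sources stamp their frames with presentation times that are accurate and taken from the same wall clock; a receiving client is also not in sync until it has received the first RTCP sender report for each stream. If either stream comes from your own FramedSource subclass, the key lines are roughly the following sketch ("MyDeviceSource" is a hypothetical class name, not from this thread):

// Inside the frame-delivery code of each FramedSource subclass (both the
// video source and the audio source), just before handing the frame on:
void MyDeviceSource::deliverFrame() {
  // ... copy the encoded frame into fTo and set fFrameSize ...

  // Stamp the frame with the wall-clock time at which it was captured,
  // so the receiver's RTCP-based synchronisation has accurate data:
  gettimeofday(&fPresentationTime, NULL);

  // For a live source, fDurationInMicroseconds can be left at 0.
  FramedSource::afterGetting(this);
}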
We successfully combined the two streams into one stream and it works great.
Good. As you figured out, you can do this just by creating a single
"ServerMediaSession" object, and adding two separate
"ServerMediaSubsessions" to it.
The audio and video are served at the same URL.
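For reference, a sketch of that setup using the file-based subsession classes that ship with liveMedia (the stream name, port number, and file names below are placeholders):

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

int main() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  RTSPServer* rtspServer = RTSPServer::createNew(*env, 8554);
  if (rtspServer == NULL) {
    *env << "Failed to create RTSP server: " << env->getResultMsg() << "\n";
    return 1;
  }

  // One ServerMediaSession (one rtsp:// URL), two ServerMediaSubsessions:
  ServerMediaSession* sms = ServerMediaSession::createNew(*env, "avStream");
  sms->addSubsession(MPEG4VideoFileServerMediaSubsession
                     ::createNew(*env, "video.m4e", False));
  sms->addSubsession(WAVAudioFileServerMediaSubsession
                     ::createNew(*env, "audio.wav", False));
  rtspServer->addServerMediaSession(sms);

  env->taskScheduler().doEventLoop(); // does not return
  return 0;
}

This puts both tracks under a single rtsp:// URL; whether they then play back in sync depends on the presentation times each source supplies.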
From: live-devel-boun...@ns.live555.com [mailto:live-devel-boun...@ns.live555.com] On Behalf Of Sagi Ben Moshe
Sent: Wednesday, July 21, 2010 4:07 PM
To: live-de...@ns.live555.com
Subject: Re: [Live-devel] Live555 Streaming from a live source
Hi Ross,
We have implemented a stream for AAC audio and it works great; we have also
implemented a stream for H.264 and it also works great. We would like…
Sent: Wednesday, July 14, 2010 5:43 PM
To: 'Enrique Polak'
Subject: FW: [Live-devel] Live555 Streaming from a live source
We are checking audio stream support in Live555 and we would like to
know whether we can stream the following codecs through the library:
AAC-LC and/or AAC-HE.
Yes, you can do so using a "MPEG4GenericRTPSink", created with
appropriate parameters to specify AAC audio. (Note, for example, how
"ADTSAudioFileServerMediaSubsession" does this.)
From: live-devel-boun...@ns.live555.com [mailto:live-devel-boun...@ns.live555.com] On Behalf Of Ross Finlayson
Sent: Sunday, July 11, 2010 4:21 PM
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-devel] Live555 Streaming from a live source
For the next stage we would like to use the H.264 codec, so I think we should
write our own H264VideoStreamDiscreteFramer; is that correct?
Yes, you need to write your own subclass of "H264VideoStreamFramer";
see http://www.live555.com/liveMedia/faq.html#h264-streaming
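One such subclass, "H264VideoStreamDiscreteFramer", ships with current versions of the library (whether it was available at the time of this thread is an assumption); if your source delivers one NAL unit at a time, without a start code, the wiring can look roughly like this sketch, with "MyDeviceSource", "env", and "rtpGroupsock" standing in for your own objects:

// "MyDeviceSource" is assumed to deliver one H.264 NAL unit per
// doGetNextFrame() call, with no 0x00000001 start code prepended.
FramedSource* source = MyDeviceSource::createNew(*env);
H264VideoStreamDiscreteFramer* framer =
    H264VideoStreamDiscreteFramer::createNew(*env, source);

// The framer feeds an H264VideoRTPSink (96 is an arbitrary dynamic
// RTP payload type).
H264VideoRTPSink* videoSink =
    H264VideoRTPSink::createNew(*env, rtpGroupsock, 96);
videoSink->startPlaying(*framer, NULL, NULL);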
--
Ross Finlayson
Live Networks, Inc.
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-devel] Live555 Streaming from a live source
Hi Ross,
OK, we used the StreamParser class, and this is probably what causes the
problem we have.
This is our Device class:
class CapDeviceSource: public FramedSource {
We are trying to stream MPEG-4 (later on we will move to H.264).
What is the best class to derive from, instead of FramedSource, in order to
use a DiscreteFramer as the downstream object?
Provided that your source object delivers one frame at a time, you
should be able to feed it directly into a
"MPEG4VideoStreamDiscreteFramer".
Subject: Re: [Live-devel] Live555 Streaming from a live source
I am not sure I understand your last statement: "make sure that your
downstream object always has enough buffer space to avoid truncation - i.e.,
so that fMaxSize is always >= fFrameSize". How can I ensure this? The Live555
library requests exactly 150,000 bytes.
This is true only for the "StreamParser" class; since your source delivers discrete frames, it should not be fed through a "StreamParser" at all.
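For a discrete-frame source, the usual way to honour the "fMaxSize >= fFrameSize" rule is to check fMaxSize at delivery time and report any overflow through fNumTruncatedBytes. A sketch, where the "f..." fields are the real FramedSource members and everything else ("MyDeviceSource", fPendingFrame, fPendingFrameSize) is a placeholder:

// Fragment of a hypothetical FramedSource subclass.  fTo and fMaxSize
// describe the downstream object's buffer; fFrameSize and
// fNumTruncatedBytes report back what was actually delivered.
void MyDeviceSource::deliverFrame() {
  if (!isCurrentlyAwaitingData()) return; // nobody has asked for data yet

  u_int8_t const* frameData = fPendingFrame;     // placeholder members,
  unsigned frameBytes       = fPendingFrameSize; // filled in by the capture code

  if (frameBytes > fMaxSize) {
    // Not enough room downstream: deliver what fits and report the rest as
    // truncated.  (For large frames, enlarge the sinks' buffers up front by
    // increasing OutPacketBuffer::maxSize.)
    fFrameSize = fMaxSize;
    fNumTruncatedBytes = frameBytes - fMaxSize;
  } else {
    fFrameSize = frameBytes;
    fNumTruncatedBytes = 0;
  }
  memmove(fTo, frameData, fFrameSize);
  gettimeofday(&fPresentationTime, NULL);

  FramedSource::afterGetting(this);
}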
-----Original Message-----
From: live-devel-boun...@ns.live555.com
[mailto:live-devel-boun...@ns.live555.com] On Behalf Of Ross Finlayson
Sent: Friday, July 09, 2010 8:45 AM
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-devel] Live555 Streaming from a live source
Hi,
We are trying to stream from a live source with Live555.
We implemented our own DeviceSource class. In this class we implement
doGetNextFrame in the following (logical) way; we have removed all the
unnecessary implementation details so you can see the idea.
If no frame is available, do the following:
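For reference, the usual way to handle the "no frame available yet" case is the event-trigger pattern used by the "DeviceSource.cpp" template that ships with liveMedia: doGetNextFrame() simply returns, and the capture code later triggers an event that delivers the frame on the event-loop thread. A sketch (everything apart from the liveMedia API is a placeholder, not the poster's actual code):

#include "FramedSource.hh"

class CapDeviceSource: public FramedSource {
public:
  static CapDeviceSource* createNew(UsageEnvironment& env) {
    return new CapDeviceSource(env);
  }

  // Called by the capture/encoder code (possibly from another thread)
  // whenever a new encoded frame becomes available:
  void signalNewFrame() {
    envir().taskScheduler().triggerEvent(fEventTriggerId, this);
  }

protected:
  CapDeviceSource(UsageEnvironment& env): FramedSource(env) {
    fEventTriggerId = envir().taskScheduler().createEventTrigger(deliverFrame0);
  }

private:
  virtual void doGetNextFrame() {
    // If a frame is already queued, deliver it now; otherwise just return -
    // deliverFrame() will run later, via the event trigger.
    if (frameIsAvailable()) deliverFrame();
  }

  static void deliverFrame0(void* clientData) {
    ((CapDeviceSource*)clientData)->deliverFrame();
  }

  void deliverFrame() {
    if (!isCurrentlyAwaitingData()) return;
    // ... copy the frame into fTo, set fFrameSize / fNumTruncatedBytes and
    //     fPresentationTime (gettimeofday), then:
    FramedSource::afterGetting(this);
  }

  Boolean frameIsAvailable(); // placeholder: checks the capture queue

  EventTriggerId fEventTriggerId;
};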