So, can I say: any source that works for
MPEG2TransportStreamFromPESSource is always also applicable to
MPEG2TransportStreamFromESSource?

You can say that, but you would be wrong.
Once again: if your input is in the form of 'PES packets' (this
usually occurs only if you are demultiplexing
Hi Ross -
Your response to Sean prompts me to ask for clarification for both of us.
You replied to Sean:
>If you're streaming MPEG-4 video via RTP, then you must use an RTSP server.
It is my understanding that one purpose of RTSP when streaming MPEG-4 video is
to communicate the all-important SDP info to the client player. And I
understand that's why Sean and I (and the whole
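The excerpt is cut off above, but for reference, the SDP description that an RTSP DESCRIBE response might carry for MPEG-4 video over RTP (per RFC 3016's MP4V-ES payload format) looks roughly like this. The addresses, port, session name, and the config= hex string (which carries the decoder configuration the client needs) are placeholders, not values from this thread:

```
v=0
o=- 1234567890 1 IN IP4 192.168.1.10
s=MPEG-4 Video Session
t=0 0
m=video 0 RTP/AVP 96
c=IN IP4 192.168.1.10
a=rtpmap:96 MP4V-ES/90000
a=fmtp:96 profile-level-id=1;config=000001B001000001B509
```

Without RTSP (or some other out-of-band channel) carrying this description, the client has no way to learn the payload type mapping or the codec configuration.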
Hi,
Thanks for your reply.
So, can I say: any source that works for
MPEG2TransportStreamFromPESSource is always also applicable to
MPEG2TransportStreamFromESSource?
Regards,
Woods
On Wed, Jul 15, 2009 at 9:02 PM, Ross Finlayson wrote:
> Subsequently, there are two candidate livemedia sources:
Subsequently, there are two candidate livemedia sources:
MPEG2TransportStreamFromPESSource and MPEG2TransportStreamFromESSource.
Which one should I use? What is their key difference?

"MPEG2TransportStreamFromPESSource" should be used only when your
input is already in the form of 'PES packet
Does anyone know if I can use the RTP stack separately with live555?
I'm going to transmit an MPEG-4 stream from one endpoint to another,
and RTSP/SIP is not involved in the transmission.

If you're streaming MPEG-4 video via RTP, then you must use an RTSP
server. Note our demo applications - "t
My application creates a FramedSource from an MPEG-1, Layer 3 (.mp3)
audio file and feeds it to an input of MPEG2TransportStreamFromESSource
like this:
ByteStreamFileSource* audioFileSource =
ByteStreamFileSource::createNew(*env, filename);
FramedSource* audioES = audioFileSource;
MPEG1or2Au
The config.uClinux provided in the live 2009.07.09 release does not
properly set the compiler options for the uClinux environment.
I suggest the following fixes to config.uClinux to make the liveMedia
streaming server compile properly in a uClinux environment.
- Chetan

Thanks. I'll include this in the next release.
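Chetan's actual patch is not shown in this excerpt. As a rough illustration only, cross-compile settings in a live555 config.* file follow this general shape; the toolchain prefix below is a placeholder, not the real uClinux fix:

```
CROSS_COMPILE?=         arm-uclinux-elf-
COMPILE_OPTS =          $(INCLUDES) -I. -O2 -DSOCKLEN_T=socklen_t
C_COMPILER =            $(CROSS_COMPILE)gcc
C_FLAGS =               $(COMPILE_OPTS)
CPLUSPLUS_COMPILER =    $(CROSS_COMPILE)g++
CPLUSPLUS_FLAGS =       $(COMPILE_OPTS) -Wall
LINK =                  $(CROSS_COMPILE)g++ -o
LIBRARY_LINK =          $(CROSS_COMPILE)ar cr 
```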
Hi All,
I'm working with the live555 library on an MJPEG streaming server. It's really
complicated for me to understand, even though I have read the FAQ, RFC 2035, and
the Elphel sample code.
As I understand it, what I need to do is set up the RTP, RTCP, and RTSP APIs and
provide the JPEG header and JPEG payload to feed r
Hi experts,
I am writing an MP4 streamer, which will stream MPEG-4 video and
audio (MP3, MP2, whatever) as an MPEG-2 transport stream.
I will write code to extract the elementary-stream payload from the MP4 file.
Subsequently, there are two candidate livemedia sources:
MPEG2TransportStreamFromPESSource and MPEG2TransportStreamFromESSource
Hi,
I've looked at live555, and I can run it in Linux to stream an MPEG-4
file.
Now I want to use it to stream from a media buffer that can contain three video
formats: H264/MPEG4/MJPEG.
How can I do it? Can it write the buffer to an AVI file and stream the AVI file?
Thanks very much!
Greetings,
Does anyone know if I can use the RTP stack separately with live555?
I'm going to transmit an MPEG-4 stream from one endpoint to another,
and RTSP/SIP is not involved in the transmission.
Are there any test programs for live555 for such usage?
BRs,
Sean
Hi,
I can stream H.264 from the HW encoder (DaVinci) to VLC, but I have problems
when streaming both G.711 and H.264.
I think it is related to the synchronization between my threads:
VidEnc, AudEnc, and the Live thread.
I understand that BasicTaskScheduler::SingleStep
waits for events using select, and i