Re: [Live-devel] Use RTP Separately

2009-07-15 Thread Ross Finlayson
You replied to Sean: If you're streaming MPEG-4 video via RTP, then you must use an RTSP server. It is my understanding that one purpose of RTSP when streaming MPEG4 video is to communicate the all-important SDP info to the client player. And I understand that's why Sean and I (and the whole

Re: [Live-devel] MP4 Streamer Doubt

2009-07-15 Thread Ross Finlayson
So, can I say: any source that works with MPEG2TransportStreamFromPESSource is ALWAYS applicable to MPEG2TransportStreamFromESSource? You can say that, but you would be wrong. Once again: If your input is in the form of 'PES packets' (this usually occurs only if you are demultiplexing

Re: [Live-devel] Use RTP Separately

2009-07-15 Thread Russell, Michael (mrusse05)
Hi Ross - Your response to Sean prompts me to ask for clarification for both of us. You replied to Sean: >If you're streaming MPEG-4 video via RTP, then you must use an RTSP server. It is my understanding that one purpose of RTSP when streaming MPEG4 video is to communicate the all-impo

Re: [Live-devel] MP4 Streamer Doubt

2009-07-15 Thread Woods
Hi, Thanks for your reply. So, can I say: any source that works with MPEG2TransportStreamFromPESSource is ALWAYS applicable to MPEG2TransportStreamFromESSource? Regards, Woods On Wed, Jul 15, 2009 at 9:02 PM, Ross Finlayson wrote: > Subsequently, there are two candidate livemedia sou

Re: [Live-devel] MP4 Streamer Doubt

2009-07-15 Thread Ross Finlayson
Subsequently, there are two candidate livemedia sources: MPEG2TransportStreamFromPESSource and MPEG2TransportStreamFromESSource. Which one should I use? What is their key difference? "MPEG2TransportStreamFromPESSource" should be used only when your input is already in the form of 'PES packet
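Ross's distinction between the two sources can be made concrete. A rough sketch of both paths follows (file names are placeholders and error handling is omitted; this is the usual live555 idiom, under the assumption that `*env` is an already-created `UsageEnvironment`):

```cpp
// PES path: the input is a multiplexed MPEG program stream (e.g. a .mpg
// file), so we demultiplex it but keep the PES packetization intact:
ByteStreamFileSource* ps = ByteStreamFileSource::createNew(*env, "in.mpg");
MPEG1or2Demux* demux = MPEG1or2Demux::createNew(*env, ps);
MPEG1or2DemuxedElementaryStream* pes = demux->newRawPESStream();
MPEG2TransportStreamFromPESSource* tsFromPES =
    MPEG2TransportStreamFromPESSource::createNew(*env, pes);

// ES path: the inputs are raw elementary streams (e.g. payload extracted
// from an MP4 file); frame each one, then add it to the multiplexor:
ByteStreamFileSource* ves = ByteStreamFileSource::createNew(*env, "video.m4e");
MPEG4VideoStreamFramer* video = MPEG4VideoStreamFramer::createNew(*env, ves);
MPEG2TransportStreamFromESSource* tsFromES =
    MPEG2TransportStreamFromESSource::createNew(*env);
tsFromES->addNewVideoSource(video, 4 /* mpegVersion: MPEG-4 */);
```

The PES path above mirrors the "testMPEG1or2ProgramToTransportStream" demo; for Woods's case (elementary streams extracted directly from an MP4 file), only the ES path applies.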

Re: [Live-devel] Use RTP Separately

2009-07-15 Thread Ross Finlayson
Does anyone know if I can use the RTP stack separately with live555? I'm going to transmit an MPEG4 stream from one endpoint to another, and RTSP/SIP is not involved in the transmission. If you're streaming MPEG-4 video via RTP, then you must use an RTSP server. Note our demo applications - "t
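Ross's point, per the demo he cites, is that even a multicast RTP sender runs an RTSP server so clients can obtain the SDP (which carries the MPEG-4 decoder 'config' string). A minimal sketch in the style of the "testMPEG4VideoStreamer" demo, with the multicast address, ports, and file name as placeholder assumptions:

```cpp
#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"
#include "GroupsockHelper.hh"

int main() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  // Multicast destination for the RTP packets:
  struct in_addr destAddress;
  destAddress.s_addr = our_inet_addr("239.255.42.42");
  Groupsock rtpGroupsock(*env, destAddress, Port(18888), 255 /* ttl */);
  rtpGroupsock.multicastSendOnly();

  RTPSink* videoSink = MPEG4ESVideoRTPSink::createNew(*env, &rtpGroupsock, 96);

  // The RTSP server exists (even for multicast) to hand clients the SDP:
  RTSPServer* rtspServer = RTSPServer::createNew(*env, 8554);
  ServerMediaSession* sms = ServerMediaSession::createNew(*env, "testStream");
  sms->addSubsession(PassiveServerMediaSubsession::createNew(*videoSink));
  rtspServer->addServerMediaSession(sms);

  // Feed the sink from a framed MPEG-4 elementary-stream file source:
  ByteStreamFileSource* file = ByteStreamFileSource::createNew(*env, "test.m4e");
  MPEG4VideoStreamFramer* source = MPEG4VideoStreamFramer::createNew(*env, file);
  videoSink->startPlaying(*source, NULL, NULL);

  env->taskScheduler().doEventLoop(); // does not return
  return 0;
}
```

Without the RTSP (or SDP-file) step, a receiving player has no way to learn the payload format and config data, which is why Ross says raw sender-to-receiver RTP alone is not enough for MPEG-4.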

Re: [Live-devel] MP3FileSource

2009-07-15 Thread Ross Finlayson
My application creates a FramedSource from an MPEG-1, Layer 3 (.mp3) audio file and feeds it to an input of MPEG2TransportStreamFromESSource like this: ByteStreamFileSource* audioFileSource = ByteStreamFileSource::createNew(*env, filename); FramedSource* audioES = audioFileSource; MPEG1or2Au
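The quoted snippet is cut off in the archive. As a hedged guess at the intended pipeline (the framer and multiplexor calls below are the usual live555 idiom for MP3 input, not necessarily Michael's exact code):

```cpp
// Sketch: feeding an MPEG-1 Layer 3 (.mp3) file into a transport stream.
ByteStreamFileSource* audioFileSource =
    ByteStreamFileSource::createNew(*env, filename);
FramedSource* audioES = audioFileSource;
// Frame the raw byte stream into discrete MPEG audio frames:
MPEG1or2AudioStreamFramer* framedAudio =
    MPEG1or2AudioStreamFramer::createNew(*env, audioES);
MPEG2TransportStreamFromESSource* tsSource =
    MPEG2TransportStreamFromESSource::createNew(*env);
tsSource->addNewAudioSource(framedAudio, 1 /* MPEG-1 audio, incl. Layer 3 */);
```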

Re: [Live-devel] config.uClinux config changes

2009-07-15 Thread Ross Finlayson
The config.uClinux provided in live 2009.07.09 version does not properly set the compiler options for uClinux environment. I suggest the following fixes to config.uClinux to make liveMedia Streaming server properly compile on uClinux environment. Thanks. I'll include this in the next release

[Live-devel] Some questions about MJPEG streaming

2009-07-15 Thread 俊利大大
Hi All, I'm working with the live555 library on an MJPEG streaming server. It's really complicated for me to understand, even though I've read the FAQ, RFC 2035, and the Elphel sample code. As I understand it, what I need to do is set up the RTP, RTCP, and RTSP APIs and provide the JPEG header and JPEG payload to feed r
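For MJPEG, the usual live555 approach is to subclass the abstract class "JPEGVideoSource" and feed it to a "JPEGVideoRTPSink". A hypothetical skeleton (the frame-delivery logic is a placeholder; the type/qFactor/width/height values shown are example assumptions, not required values):

```cpp
#include "JPEGVideoSource.hh"

class MyJPEGVideoSource: public JPEGVideoSource {
public:
  static MyJPEGVideoSource* createNew(UsageEnvironment& env) {
    return new MyJPEGVideoSource(env);
  }
protected:
  MyJPEGVideoSource(UsageEnvironment& env): JPEGVideoSource(env) {}

  // JPEGVideoSource requires these JPEG-header parameters (the RFC 2435
  // fields); the receiver reconstructs the JPEG headers from them:
  virtual u_int8_t type()    { return 1; }       // e.g. 4:2:0 sampling
  virtual u_int8_t qFactor() { return 75; }      // quantization quality
  virtual u_int8_t width()   { return 640 / 8; } // in 8-pixel blocks
  virtual u_int8_t height()  { return 480 / 8; }

  virtual void doGetNextFrame() {
    // Placeholder: copy one JPEG scan (entropy-coded payload, with the
    // JFIF headers stripped) into fTo, set fFrameSize and
    // fPresentationTime, then call:
    //   FramedSource::afterGetting(this);
  }
};
```

The sink side would then be `JPEGVideoRTPSink::createNew(*env, &rtpGroupsock)` with `startPlaying()` on an instance of the subclass, with RTCP and RTSP set up as in the other demos.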

[Live-devel] MP4 Streamer Doubt

2009-07-15 Thread Woods
Hi experts, I am writing an MP4 streamer, which will stream MPEG-4 video and audio (mp3, mp2, whatever) as an MPEG-2 transport stream. I will write code to extract the elementary stream payload from the MP4 file. Subsequently, there are two candidate livemedia sources: MPEG2TransportStreamFromPESSource and MPEG2

[Live-devel] How to develope a RTSP server for streaming AVI

2009-07-15 Thread huang jon
Hi, I've looked at live555, and I can run it in Linux to stream an MPEG-4 file. Now I want to use it to stream a media buffer that includes three video formats: H264/MPEG4/MJPEG. How can I do it? Can it write the buffer to an AVI file and stream the AVI file? Thanks very much!

[Live-devel] config.uClinux config changes

2009-07-15 Thread Chetan Raj
The config.uClinux provided in live 2009.07.09 version does not properly set the compiler options for uClinux environment. I suggest the following fixes to config.uClinux to make liveMedia Streaming server properly compile on uClinux environment. - Chetan ===

[Live-devel] Use RTP Separately

2009-07-15 Thread Sean
Greetings, Does anyone know if I can use the RTP stack separately with live555? I'm going to transmit an MPEG4 stream from one endpoint to another, and RTSP/SIP is not involved in the transmission. Are there any test programs for live555 for such usage? BRs, Sean

[Live-devel] MP3FileSource

2009-07-15 Thread Michael Russell
My application creates a FramedSource from an MPEG-1, Layer 3 (.mp3) audio file and feeds it to an input of MPEG2TransportStreamFromESSource like this: ByteStreamFileSource* audioFileSource = ByteStreamFileSource::createNew(*env, filename); FramedSource* audioES = audioFileSource; MPEG1or2Au

[Live-devel] HW encoder DaVinci

2009-07-15 Thread avicode
Hi, I can stream H264 from the HW encoder (DaVinci) to VLC, but I have problems when streaming both G711 and H264. I think it is related to synchronization between my threads VidEnc, AudEnc and the Live thread. I understand that BasicTaskScheduler::SingleStep waits for events using select, and i
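One common pattern for this situation (assuming a live555 version recent enough to have the "event trigger" mechanism in TaskScheduler) is to keep all liveMedia calls in the event-loop thread and have the encoder threads signal it via `triggerEvent()`, which is designed to be callable from another thread. A sketch, where the queue and source object are hypothetical:

```cpp
// Created once, from the live555 (event-loop) thread:
EventTriggerId eventTriggerId =
    env->taskScheduler().createEventTrigger(deliverFrame0);

// Handler: runs later in the event-loop thread, so it may safely touch
// liveMedia objects; pop the queued encoded frame and deliver it here.
static void deliverFrame0(void* clientData) {
  // e.g. ((MyEncoderSource*)clientData)->deliverQueuedFrame();
}

// In the VidEnc / AudEnc thread, after encoding each frame: enqueue the
// frame data, then wake the event loop (this is the only TaskScheduler
// call intended for use from another thread):
// env->taskScheduler().triggerEvent(eventTriggerId, mySourceObject);
```

This avoids having VidEnc/AudEnc call into liveMedia directly, which is unsafe because the library is single-threaded; `select()` in SingleStep simply wakes up when the trigger fires.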