[Live-devel] Combining Audio and Video Streams

2007-06-12 Thread Severin Schoepke
    doneFlag = ~0;
  } else {
    // No luck yet. Try again, after a brief delay:
    int uSecsToDelay = 100000; // 100 ms
    env->taskScheduler().scheduleDelayedTask(uSecsToDelay,
        (TaskFunc*)checkForAuxSDPLine, sink);
  }
}
Bo
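
For context, the excerpt above is the retry branch of the usual live555 aux-SDP-line polling pattern. A minimal sketch of the side that drives it, assuming the checkForAuxSDPLine() defined above and a global doneFlag watch variable (getAuxSDPLineBlocking is a hypothetical helper name, not from the original mail):

  #include "liveMedia.hh"

  // Watch variable that the excerpt above sets to ~0 once the line is ready:
  static char doneFlag = 0;

  static void checkForAuxSDPLine(void* clientData); // as defined in the excerpt above

  // Hypothetical helper: start the polling, block in the event loop until
  // the sink has produced its aux SDP line, then return that line.
  char const* getAuxSDPLineBlocking(UsageEnvironment& env, RTPSink* sink) {
    doneFlag = 0;
    checkForAuxSDPLine(sink);                    // kicks off the delayed-task polling
    env.taskScheduler().doEventLoop(&doneFlag);  // returns once doneFlag becomes non-zero
    return sink->auxSDPLine();                   // e.g. for building an SDP description
  }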

Re: [Live-devel] Problem Streaming MPEG4ES from buffer

2007-06-06 Thread Severin Schoepke
Hi Julian, I can stream ffmpeg-encoded MPEG4 buffers using an MPEG4VideoStreamFramer perfectly well. I suggest you try that! cheers, Severin  Julian Lamberty wrote: > Do I have to use "MPEG4VideoStreamFramer" instead, as the test program does?
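
For anyone hitting the same question, a minimal sketch of what Severin describes, assuming you already have a FramedSource subclass delivering ffmpeg's MPEG4 elementary-stream output (passed in here as the hypothetical parameter encodedSource) and an MPEG4ESVideoRTPSink created elsewhere:

  #include "liveMedia.hh"

  // Sketch only: encodedSource delivers ffmpeg-encoded MPEG4 ES buffers,
  // videoSink is the RTP sink the stream should be sent to.
  void startStreaming(UsageEnvironment& env, FramedSource* encodedSource,
                      MPEG4ESVideoRTPSink* videoSink) {
    // Wrap the raw elementary stream in a framer, as suggested above;
    // the framer splits it into discrete frames and sets timestamps:
    MPEG4VideoStreamFramer* framer =
        MPEG4VideoStreamFramer::createNew(env, encodedSource);

    // Start feeding framed video into the RTP sink:
    videoSink->startPlaying(*framer, NULL /*afterPlaying*/, NULL /*clientData*/);
  }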

Re: [Live-devel] Synchronizing audio and video streams

2007-06-04 Thread Severin Schoepke
Hello again! I investigated a little deeper, and here is what I came up with. Just to recap: I have two threads; one reads audio and video frames and stores them in two queues, and another reads from these queues, encodes the frames to MPEG4 and MP3, and streams them out via live555. The liv
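
As an illustration of the hand-off between the two threads, a minimal sketch of a mutex-protected frame queue (modern C++ for brevity; the Frame type and its fields are placeholders, not Severin's actual code):

  #include <condition_variable>
  #include <mutex>
  #include <queue>
  #include <vector>

  // Hypothetical raw frame as produced by the capture/generation thread.
  struct Frame {
    std::vector<unsigned char> data;
    double pts;  // capture time, used later for synchronization
  };

  // One of these per media type (one audio queue, one video queue).
  class FrameQueue {
  public:
    void push(Frame f) {
      std::lock_guard<std::mutex> lock(mutex_);
      queue_.push(std::move(f));
      cond_.notify_one();
    }

    // Blocks until a frame is available, then removes and returns it.
    Frame pop() {
      std::unique_lock<std::mutex> lock(mutex_);
      cond_.wait(lock, [this] { return !queue_.empty(); });
      Frame f = std::move(queue_.front());
      queue_.pop();
      return f;
    }

  private:
    std::mutex mutex_;
    std::condition_variable cond_;
    std::queue<Frame> queue_;
  };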

Re: [Live-devel] Synchronizing audio and video streams

2007-06-01 Thread Severin Schoepke
Hello again, I'm referring to the following mail: I'm working on an application that streams live generated content (audio and video) using the Darwin Streaming Server. I decided to use ffmpeg for the encoding part and live555 for the streaming part. The basic architecture is as follows: In a
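
Worth noting for this thread's topic: what live555 ultimately uses to synchronize separate audio and video streams is the presentation time each source stamps on its frames. A minimal sketch, assuming a custom FramedSource subclass (the class name and the omitted dequeue step are hypothetical):

  #include <sys/time.h>
  #include "FramedSource.hh"

  // Hypothetical source that hands already-encoded frames to live555.
  class QueuedFrameSource: public FramedSource {
  public:
    static QueuedFrameSource* createNew(UsageEnvironment& env) {
      return new QueuedFrameSource(env);
    }

  protected:
    QueuedFrameSource(UsageEnvironment& env): FramedSource(env) {}

    virtual void doGetNextFrame() {
      // In a real implementation: dequeue the next encoded frame, copy it
      // into fTo (clipping to fMaxSize), and set fFrameSize accordingly.
      fFrameSize = 0;  // placeholder only

      // Stamp the frame with an accurate wall-clock presentation time.
      // Using the same clock in both the audio and the video source is
      // what lets the receiver (via RTCP) line the streams up again:
      gettimeofday(&fPresentationTime, NULL);

      // Tell the framework this frame is complete:
      FramedSource::afterGetting(this);
    }
  };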

[Live-devel] Synchronizing audio and video streams

2007-05-25 Thread Severin Schoepke
Hello list, I'm working on an application that streams live generated content (audio and video) using the Darwin Streaming Server. I decided to use ffmpeg for the encoding part and live555 for the streaming part. The basic architecture is as follows: In a first thread I generate the content, a
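
On the Darwin Streaming Server side, live555 ships a DarwinInjector class (see its testMPEG4VideoToDarwin demo) for pushing an RTP/RTCP stream into the server. A rough sketch, assuming env, videoSink and videoRTCP are already set up; the server name, file name and application name below are placeholders:

  #include "liveMedia.hh"

  Boolean injectIntoDarwin(UsageEnvironment& env,
                           RTPSink* videoSink, RTCPInstance* videoRTCP) {
    // Create the injector and register our stream(s) with it:
    DarwinInjector* injector = DarwinInjector::createNew(env, "myLiveApp");
    injector->addStream(videoSink, videoRTCP);

    // Tell the Darwin Streaming Server about the session; clients can then
    // request rtsp://my.darwin.server/live.sdp as usual:
    if (!injector->setDestination("my.darwin.server", "live.sdp",
                                  "myLiveApp", "LIVE555 Streaming Media")) {
      env << "injector->setDestination() failed: " << env.getResultMsg() << "\n";
      return False;
    }
    return True;
  }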