Ross,
Don't worry about answering my last email; I was able to get the live code working with my encoder. It works great, and I appreciate all the work you have done on this project.

On another note, to you and everyone else who might see this email: does anyone have any tips for optimizing latency when streaming live video through this library? Now that I have my encoder working, I am getting video on the client machine that is delayed only about 1-1.5 seconds from real time. Any suggestions for optimizing it a little more? I am streaming to the client wirelessly, but I have already tried it over a wired LAN, with the same delay results.

Thanks to anyone who can help.

-Jordan

_____

From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]] On Behalf Of Ross Finlayson
Sent: Tuesday, July 17, 2007 12:12 PM
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-devel] Streaming with Sensoray 2250 encoded data

>I am trying to stream video from a platform that has a Sensoray 2250
>encoder installed on it. I have already verified that all of the
>hardware works and that data produced from it can be streamed using
>the live library by doing the following:
>* Created a process that grabs MPEG2 frames from the encoder and dumps
>  them to a named pipe
>* Modified testMPEG1or2VideoStreamer to look at this pipe for data, and
>  stream it to a specified IP address
>* Works fine, except that, as you can imagine, there is some major
>  delay in the video which I cannot have due to the nature of what this
>  video stream is being used for.

You can probably eliminate most of this delay by reducing the maximum buffering used by your pipe. (I don't know how you would do this, but there should be a way.)

This method - piping encoded data directly to the (modified) "testMPEG1or2VideoStreamer" - is *by far* the easiest way to get what you want. I recommend sticking with this approach if you can.
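[A later note on Ross's pipe-buffering suggestion: on Linux kernels 2.6.35 and newer (which postdate this thread), the pipe's kernel buffer size can be set directly with the F_SETPIPE_SZ fcntl. The sketch below shows the idea; the fd would be the write end of the named pipe that the frame-grabbing process feeds. This is a Linux-specific sketch, not something the library itself provides.]

```c
/* Sketch: shrink a pipe/FIFO's kernel buffer to cut end-to-end latency.
 * Linux-specific: F_SETPIPE_SZ requires kernel >= 2.6.35.
 * A smaller buffer means fewer encoded frames can sit queued between
 * the grabber process and testMPEG1or2VideoStreamer. */
#define _GNU_SOURCE
#include <fcntl.h>
#include <stdio.h>

/* Ask the kernel for a smaller pipe buffer. Returns the size actually
 * granted (the kernel rounds up to a page multiple, with a floor of
 * one page, usually 4096 bytes), or -1 on error. */
int shrink_pipe_buffer(int fd, int wanted_bytes)
{
    if (fcntl(fd, F_SETPIPE_SZ, wanted_bytes) < 0) {
        perror("F_SETPIPE_SZ");
        return -1;
    }
    return fcntl(fd, F_GETPIPE_SZ); /* size now in effect */
}
```

The default pipe buffer on Linux is 64 KiB, so shrinking it to one page removes up to ~60 KiB of queued video, which at typical MPEG-2 bitrates can be a noticeable fraction of a second.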
>I decided to move on and write a FramedSource subclass that
>encapsulates my encoder, and deliver this object directly to a
>MPEG1or2VideoRTPSink.

You should insert a "MPEG1or2VideoStreamDiscreteFramer" in front of your "MPEG1or2VideoRTPSink".

>The thing that is very odd is that it seems like my implementation of
>doGetNextFrame() is never executed.

You should first make sure that you can walk before you try to run. I suggest that you begin by trying to write your encoded data to a file, rather than streaming it over a network. I.e., start by using a "FileSink" instead of a "MPEG1or2VideoRTPSink". Once you have shown that you can successfully write your encoded data into a file (and test that the data is correct by trying to play the file in a media player), then it would make sense to move to streaming.

--
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
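[A sketch of the "walk before you run" wiring Ross describes, not compiled against any particular live555 version. "MyEncoderSource" and "readEncodedFrame()" are hypothetical placeholders for your own encoder interface; the live555 classes (FramedSource, FileSink, MPEG1or2VideoStreamDiscreteFramer) are real. Note that doGetNextFrame() is only ever invoked once a sink's startPlaying() drives the chain from the event loop, which is the usual reason it "never executes".]

```cpp
#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"
#include <sys/time.h>

// Hypothetical FramedSource subclass wrapping the Sensoray encoder.
class MyEncoderSource: public FramedSource {
public:
  static MyEncoderSource* createNew(UsageEnvironment& env) {
    return new MyEncoderSource(env);
  }

protected:
  MyEncoderSource(UsageEnvironment& env): FramedSource(env) {}

  // Called by the downstream object each time it wants a frame.
  virtual void doGetNextFrame() {
    // Placeholder: copy one encoded MPEG frame into fTo (at most
    // fMaxSize bytes) and return the number of bytes copied.
    fFrameSize = readEncodedFrame(fTo, fMaxSize);
    gettimeofday(&fPresentationTime, NULL);

    // Signal the downstream object that data is ready; live555's
    // afterGetting() delivers it via the event loop.
    FramedSource::afterGetting(this);
  }

private:
  unsigned readEncodedFrame(unsigned char* to, unsigned maxSize); // stub
};

int main() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  MyEncoderSource* source = MyEncoderSource::createNew(*env);

  // Step 1 (debugging): dump to a file instead of the network.
  MediaSink* sink = FileSink::createNew(*env, "test.mpg");
  sink->startPlaying(*source, NULL, NULL);

  // Step 2 (later): replace the FileSink with a MPEG1or2VideoRTPSink,
  // and insert a discrete framer between source and sink, e.g.:
  //   FramedSource* framer =
  //     MPEG1or2VideoStreamDiscreteFramer::createNew(*env, source);
  //   rtpSink->startPlaying(*framer, NULL, NULL);

  env->taskScheduler().doEventLoop(); // does not return
  return 0;
}
```

Once the file written by the FileSink plays correctly in a media player, swapping in the RTP sink plus discrete framer is a small change, which is exactly why the file-first test isolates encoder problems from network problems.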
_______________________________________________
live-devel mailing list
live-devel@lists.live555.com
http://lists.live555.com/mailman/listinfo/live-devel