I am trying to stream video from a platform that
has a Sensoray 2250 encoder installed on it. I
have already verified that all of the hardware
works, and that the data it produces can be
streamed using the LIVE555 library, by doing the
following:
· Created a process that grabs MPEG2
frames from the encoder and dumps them to a
named pipe
· Modified testMPEG1or2VideoStreamer to
look at this pipe for data, and stream it to a
specified IP address
· Works fine, except that, as you can
imagine, there is some major delay in the video,
which I cannot have due to the nature of what
this video stream is being used for.
You can probably eliminate most of this delay by
reducing the maximum buffering used by your pipe.
(I don't know how you would do this, but there
should be a way.)
This method - piping encoded data directly to the
(modified) "testMPEG1or2VideoStreamer" is *by
far* the easiest way to get what you want. I
recommend sticking with this approach if you can.
I decided to move on and write a FramedSource
subclass that encapsulates my encoder, and
deliver this object directly to a
MPEG1or2VideoRTPSink.
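A subclass along those lines usually has roughly the following shape (a minimal sketch against the live555 "FramedSource" API; the class name "EncoderSource" and the helper "readFrameFromEncoder" are hypothetical). Note that doGetNextFrame() is only ever invoked after a downstream object starts playing from the source:

```cpp
// Sketch of a FramedSource wrapper around a hardware encoder.
// "EncoderSource" and "readFrameFromEncoder" are hypothetical names;
// error handling is omitted for brevity.
#include "FramedSource.hh"

class EncoderSource: public FramedSource {
public:
  static EncoderSource* createNew(UsageEnvironment& env) {
    return new EncoderSource(env);
  }

protected:
  EncoderSource(UsageEnvironment& env): FramedSource(env) {}

private:
  virtual void doGetNextFrame() {
    // Read one encoded frame from the device into "fTo",
    // without exceeding "fMaxSize":
    fFrameSize = readFrameFromEncoder(fTo, fMaxSize); // hypothetical
    gettimeofday(&fPresentationTime, NULL);

    // Tell the downstream object that a frame is ready:
    FramedSource::afterGetting(this);
  }

  // Hypothetical device-read helper; returns the frame size in bytes:
  unsigned readFrameFromEncoder(unsigned char* to, unsigned maxSize);
};
```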
You should insert a
"MPEG1or2VideoStreamDiscreteFramer" in front of
your "MPEG1or2VideoRTPSink".
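In code, that wiring looks roughly like this (a sketch, assuming your FramedSource subclass is called "EncoderSource", and that "env" and "rtpGroupsock" have already been set up as in the test programs):

```cpp
// Insert a discrete framer between the encoder source and the RTP sink.
// "EncoderSource" is a hypothetical FramedSource subclass; "env",
// "rtpGroupsock", and "afterPlaying" are set up as in
// "testMPEG1or2VideoStreamer".
FramedSource* source = EncoderSource::createNew(*env);
MPEG1or2VideoStreamDiscreteFramer* framer
  = MPEG1or2VideoStreamDiscreteFramer::createNew(*env, source);
MPEG1or2VideoRTPSink* videoSink
  = MPEG1or2VideoRTPSink::createNew(*env, rtpGroupsock);

videoSink->startPlaying(*framer, afterPlaying, videoSink);
```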
The very odd thing is that my implementation of
doGetNextFrame() never seems to be executed.
You should first make sure that you can walk
before you try to run. I suggest that you begin
by trying to write your encoded data to a file
rather than streaming it over a network. I.e.,
start by using a "FileSink" instead of a
"MPEG1or2VideoRTPSink". Once you have shown that
you can successfully write your encoded data into
a file (and test that the data is correct by
trying to play the file in a media player), then
it would make sense to move to streaming.
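Concretely, that test setup is the same chain with the sink swapped out (again a sketch; "EncoderSource" stands in for your FramedSource subclass, and the output file name is arbitrary):

```cpp
// Write the encoder's output to a file instead of streaming it,
// so the data can first be checked in a media player.
FramedSource* source = EncoderSource::createNew(*env);
MPEG1or2VideoStreamDiscreteFramer* framer
  = MPEG1or2VideoStreamDiscreteFramer::createNew(*env, source);
FileSink* fileSink = FileSink::createNew(*env, "test.mpg");

fileSink->startPlaying(*framer, afterPlaying, fileSink);
env->taskScheduler().doEventLoop();
```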
--
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
_______________________________________________
live-devel mailing list
live-devel@lists.live555.com
http://lists.live555.com/mailman/listinfo/live-devel