Hi,
my name is Christopher, and I am trying to integrate RTP functionality
from your liveMedia implementation into a DirectShow filter using
Microsoft Visual Studio 2005 and the latest Windows SDK.
To give it a first try, I created an empty DS filter and copied the code
from the MPEG2TransportStre
I am currently working on a project to stream DVB-T signals. To do that, I
extract the PES packets from the MPEG-TS stream and send them to the Demux
class.
Upon investigating why the Demux sometimes runs out of buffer space (fMaxSize
becomes very small), I have come to the conclusion that *the Demux class
The problem with your code is that you are feeding a
"ByteStreamFileSource" (an unstructured byte stream) directly into a
"MPEG2TransportStreamFromESSource". Instead, you should do
ByteStreamFileSource -> MPEG1or2VideoStreamFramer ->
MPEG2TransportStreamFromESSource
i.e. insert a "MPEG1or2VideoStreamFramer" in between.
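
For illustration, a minimal sketch of that chain, assuming the liveMedia
createNew() interfaces; the input and output file names ("in.mpv", "out.ts")
are placeholders:

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"
#include <stdlib.h>

UsageEnvironment* env;

static void afterPlaying(void* /*clientData*/) {
  *env << "...done\n";
  exit(0);
}

int main() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  env = BasicUsageEnvironment::createNew(*scheduler);

  // Unstructured byte stream from an MPEG-2 elementary stream file:
  ByteStreamFileSource* fileSource =
      ByteStreamFileSource::createNew(*env, "in.mpv");

  // The framer parses the byte stream into discrete video frames:
  MPEG1or2VideoStreamFramer* framer =
      MPEG1or2VideoStreamFramer::createNew(*env, fileSource);

  // The framed ES is then multiplexed into a transport stream
  // (mpegVersion 2 => MPEG-2):
  MPEG2TransportStreamFromESSource* tsSource =
      MPEG2TransportStreamFromESSource::createNew(*env);
  tsSource->addNewVideoSource(framer, 2);

  // Write the resulting transport stream to a file:
  MediaSink* outSink = FileSink::createNew(*env, "out.ts");
  outSink->startPlaying(*tsSource, afterPlaying, NULL);

  env->taskScheduler().doEventLoop(); // does not return
  return 0;
}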
>What would I do to add an extension RTP header? I saw there are some functions
>in MultiFramedRTPSink that seem to do this (setSpecialHeaderWord /
>setSpecialHeaderBytes). Is that OK? If so, exactly where and when do I have
>to call these functions for a correct implementation?
Ramon
I'll just correct my latest post:
In order to increase the sender's maximum buffer size, one only needs to add
the line
OutPacketBuffer::maxSize = [the maximum size your program requires]
before any data sink is created.
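
For example (the value below is an arbitrary illustration; pick one at least
as large as your biggest frame):

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

void setupSink(UsageEnvironment& env, Groupsock* rtpGroupsock) {
  // Must happen before the sink below is constructed, because each
  // MediaSink sizes its OutPacketBuffer from this static at creation:
  OutPacketBuffer::maxSize = 300000; // in bytes

  RTPSink* videoSink = SimpleRTPSink::createNew(
      env, rtpGroupsock,
      33,    // RTP payload type (33 = MP2T)
      90000, // RTP timestamp frequency
      "video", "MP2T",
      1, True, False /*no 'M' bit*/);
  // ... use videoSink ...
}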
I found that I had to change the MAX_PACKET_SIZE variable in
MultiFramedRTPSource.cpp
>In the FAQ, I read that to implement live streaming you have to
>create your own FramedSource subclass to encapsulate your input
>source. In essence, in deliverFrame() you have to copy your frame
>data to fTo.
>
>What I don't understand is where the frame memory (pointed to by
>the fTo
Hello,
I am having problems trying to create a Transport Stream file from
MPEG-2 elementary video files. I have successfully converted from a
program stream using testMPEG1or2ProgramToTransportStream, and have
successfully streamed using testMPEG1or2VideoStreamer, but I would like
to send using
On the sender side, the output buffer is encapsulated within the
OutPacketBuffer class (see MediaSink.cpp). In its
constructor, a buffer (called fBuf) is allocated. It is *that* buffer that
gets passed to the input source. For example, the MultiFramedRTPSink class
passes a pointer into the fBuf buffer (by calling getNextFrame() on its
input source).
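
Schematically (a paraphrase of the pattern, not the verbatim library code),
the handoff looks like this:

#include "liveMedia.hh"

// Called back by the source once it has copied a frame into the
// buffer we passed (i.e. into the sink's fBuf):
static void afterGettingFrame(void* clientData, unsigned frameSize,
                              unsigned numTruncatedBytes,
                              struct timeval presentationTime,
                              unsigned durationInMicroseconds) {
  // ... build and send the packet ...
}

// Called back if the input source closes instead:
static void onSourceClosure(void* clientData) {
  // ... flush and stop ...
}

// How a MultiFramedRTPSink-style sink asks for the next frame:
void packOneFrame(FramedSource* source, OutPacketBuffer* outBuf) {
  source->getNextFrame(outBuf->curPtr(),              // fTo: points into fBuf
                       outBuf->totalBytesAvailable(), // becomes fMaxSize
                       afterGettingFrame, NULL,
                       onSourceClosure, NULL);
}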
I would love to be able to do this, but unfortunately this is just a small
part of a larger system, and I need to be able to call it with one command-
line argument, so I can't use a shell pipe to do this (although I do think
that your method will be an excellent way to verify that my system as a whole
Hi!!
What would I do to add an extension RTP header? I saw there are some functions
in MultiFramedRTPSink that seem to do this (setSpecialHeaderWord /
setSpecialHeaderBytes). Is that OK? If so, exactly where and when do I have
to call these functions for a correct implementation?
Thanks
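
Note that what MultiFramedRTPSink calls a "special header" is a
payload-format-specific header placed after the standard RTP header, not the
RFC 3550 header-extension field. With that caveat, a sketch of one plausible
way to use those hooks in a MultiFramedRTPSink subclass (the class name,
payload type, and header contents are made up):

#include "MultiFramedRTPSink.hh"

class MyRTPSink: public MultiFramedRTPSink {
public:
  MyRTPSink(UsageEnvironment& env, Groupsock* rtpGS)
    : MultiFramedRTPSink(env, rtpGS, 96 /*dynamic PT*/, 90000, "X-MY-FORMAT") {}

protected:
  // Reserve room in every outgoing packet; the base class calls this
  // when it sizes the packet:
  virtual unsigned specialHeaderSize() const { return 4; }

  // Called as each frame (or fragment) is packed; this is where the
  // reserved bytes get filled in:
  virtual void doSpecialFrameHandling(unsigned fragmentationOffset,
                                      unsigned char* frameStart,
                                      unsigned numBytesInFrame,
                                      struct timeval framePresentationTime,
                                      unsigned numRemainingBytes) {
    unsigned char header[4] = {0, 0, 0, 0}; // your four header bytes
    setSpecialHeaderBytes(header, sizeof header);

    // Keep the base class's bookkeeping (timestamp handling etc.):
    MultiFramedRTPSink::doSpecialFrameHandling(
        fragmentationOffset, frameStart, numBytesInFrame,
        framePresentationTime, numRemainingBytes);
  }
};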
In the FAQ, I read that to implement live streaming you have to create your
own FramedSource subclass to encapsulate your input source. In essence, in
deliverFrame() you have to copy your frame data to fTo.
What I don't understand is where the frame memory (pointed to by the fTo
pointer
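
For what it's worth, the usual answer (following the pattern in liveMedia's
DeviceSource.cpp) is that your source does not allocate that memory at all:
fTo points into a buffer owned by the downstream object, ultimately the
sink's OutPacketBuffer. A sketch, where getFrameFromMyDevice() is a
hypothetical capture function:

#include "FramedSource.hh"
#include "GroupsockHelper.hh" // for gettimeofday()
#include <string.h>

// Hypothetical capture call; returns the frame size in bytes:
extern unsigned getFrameFromMyDevice(unsigned char* dst, unsigned maxBytes);

class MyLiveSource: public FramedSource {
public:
  MyLiveSource(UsageEnvironment& env): FramedSource(env) {}

private:
  virtual void doGetNextFrame() { deliverFrame(); }

  void deliverFrame() {
    if (!isCurrentlyAwaitingData()) return; // the sink hasn't asked yet

    // fTo and fMaxSize were set by whoever called getNextFrame() on us;
    // fTo points into *their* buffer, so we only copy into it:
    unsigned char frame[100000]; // staging buffer (size is an example)
    unsigned frameSize = getFrameFromMyDevice(frame, sizeof frame);

    if (frameSize > fMaxSize) {
      fNumTruncatedBytes = frameSize - fMaxSize;
      fFrameSize = fMaxSize;
    } else {
      fNumTruncatedBytes = 0;
      fFrameSize = frameSize;
    }
    gettimeofday(&fPresentationTime, NULL);
    memmove(fTo, frame, fFrameSize);

    // Tell the downstream object that the data is in place:
    FramedSource::afterGetting(this);
  }
};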