OK, I have performed additional tests, and it seems that not all JPEG
images can be transferred with RFC 2435.
That's correct. As the RFC explains, this payload format handles
only a subset of the full JPEG standard.
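(For reference, that subset is: baseline DCT, interchange format, YUV 4:2:2 or 4:2:0 sampling, the standard Huffman tables, and dimensions that are multiples of 8 and at most 2040 pixels, because width and height travel as single bytes in 8-pixel units. If you control the encoder you can constrain it up front to stay inside the subset. A rough libjpeg sketch follows; the helper name and the choice of 4:2:0 are mine, not anything from live555:)

#include <jpeglib.h>

// Rough sketch: constrain a libjpeg compressor to what RFC 2435 can carry.
// Assumes jpeg_create_compress() has already been called on 'cinfo'.
bool restrictToRfc2435Subset(jpeg_compress_struct* cinfo,
                             unsigned width, unsigned height, int quality) {
  if (width % 8 != 0 || height % 8 != 0 || width > 2040 || height > 2040)
    return false;                      // not representable in the RTP/JPEG header
  cinfo->image_width      = width;
  cinfo->image_height     = height;
  cinfo->input_components = 3;
  cinfo->in_color_space   = JCS_RGB;
  jpeg_set_defaults(cinfo);            // baseline DCT, standard Huffman tables
  jpeg_set_quality(cinfo, quality, TRUE /* force baseline quantization */);
  // 4:2:0 chroma subsampling ("type 1" in the RTP/JPEG header)
  cinfo->comp_info[0].h_samp_factor = 2; cinfo->comp_info[0].v_samp_factor = 2;
  cinfo->comp_info[1].h_samp_factor = 1; cinfo->comp_info[1].v_samp_factor = 1;
  cinfo->comp_info[2].h_samp_factor = 1; cinfo->comp_info[2].v_samp_factor = 1;
  return true;
}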
We can see that test2_out.jpg is perfectly reconstructed (even if
there are m...)
"JPEGVideoRTPSink" (which is used by the RTP sender) doesn't
reconstruct the JPEG header. (Remember that the JPEG header is not
sent in the RTP packet.) It's the RTP *receiver*
("JPEGVideoRTPSource" in our case) that reconstructs a JPEG header
(from the parameters in the RTP payload format).
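Concretely, the parameters that "JPEGVideoRTPSink" writes into the RTP/JPEG header come from virtual functions that your "JPEGVideoSource" subclass supplies, and those are exactly what the receiver uses to synthesize a header. A rough sketch of the overrides (the class name, member variables and the particular values are just for illustration):

#include "JPEGVideoSource.hh"

class MyJPEGSource: public JPEGVideoSource {
  // ... constructor, doGetNextFrame(), etc. omitted ...

  // Parameters copied into the RTP/JPEG header by JPEGVideoRTPSink and used
  // by JPEGVideoRTPSource on the far end to rebuild a JPEG header:
  virtual u_int8_t type()    { return 1; }           // 1 = YUV 4:2:0
  virtual u_int8_t qFactor() { return 255; }         // >=128 => send quant tables in-band
  virtual u_int8_t width()   { return fWidth / 8; }  // in 8-pixel units
  virtual u_int8_t height()  { return fHeight / 8; } // in 8-pixel units
  virtual u_int8_t const* quantizationTables(u_int8_t& precision, u_int16_t& length) {
    precision = 0;          // 8-bit tables
    length    = 128;        // two 64-byte tables: luminance + chrominance
    return fQuantTables;    // copied out of each frame's DQT segments
  }

  unsigned fWidth, fHeight;
  u_int8_t fQuantTables[128];
};

With a Q factor below 128, no tables are sent at all; the receiver derives them from the Q value using the formula in the RFC.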
If I don't use live555, it works just fine: i.e. encode to a memory
block, decode from a memory block. I tried comparing the original
JPEG compressed data with the received data, and it seems that the
JPEG header is broken. Could it be that JPEGVideoRTPSink fails to
reconstruct the original JPEG header?
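One thing to keep in mind when doing that comparison: since the receiver writes its own header, the two files will never match byte for byte. What should match is the entropy-coded scan data, so it is more telling to compare only the bytes after the SOS marker, e.g. with a small helper like this (the function is mine, nothing live555-specific):

#include <cstddef>

// Return the offset of the entropy-coded scan data, i.e. the first byte
// after the SOS marker segment (0xFF 0xDA + 2-byte segment length).
static size_t scanDataOffset(const unsigned char* jpg, size_t len) {
  for (size_t i = 0; i + 3 < len; ++i) {
    if (jpg[i] == 0xFF && jpg[i + 1] == 0xDA) {
      size_t segLen = ((size_t)jpg[i + 2] << 8) | jpg[i + 3];
      return i + 2 + segLen;
    }
  }
  return len;   // no SOS marker found
}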
Don't forget to also set "fPresentationTime".
OK, I am setting it properly now, using gettimeofday(); however, why
is it needed in the case of a JPEG frame?
Because you're not streaming just one frame; you're streaming a
sequence of frames, and the receiver needs to know when to display
each one.
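In live555 terms that usually means stamping the frame inside your source's delivery code before calling afterGetting(). A rough fragment, where everything except the FramedSource members (fPresentationTime, fTo, fMaxSize, fFrameSize, fNumTruncatedBytes) and afterGetting() is made up for illustration:

#include <cstring>
#include <sys/time.h>   // gettimeofday(); see the Windows note below

void MyJPEGSource::deliverFrame() {
  // fEncodedFrame/fEncodedSize stand in for wherever the JPEG bytes of the
  // dequeued frame ended up.
  gettimeofday(&fPresentationTime, NULL);   // capture time of this frame
  if (fEncodedSize > fMaxSize) {            // never write past fTo + fMaxSize
    fNumTruncatedBytes = fEncodedSize - fMaxSize;
    fFrameSize = fMaxSize;
  } else {
    fNumTruncatedBytes = 0;
    fFrameSize = fEncodedSize;
  }
  memcpy(fTo, fEncodedFrame, fFrameSize);
  FramedSource::afterGetting(this);         // hand the completed frame to the sink
}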
Also, since my code runs on a Windows platform, I had to download a
custom implementation of gettimeofday().
I can't help you with codec
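For what it's worth, a gettimeofday() substitute for Windows doesn't need to be downloaded; a few lines on top of GetSystemTimeAsFileTime() are enough for presentation timestamps. A generic sketch (not live555 code, and the function name is mine):

#ifdef _WIN32
#include <winsock2.h>   // struct timeval
#include <windows.h>

static int my_gettimeofday(struct timeval* tv, void* /* tz unused */) {
  FILETIME ft;
  GetSystemTimeAsFileTime(&ft);                  // 100-ns ticks since 1601-01-01
  ULONGLONG t = ((ULONGLONG)ft.dwHighDateTime << 32) | ft.dwLowDateTime;
  t -= 116444736000000000ULL;                    // rebase to the Unix epoch (1970)
  tv->tv_sec  = (long)(t / 10000000ULL);
  tv->tv_usec = (long)((t % 10000000ULL) / 10ULL);
  return 0;
}
#endif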
Hello,
I am trying to stream real-time video data over RTP using the live555
library with the MJPEG format. To do so, I created a derived class of
JPEGVideoSource. Within the derived class, I have implemented the
following methods:
a) The doGetNextFrame() method, which dequeues an RGB video frame,
converts it ...
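For completeness, wiring such a source into an RTP sender is only a few more lines. A rough sketch, assuming the "MyJPEGSource" subclass sketched earlier and an arbitrarily chosen multicast destination; the exact Groupsock constructor can differ between live555 versions:

#include <liveMedia.hh>
#include <BasicUsageEnvironment.hh>
#include <GroupsockHelper.hh>

int main() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  // Destination address/port for the RTP stream (values are arbitrary here).
  struct in_addr destAddr;
  destAddr.s_addr = our_inet_addr("239.255.42.42");
  Groupsock rtpGroupsock(*env, destAddr, Port(18888), 255 /* TTL */);

  // JPEG/RTP uses the static payload type 26, so the sink takes no PT argument.
  RTPSink* videoSink = JPEGVideoRTPSink::createNew(*env, &rtpGroupsock);

  MyJPEGSource* source = MyJPEGSource::createNew(*env);   // our source class
  videoSink->startPlaying(*source, NULL, NULL);

  env->taskScheduler().doEventLoop();   // never returns
  return 0;
}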