Hi all,

I may be asking a very simple question, but I couldn't manage to figure it out by
myself, as I don't know the liveMedia library architecture deeply. I was wondering
whether there is any kind of frame-buffer implementation, or whether this is out of
liveMedia's scope.

I am thinking of moving my code to LIVE555 for an RTP streaming app
(receiver and sender sides), but one of my major concerns is the buffering:
where packets are stored (after being parsed; imagine an H.264 payload)
and where coded frames (a collection of packets) are stored. Which
classes should I look at to understand how the buffers work?

I looked at FramedSource, MediaSource, and RTPSource, but I couldn't work out
whether they actually act as a frame buffer, as I could not understand which
data units they manage.

Thanks in advance.

Kind Regards,
David
_______________________________________________
live-devel mailing list
live-devel@lists.live555.com
http://lists.live555.com/mailman/listinfo/live-devel