For H.264, the payload delivered in the afterGettingFrame() method is a NAL
unit. The big ones are (usually) the coded video frames (there are various
types depending on your encoder settings) and the small ones are (usually)
SPS and PPS NAL units, which contain various settings needed by your
decoder.
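
For illustration, here is a minimal sketch of how a sink might classify those
NAL units as they arrive. It follows the DummySink pattern from
testRTSPClient.cpp (fReceiveBuffer, continuePlaying()); the MyH264Sink class
name and the switch cases are assumptions added for this example, not part of
Live555:

void MyH264Sink::afterGettingFrame(unsigned frameSize,
                                   unsigned /*numTruncatedBytes*/,
                                   struct timeval /*presentationTime*/,
                                   unsigned /*durationInMicroseconds*/) {
  // Live555 strips the 00 00 00 01 start code, so fReceiveBuffer[0] is the
  // NAL unit header; the low five bits give the H.264 NAL unit type.
  u_int8_t nalType = fReceiveBuffer[0] & 0x1F;
  switch (nalType) {
    case 7: /* SPS: cache it; the decoder needs it before any frame */ break;
    case 8: /* PPS: cache it as well */ break;
    case 5: /* IDR slice: a coded keyframe */ break;
    default: /* other slice types, SEI, etc. */ break;
  }
  // ... prepend {0x00,0x00,0x00,0x01} and hand the NAL unit (frameSize
  // bytes) to your decoder here ...

  continuePlaying(); // ask Live555 for the next frame
}
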
Barry, thank you very much for your reply. What you have described here is
exactly the model I have followed, but the crux of the issue I am trying to
solve involves precisely what you are touching on. In short, what I have is:
H.264 video -> RTSP -> RTSPClient -> MediaSink subclass, exactly as you
describe.

I am working on an iOS project similar to what you describe: H.264 video +
AAC audio with Live555 and FFmpeg handling RTSP and video decoding
(respectively). I recommend basing your work on testRTSPClient.cpp, with
an Objective-C++ wrapper class as the interface between the rest of your
code and the underlying C++.
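
For what it's worth, a rough sketch of the shape such a wrapper might take
follows. The Live555Client class and its methods are hypothetical names
invented for this example; the Live555 calls (BasicTaskScheduler::createNew(),
BasicUsageEnvironment::createNew(), doEventLoop()) are the real API, used the
same way testRTSPClient.cpp uses them:

// Live555Client.h -- a pure Objective-C interface; no C++ leaks out, so
// ordinary .m (and Swift) files can import it.
#import <Foundation/Foundation.h>

@interface Live555Client : NSObject
- (instancetype)initWithURL:(NSString *)rtspURL;
- (void)start; // runs the Live555 event loop on a background thread
- (void)stop;  // flips the watch variable so the event loop returns
@end

// Live555Client.mm -- compiled as Objective-C++, so it can include the
// Live555 C++ headers and drive the event loop directly.
#import "Live555Client.h"
#include "BasicUsageEnvironment.hh"
#include "liveMedia.hh"

@implementation Live555Client {
  char _watchVariable; // polled by doEventLoop(); nonzero means "exit"
  NSString *_url;
}

- (instancetype)initWithURL:(NSString *)rtspURL {
  if ((self = [super init])) _url = [rtspURL copy];
  return self;
}

- (void)start {
  _watchVariable = 0;
  [NSThread detachNewThreadSelector:@selector(runEventLoop)
                           toTarget:self
                         withObject:nil];
}

- (void)stop {
  _watchVariable = 1; // doEventLoop() will notice and return
}

- (void)runEventLoop {
  TaskScheduler *scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment *env = BasicUsageEnvironment::createNew(*scheduler);
  // ... create an RTSPClient for [_url UTF8String] and send DESCRIBE,
  // exactly as testRTSPClient.cpp does ...
  env->taskScheduler().doEventLoop(&_watchVariable);
  // ... tear down the client, then reclaim env and delete scheduler ...
}
@end

The key design point is that only the .mm file ever sees C++; the header stays
plain Objective-C, so the wrapper drops cleanly into the rest of the app.
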
Hello,
I am reaching out to anyone who would be willing to give me some guidance on
a few issues I'm running into while integrating the Live555 library into an
app I'm writing. My use case is pretty simple to understand:
I am writing a mobile app (on both iOS and Android, but for now my focus is
iOS).