Re: [Live-devel] Need help: RTSP Stream -> video render

2012-02-27 Thread Barry Stump
For H.264, the payload delivered in the afterGettingFrame() method is a NAL unit. The big ones are (usually) coded video frames (there are various types, depending on your encoder settings), and the small ones are (usually) SPS and PPS NAL units, which contain various settings needed by your decoder …
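To make the distinction concrete, here is a minimal sketch of classifying NAL units by the nal_unit_type field in the first payload byte (function and argument names are illustrative). Live555 delivers the NAL unit without an Annex B start code, so the NAL header is the very first byte:

#include <cstdint>
#include <cstdio>

// Classify the NAL unit delivered by Live555 (hypothetical helper).
// The NAL header byte is: forbidden_zero_bit (1 bit),
// nal_ref_idc (2 bits), nal_unit_type (5 bits).
void classifyNalUnit(const uint8_t* payload, unsigned size) {
    if (size == 0) return;
    uint8_t nalType = payload[0] & 0x1F; // low 5 bits = nal_unit_type
    switch (nalType) {
        case 1:  printf("coded slice (non-IDR)\n"); break;
        case 5:  printf("coded slice (IDR / keyframe)\n"); break;
        case 7:  printf("SPS (sequence parameter set)\n"); break;
        case 8:  printf("PPS (picture parameter set)\n"); break;
        default: printf("other NAL unit type: %u\n", nalType); break;
    }
}

Type 5 (IDR) and type 1 (non-IDR) slices are the "big ones" carrying picture data; types 7 and 8 are the small SPS/PPS parameter sets the decoder needs before it can decode anything.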

Re: [Live-devel] Need help: RTSP Stream -> video render

2012-02-27 Thread Brad O'Hearne
Barry, Thank you very much for your reply. What you have spoken to here is exactly the model I have followed, but the crux of the issue I am trying to solve is exactly what you are touching on. In short, what I have is: H.264 video -> RTSP -> RTSPClient -> MediaSink subclass, exactly as you …
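For readers following along, the MediaSink subclass in that pipeline typically looks like the DummySink class in testRTSPClient.cpp. A condensed sketch of that pattern (the class name and buffer size are illustrative; decoder hand-off is left as a comment):

#include "liveMedia.hh"

class MySink : public MediaSink {
public:
  static MySink* createNew(UsageEnvironment& env) { return new MySink(env); }

private:
  MySink(UsageEnvironment& env) : MediaSink(env) {
    fReceiveBuffer = new u_int8_t[100000];
  }
  virtual ~MySink() { delete[] fReceiveBuffer; }

  static void afterGettingFrame(void* clientData, unsigned frameSize,
                                unsigned numTruncatedBytes,
                                struct timeval presentationTime,
                                unsigned durationInMicroseconds) {
    MySink* sink = (MySink*)clientData;
    // fReceiveBuffer now holds one complete NAL unit of frameSize bytes;
    // this is where it would be handed to the decoder.
    // Then request the next frame:
    sink->continuePlaying();
  }

  // Called by startPlaying(); asks the upstream source for the next frame.
  virtual Boolean continuePlaying() {
    if (fSource == NULL) return False;
    fSource->getNextFrame(fReceiveBuffer, 100000,
                          afterGettingFrame, this,
                          onSourceClosure, this);
    return True;
  }

  u_int8_t* fReceiveBuffer;
};

Each call to getNextFrame() delivers exactly one NAL unit into the buffer, so the sink re-arms itself by calling continuePlaying() at the end of each afterGettingFrame() callback.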

Re: [Live-devel] Need help: RTSP Stream -> video render

2012-02-27 Thread Barry Stump
I am working on an iOS project similar to what you describe: H.264 video + AAC audio, with Live555 handling RTSP and FFmpeg handling video decoding. I recommend basing your work on testRTSPClient.cpp, with an Objective-C++ wrapper class as the interface between the rest of your code and the Live555 library …
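One pitfall worth flagging when wiring Live555 output into FFmpeg: Live555 delivers each NAL unit without its Annex B start code, while FFmpeg's h264 decoder expects one, so you must prepend 00 00 00 01 before each NAL unit (including SPS/PPS). A minimal sketch, assuming FFmpeg's current send/receive decode API (the helper name is hypothetical, and codec-context setup is omitted):

extern "C" {
#include <libavcodec/avcodec.h>
}
#include <cstring>
#include <vector>

static const uint8_t kStartCode[4] = {0, 0, 0, 1};

// Hypothetical helper: feed one Live555-delivered NAL unit to the decoder.
void decodeNalUnit(AVCodecContext* ctx, AVFrame* frame,
                   const uint8_t* nal, unsigned nalSize) {
    // Re-add the Annex B start code that Live555 strips.
    std::vector<uint8_t> annexB(4 + nalSize);
    memcpy(annexB.data(), kStartCode, 4);
    memcpy(annexB.data() + 4, nal, nalSize);

    AVPacket* pkt = av_packet_alloc();
    pkt->data = annexB.data();
    pkt->size = (int)annexB.size();

    if (avcodec_send_packet(ctx, pkt) == 0) {
        // Zero or more complete frames may become available per packet.
        while (avcodec_receive_frame(ctx, frame) == 0) {
            // frame->data / frame->linesize now hold decoded YUV planes;
            // hand them to the renderer here.
        }
    }
    av_packet_free(&pkt);
}

Feeding the SPS and PPS NAL units through the same path (before the first coded slice) is usually enough for the decoder to configure itself.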

[Live-devel] Need help: RTSP Stream -> video render

2012-02-24 Thread Brad O'Hearne
Hello, I am reaching out to anyone who would be willing to give me some guidance with some issues I'm running into while getting the Live555 library integrated into an app I'm writing. My use case is pretty simple to understand: I am writing a mobile app (on both iOS and Android, but for …