Re: [Live-devel] Buffering an H264 Video Stream

2014-11-03 Thread Jeff Shanab
Security cameras keep it simple. They will not usually have bidirectionally predictive frames, as that generally takes a two-pass encoder and adds latency. Either way, that is not the concern at the streaming level, although I think live555 will reorder frames if need be. Remember that this is an RTS…

Re: [Live-devel] Buffering an H264 Video Stream

2014-11-03 Thread Mark Bondurant
As you surely must know, I'm a noob thrust unwillingly by circumstances into this. This is helpful. I don't need frames, just ten seconds of stream. But doesn't H264 have a definite beginning, with the following NAL packets updating the initial packet? That's what all those predictive slices and…

Re: [Live-devel] Buffering an H264 Video Stream

2014-11-03 Thread Jeff Shanab
Understanding the RTSPClient code is the first requirement! It succinctly and completely shows the minimum needed. "Strobe" is an interesting word, but I get what he means. It is a pull system. The FAQ has a great explanation of this. The getNextFrame calls the source to the left (with the provided c…
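
The pull model described here is the standard live555 FramedFilter pattern: the object downstream calls your doGetNextFrame(), and you in turn call getNextFrame() on the source to your left, handing it a buffer and a completion callback. A minimal pass-through sketch, assuming the usual FramedSource/FramedFilter interfaces (the class name PassThroughFilter is hypothetical, not part of the library):

    #include "FramedFilter.hh"

    class PassThroughFilter : public FramedFilter {
    public:
      static PassThroughFilter* createNew(UsageEnvironment& env, FramedSource* inputSource) {
        return new PassThroughFilter(env, inputSource);
      }

    protected:
      PassThroughFilter(UsageEnvironment& env, FramedSource* inputSource)
        : FramedFilter(env, inputSource) {}

    private:
      // Called by the object downstream of us ("to the right") when it wants data:
      virtual void doGetNextFrame() {
        // Pull from the object upstream of us ("to the left"), handing it our
        // destination buffer and a completion callback:
        fInputSource->getNextFrame(fTo, fMaxSize,
                                   afterGettingFrame, this,
                                   FramedSource::handleClosure, this);
      }

      static void afterGettingFrame(void* clientData, unsigned frameSize,
                                    unsigned numTruncatedBytes,
                                    struct timeval presentationTime,
                                    unsigned durationInMicroseconds) {
        PassThroughFilter* filter = (PassThroughFilter*)clientData;
        // A buffering filter would copy or queue the frame here.
        filter->fFrameSize = frameSize;
        filter->fNumTruncatedBytes = numTruncatedBytes;
        filter->fPresentationTime = presentationTime;
        filter->fDurationInMicroseconds = durationInMicroseconds;
        // Tell whoever called our doGetNextFrame() that a frame is ready:
        FramedSource::afterGetting(filter);
      }
    };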

Re: [Live-devel] Buffering an H264 Video Stream

2014-11-03 Thread Ross Finlayson
Also, part of the problem here, I think, is that you seem to be confused by what the class “H264VideoStreamFramer” does. This class takes as input an unstructured H.264 byte stream, and parses it into discrete H.264 NAL units. It *does not* combine multiple H.264 NAL units into a single ‘acces…
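
For reference, the framer is normally fed an unstructured byte stream and is then pulled one NAL unit at a time. A minimal sketch, assuming a raw .h264 file as input (the file name and the setup function are hypothetical):

    #include "liveMedia.hh"
    #include "BasicUsageEnvironment.hh"

    void setupFramer(UsageEnvironment& env) {
      // An unstructured H.264 byte stream (e.g. a raw .h264 file, or a pipe):
      ByteStreamFileSource* byteSource =
          ByteStreamFileSource::createNew(env, "test.h264");  // hypothetical file name
      if (byteSource == NULL) return;

      // The framer parses that byte stream into discrete NAL units, one per
      // delivered "frame".  It does NOT group NAL units into access units.
      H264VideoStreamFramer* framer =
          H264VideoStreamFramer::createNew(env, byteSource);

      // Each subsequent getNextFrame() on "framer" delivers exactly one NAL unit.
      (void)framer;
    }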

Re: [Live-devel] Buffering an H264 Video Stream

2014-11-03 Thread Ross Finlayson
> In the example program you have the ourRTSPClient with a StreamClientState object attached. You "strobe" the session object to cause the client to pump frames through.
I don’t know what you mean here. I don’t use the word “strobe” anywhere in the code or documentation. Please stop makin…

Re: [Live-devel] Buffering an H264 Video Stream

2014-11-03 Thread Mark Bondurant
You need to create a filter and insert it into the chain. I had this exact scenario, and what I had was my own filter that handled the incoming frames. All my frames were small POD classes with a bit of metadata and a buffer holding the frame. I had a pool of these of different sizes, and they…
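
A rough sketch of the kind of frame object and buffering window being described, simplified to use std::vector instead of a pre-allocated pool (all names here are hypothetical; none are live555 classes):

    #include <cstdint>
    #include <deque>
    #include <utility>
    #include <vector>
    #include <sys/time.h>

    struct BufferedFrame {
      struct timeval presentationTime;   // copied from afterGettingFrame()
      unsigned       truncatedBytes;
      std::vector<uint8_t> data;         // the NAL unit / frame bytes
    };

    class FrameRingBuffer {
    public:
      explicit FrameRingBuffer(double seconds) : fWindowSeconds(seconds) {}

      void push(BufferedFrame frame) {
        fFrames.push_back(std::move(frame));
        // Drop frames older than the buffering window (e.g. 3 seconds):
        while (!fFrames.empty() && age(fFrames.front()) > fWindowSeconds) {
          fFrames.pop_front();
        }
      }

    private:
      double age(const BufferedFrame& oldest) const {
        const struct timeval& newest = fFrames.back().presentationTime;
        return (newest.tv_sec - oldest.presentationTime.tv_sec)
             + (newest.tv_usec - oldest.presentationTime.tv_usec) / 1e6;
      }

      double fWindowSeconds;
      std::deque<BufferedFrame> fFrames;
    };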

Re: [Live-devel] Buffering an H264 Video Stream

2014-11-03 Thread Mark Bondurant
Sorry, that wasn't clear. Yes, FramedSource derives from MediaSource, which derives from Medium. RTSPClient derives from Medium. In the example program you have the ourRTSPClient with a StreamClientState object attached. You "strobe" the session object to cause the client to pump frames through…
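
For what it's worth, the thing that actually "pumps" frames in testRTSPClient is not the session object but the data sink: its startPlaying() call starts the getNextFrame() chain, and the event loop keeps it going. A stripped-down sink sketch along the lines of DummySink in testRTSPClient (the class name BufferingSink and the fixed buffer size are assumptions):

    #include "liveMedia.hh"

    class BufferingSink : public MediaSink {
    public:
      static BufferingSink* createNew(UsageEnvironment& env) {
        return new BufferingSink(env);
      }

    protected:
      BufferingSink(UsageEnvironment& env) : MediaSink(env) {
        fReceiveBuffer = new u_int8_t[100000];  // assumed maximum frame size
      }
      virtual ~BufferingSink() { delete[] fReceiveBuffer; }

    private:
      // Called by startPlaying(), and again after each frame is consumed:
      virtual Boolean continuePlaying() {
        if (fSource == NULL) return False;
        fSource->getNextFrame(fReceiveBuffer, 100000,
                              afterGettingFrame, this,
                              onSourceClosure, this);
        return True;
      }

      static void afterGettingFrame(void* clientData, unsigned frameSize,
                                    unsigned numTruncatedBytes,
                                    struct timeval presentationTime,
                                    unsigned durationInMicroseconds) {
        BufferingSink* sink = (BufferingSink*)clientData;
        // Copy fReceiveBuffer[0..frameSize) into the 3-second buffer here.
        sink->continuePlaying();  // ask for the next frame
      }

      u_int8_t* fReceiveBuffer;
    };

Such a sink is started with something like sink->startPlaying(*subsession->readSource(), afterPlaying, NULL), and frames then arrive in afterGettingFrame() for as long as the event loop runs.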

Re: [Live-devel] Buffering an H264 Video Stream

2014-11-03 Thread Jeff Shanab
You need to create a filter and insert it into the chain. I had this exact scenario, and what I had was my own filter that handled the incoming frames. All my frames were small POD classes with a bit of metadata and a buffer holding the frame. I had a pool of these of different sizes, and they were…

Re: [Live-devel] Buffering an H264 Video Stream

2014-11-03 Thread Ross Finlayson
> I need to keep a constant 3 second buffer of an H264 video stream. It's for security cameras. When something trips the camera, I replay the 3 seconds and then 6 more to see the event (One hopes I'll catch some ghosts or mountain lions, but really it's to catch car thieves!).
With mpeg i…
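
The pre-roll/post-roll behaviour in the quoted question can be sketched as plain bookkeeping on top of whatever ring buffer holds the last 3 seconds (everything below is hypothetical glue code, not live555 API):

    struct EventRecorder {
      bool   recording    = false;
      double postRollLeft = 0.0;   // seconds still to record after the trigger

      // Call this when the camera's motion/alarm input fires:
      void onTrigger(double postRollSeconds /* e.g. 6.0 */) {
        recording    = true;
        postRollLeft = postRollSeconds;
        // Flush the 3-second pre-roll ring buffer to the output here.
      }

      // Call this once per received frame, with its duration in seconds:
      void onFrame(double frameDuration) {
        if (!recording) return;    // frame only goes into the ring buffer
        // Append the frame to the output here.
        postRollLeft -= frameDuration;
        if (postRollLeft <= 0.0) recording = false;
      }
    };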

[Live-devel] Buffering an H264 Video Stream

2014-11-03 Thread Mark Bondurant
Hello. Sorry if this is a repeat, but I/we have constant email problems (political issues), which I'm fairly sure I've now found a workaround for. What I'm saying is that this may be a repeat; if it is, I apologize, I didn't get your responses. I need to keep a constant 3 second buffer of an H…