I'm sure this is a lame question, as you all are familiar with live555 and this 
probably seems everyday simple, but to me it seems somewhat insurmountable. 
What makes a framed source a framed source? If it's already a FramedSource, 
then why does it need a framer? Perhaps it's more of a "frame-able source" and 
an H264 RTSP stream isn't one? But if it isn't, why does everyone say it's 
simple to do?
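
(For reference, my rough mental model of the FramedSource contract, pieced 
together from the DeviceSource example that ships with liveMedia -- the class 
name and the empty delivery logic below are made up; only the FramedSource 
members and methods are real:)

#include "FramedSource.hh"
#include <sys/time.h>

// A "framed" source is just a FramedSource subclass: it implements
// doGetNextFrame(), copies one frame into fTo (at most fMaxSize bytes),
// sets fFrameSize and fPresentationTime, then calls afterGetting().
class MySource: public FramedSource {
public:
  static MySource* createNew(UsageEnvironment& env) { return new MySource(env); }
protected:
  MySource(UsageEnvironment& env): FramedSource(env) {}
private:
  virtual void doGetNextFrame() {
    // Placeholder delivery: a real source would copy its next frame into fTo here.
    fFrameSize = 0;
    gettimeofday(&fPresentationTime, NULL);
    FramedSource::afterGetting(this);  // hand the frame to the downstream object
  }
};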

From: live-devel [mailto:live-devel-boun...@ns.live555.com] On Behalf Of Mark 
Bondurant
Sent: Tuesday, November 04, 2014 11:16 AM
To: 'LIVE555 Streaming Media - development & use'
Subject: Re: [Live-devel] Buffering an H264 Video Stream

I get this part now, but what I still don't get is how to pass incoming data 
from RTSPClient to H264VideoStreamFramer. Its createNew() function is:

H264VideoStreamDiscreteFramer*
H264VideoStreamDiscreteFramer::createNew(UsageEnvironment& env,
                                         FramedSource* inputSource);

It takes a FramedSource as input, but RTSPClient is not a FramedSource, and I 
don't see a FramedSource anywhere in it.
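
(If it helps, in the testRTSPClient demo the FramedSource does not come from 
RTSPClient itself but from the MediaSubsession that was SET UP. A sketch, 
where the helper name "makeFramerFor" is made up and "subsession" is assumed 
to be your H.264 MediaSubsession:)

#include "liveMedia.hh"

H264VideoStreamDiscreteFramer* makeFramerFor(UsageEnvironment& env,
                                             MediaSubsession* subsession) {
  // The RTP receiver behind the subsession is itself a FramedSource,
  // reachable via MediaSubsession::readSource().
  FramedSource* nalSource = subsession->readSource();

  // Each frame it delivers is already one discrete H.264 NAL unit (no start
  // codes), which is what the *discrete* framer expects as its input:
  return H264VideoStreamDiscreteFramer::createNew(env, nalSource);
}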

From: live-devel [mailto:live-devel-boun...@ns.live555.com] On Behalf Of Ross 
Finlayson
Sent: Monday, November 03, 2014 3:35 PM
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-devel] Buffering an H264 Video Stream

Also, part of the problem here, I think, is that you seem to be confused by 
what the class “H264VideoStreamFramer” does.  This class takes as input an 
unstructured H.264 byte stream, and parses it into discrete H.264 NAL units.  
It *does not* combine multiple H.264 NAL units into a single ‘access unit’ 
(i.e., picture).  If you want to do this (or any other processing on the 
incoming H.264 NAL units), then you’ll need to do this yourself.  The LIVE555 
libraries do not contain any video ‘codec’ (i.e., decoding or encoding) 
functionality.
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
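
(To illustrate the distinction described above: the non-discrete framer is 
meant for an unstructured Annex-B byte stream, e.g. one read from a file. A 
sketch -- the helper name and the file name "test.264" are placeholders:)

#include "liveMedia.hh"

FramedSource* makeByteStreamFramer(UsageEnvironment& env) {
  // H264VideoStreamFramer parses an unstructured Annex-B byte stream
  // (here, a hypothetical raw .264 file) into discrete NAL units...
  ByteStreamFileSource* byteStream =
      ByteStreamFileSource::createNew(env, "test.264");
  // ...but it does NOT group those NAL units into access units (pictures);
  // any such grouping would need a custom filter of your own.
  return H264VideoStreamFramer::createNew(env, byteStream);
}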
