> > While the H264LiveServerMediaSession constructor is called once,
> > createNewRTPSink/createNewStreamSource are called twice (?!) upon an RTSP
> > connection. Maybe this is the real reason for the
> > "FramedSource[0x1bf42e0]::getNextFrame():
> > attempting to read more" message?
>
> No, I don't think so, because between the two calls to 
> "createNewStreamSource()", there should be a call to 
> "~H264VideoStreamDiscreteFramer()" (i.e., to close the object that was 
> created by the first "createNewStreamSource()").  This is the case because you 
> set the "reuseFirstSource" parameter to True in your call to the 
> "OnDemandServerMediaSubsession" constructor.

OK, then how can I debug/fix this message and the SIGSEGV further? It seems
that nobody is using the source twice, but the bug is still present.
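(For anyone following along: the "reuseFirstSource" behaviour described above is selected in the subsession's constructor. A minimal sketch, assuming "H264LiveServerMediaSession" derives directly from "OnDemandServerMediaSubsession"; the exact base-class constructor signature may vary between live555 versions:)

```cpp
H264LiveServerMediaSession::H264LiveServerMediaSession(UsageEnvironment& env)
  // The second argument is "reuseFirstSource".  With True, all clients
  // share a single input source, so the framer created by the first
  // "createNewStreamSource()" call is closed before the second call.
  : OnDemandServerMediaSubsession(env, True /*reuseFirstSource*/) {
}
```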

> >> If you know the SPS and PPS NAL units ahead of time, then you can add
> >> them as (otherwise optional) parameters to the call to
> >> "H264VideoRTPSink::createNew()" in "playCommon.cpp".
> >
> Are you *sure* that there was no "sprop-parameter-sets=" assignment in a
> "a=fmtp:" line in the SDP description returned by SIP?  If there was,
> then it should end up in the "MediaSubsession" object.

Definitely sure, as playSIP prints all the SIP messages to stderr:

...
*** continueAfterDESCRIBE
Opened URL "sip:100@10.1.1.208", returning a SDP description:
v=0
o=- 2849339286 1 IN IP4 10.1.1.208
s=playSIP session
c=IN IP4 10.1.1.208
t=0 0
m=audio 8010 RTP/AVP 0
m=video 8000 RTP/AVP 99
b=AS:224
a=rtpmap:99 H264/90000
a=fmtp:99 profile-level-id=42801F; packetization-mode=1
....

My guess is that sprop-parameter-sets is provided by live cameras only.

> But otherwise, your RTSP server is not going to work properly (or, more
> accurately, it will work, but clients won't be able to play the stream
> because they won't know the SPS and PPS NAL units before they start
> playing the stream).  Is it the case that the SPS and PPS NAL units
> don't appear in the SIP SDP, but instead appear inline (e.g.,
> periodically) in the H.264 stream?  If that's the case, then you'll have
> to implement the "getAuxSDPLine()" virtual function in your
> "H264LiveServerMediaSession" class.  For guidance on how to do this, see
> how we implement the "H264VideoFileServerMediaSubsession" class.

OK, I will try to implement the same in H264LiveServerMediaSession.
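In case it helps others, here is a sketch of what that might look like, modeled on the "H264VideoFileServerMediaSubsession" pattern: the RTPSink is "played" from the live source until its "auxSDPLine()" becomes non-NULL, i.e., until SPS/PPS have actually been seen inline in the stream. The member and helper names ("fAuxSDPLine", "fDoneFlag", "fDummyRTPSink", "setDoneFlag()", "nextTask()") are assumptions copied from the file-subsession code, and details may differ between live555 versions:

```cpp
// Sketch only: assumes H264LiveServerMediaSession derives from
// OnDemandServerMediaSubsession and declares these (assumed) members:
//   char* fAuxSDPLine; char fDoneFlag; RTPSink* fDummyRTPSink;

static void afterPlayingDummy(void* clientData) {
  ((H264LiveServerMediaSession*)clientData)->afterPlayingDummy1();
}

void H264LiveServerMediaSession::afterPlayingDummy1() {
  // The source ended before an aux SDP line was seen; stop waiting.
  envir().taskScheduler().unscheduleDelayedTask(nextTask());
  setDoneFlag();
}

static void checkForAuxSDPLine(void* clientData) {
  ((H264LiveServerMediaSession*)clientData)->checkForAuxSDPLine1();
}

void H264LiveServerMediaSession::checkForAuxSDPLine1() {
  char const* dasl;
  if (fAuxSDPLine != NULL) {
    setDoneFlag(); // we already have it
  } else if (fDummyRTPSink != NULL &&
             (dasl = fDummyRTPSink->auxSDPLine()) != NULL) {
    // The sink has now seen SPS/PPS in the stream:
    fAuxSDPLine = strDup(dasl);
    fDummyRTPSink = NULL;
    setDoneFlag();
  } else if (!fDoneFlag) {
    // Not ready yet; poll again shortly:
    int uSecsToDelay = 100000; // 100 ms
    nextTask() = envir().taskScheduler().scheduleDelayedTask(
        uSecsToDelay, (TaskFunc*)checkForAuxSDPLine, this);
  }
}

char const* H264LiveServerMediaSession::getAuxSDPLine(
    RTPSink* rtpSink, FramedSource* inputSource) {
  if (fAuxSDPLine != NULL) return fAuxSDPLine; // already computed

  if (fDummyRTPSink == NULL) {
    // Start 'playing' the live source into the sink until its
    // "auxSDPLine()" becomes non-NULL:
    fDummyRTPSink = rtpSink;
    fDummyRTPSink->startPlaying(*inputSource, afterPlayingDummy, this);
    checkForAuxSDPLine(this);
  }

  envir().taskScheduler().doEventLoop(&fDoneFlag);
  return fAuxSDPLine;
}
```

The "doEventLoop(&fDoneFlag)" call blocks handling of this RTSP request until the watch variable is set, which is why "checkForAuxSDPLine1()" re-schedules itself on a timer rather than looping.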

Thanks, Rus

>
>
> Ross Finlayson
> Live Networks, Inc.
> http://www.live555.com/
>
>
> _______________________________________________
> live-devel mailing list
> live-devel@lists.live555.com
> http://lists.live555.com/mailman/listinfo/live-devel
>