Hello,

I'm creating a live-streaming library and I'm having trouble setting up an
H.264 video stream. I implemented a MediaSource class that wraps the MS Media
Foundation H.264 encoder and plugged it into my custom subsession, like this:

CIMediaSource* source = CIMediaSource::createNew(envir(), FrameProvider);
UINT32 cbSpsSize, cbPpsSize;
UINT8* pcbSpsBuffer = NULL;
UINT8* pcbPpsBuffer = NULL;
source->GetSPSandPPS(&cbSpsSize, &pcbSpsBuffer, &cbPpsSize, &pcbPpsBuffer);

H264VideoStreamDiscreteFramer* framer =
    H264VideoStreamDiscreteFramer::createNew(envir(), source);
framer->setSPSandPPS(pcbSpsBuffer, cbSpsSize, pcbPpsBuffer, cbPpsSize);
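
In case it matters: the SPS/PPS buffers above come from the encoder's Annex-B
sequence header. Here is a simplified, self-contained sketch of how such a
header can be split into NAL units (SplitAnnexB is a hypothetical helper of
mine, not part of live555 or Media Foundation):

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Hypothetical helper: splits an Annex-B byte stream -- e.g. an encoder's
// sequence header containing SPS and PPS -- into individual NAL units.
// Handles both 3-byte (00 00 01) and 4-byte (00 00 00 01) start codes.
static std::vector<std::vector<uint8_t>> SplitAnnexB(const uint8_t* data, size_t size)
{
    auto isStartCode = [&](size_t i) {
        return i + 3 <= size && data[i] == 0 && data[i + 1] == 0 &&
               (data[i + 2] == 1 ||
                (i + 4 <= size && data[i + 2] == 0 && data[i + 3] == 1));
    };

    std::vector<std::vector<uint8_t>> nals;
    size_t i = 0;
    while (i + 3 <= size) {
        if (!isStartCode(i)) { ++i; continue; }
        size_t nalStart = i + (data[i + 2] == 1 ? 3 : 4);
        size_t j = nalStart;
        while (j + 3 <= size && !isStartCode(j)) ++j;  // scan to next start code
        size_t nalEnd = isStartCode(j) ? j : size;     // or to end of buffer
        nals.emplace_back(data + nalStart, data + nalEnd);
        i = nalEnd;
    }
    return nals;
}
```

The NAL whose type byte satisfies (nal[0] & 0x1F) == 7 is the SPS, and type 8
is the PPS; note that live555 expects the NALs without the start-code prefixes.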

The problem is that my SPS/PPS NALs are never used in the SDP. After some
debugging, I reached the place where the session description is created, in
H264VideoRTPSink:

  if (sps == NULL || pps == NULL) {
    // We need to get SPS and PPS from our framer source:
    if (fOurFragmenter == NULL) return NULL; // we don't yet have a fragmenter (and therefore not a source)
    H264VideoStreamFramer* framerSource
      = (H264VideoStreamFramer*)(fOurFragmenter->inputSource());
    if (framerSource == NULL) return NULL; // we don't yet have a source

    framerSource->getSPSandPPS(sps, spsSize, pps, ppsSize);
    if (sps == NULL || pps == NULL) return NULL; // our source isn't ready
  }


Here, framerSource->getSPSandPPS() is never called, because at this point
fOurFragmenter is NULL.

On the other hand, when I create the RTP sink with the SPS/PPS passed in
directly, the stream works fine:

return H264VideoRTPSink::createNew(envir(), rtpGroupsock,
    rtpPayloadTypeIfDynamic, SPS, SPSSize, PPS, PPSSize); // THIS ONE WORKS


Is there anything I'm missing in the setup? How can I properly set the SPS/PPS
NALs for the stream?

Any help appreciated!

Best regards,
Pawel

_______________________________________________
live-devel mailing list
live-devel@lists.live555.com
http://lists.live555.com/mailman/listinfo/live-devel
