Sir, I have made an application that receives data from a camera via a frame-grabber board. The board converts the .ts data and gives us H.264 frames inside an MPEG-2 Transport Stream. I have allocated buffers for the board; I take data from those buffers and feed it to the LIVE555 libraries for streaming, as shown below:
#define TRANSPORT_PACKET_SIZE 188
#define TRANSPORT_PACKETS_PER_NETWORK_PACKET 7

int main() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  env = BasicUsageEnvironment::createNew(*scheduler);

  // Create 'groupsocks' for RTP and RTCP:
  char const* destinationAddressStr = "192.168.15.196";
  struct in_addr destinationAddress;
  destinationAddress.s_addr = our_inet_addr(destinationAddressStr);
  //destinationAddress.s_addr = chooseRandomIPv4SSMAddress(*env);
  // Note: This is a multicast address.

  const unsigned short rtpPortNum = 18888;
  const unsigned short rtcpPortNum = rtpPortNum + 1;
  const unsigned char ttl = 7; //255;

  const Port rtpPort(rtpPortNum);
  const Port rtcpPort(rtcpPortNum);

  Groupsock rtpGroupsock(*env, destinationAddress, rtpPort, ttl);
  rtpGroupsock.multicastSendOnly(); // we're a SSM source
  Groupsock rtcpGroupsock(*env, destinationAddress, rtcpPort, ttl);
  rtcpGroupsock.multicastSendOnly(); // we're a SSM source

  // Create an appropriate 'RTP sink' from the RTP 'groupsock':
  videoSink = SimpleRTPSink::createNew(*env, &rtpGroupsock, 33, 90000,
                                       "video", "MP2T", 1,
                                       True, False /*no 'M' bit*/);

  // Create (and start) a 'RTCP instance' for this RTP sink:
  const unsigned estimatedSessionBandwidth = 5000; // in kbps; for RTCP b/w share
  const unsigned maxCNAMElen = 100;
  unsigned char CNAME[maxCNAMElen + 1];
  gethostname((char*)CNAME, maxCNAMElen);
  CNAME[maxCNAMElen] = '\0'; // just in case
  ...
}

void play() {
  // Open the input (the frame-grabber card) as a byte-stream source:
  fi_params.nFICardFrameSize =
      TRANSPORT_PACKETS_PER_NETWORK_PACKET * TRANSPORT_PACKET_SIZE;
  fi_params.p_lm_lock_fn = lm_lock_fn;
  fi_params.p_lm_unlock_fn = lm_unlock_fn;

  DeviceParameters temp;

  fileSource = DeviceSourceFICard::createNew(*env, fi_params, temp);
  if (fileSource == NULL) {
    *env << "Unable to open the frame-grabber card as a byte-stream source\n";
    exit(1);
  }
  FramedSource* videoES = fileSource;

  // Create a framer for the Video Elementary Stream:
  videoSource = MPEG1or2VideoStreamDiscreteFramer::createNew(*env, videoES);

  // Finally, start playing:
  *env << "Beginning to read from file...\n";
  videoSink->startPlaying(*videoSource, afterPlaying, videoSink);
}

On the server side I used your "testOnDemandRTSPServer" executable, specifically the code under the section:

  // A MPEG-2 Transport Stream, coming from a live UDP (raw-UDP or RTP/UDP) source:

Sir, where should I now make changes: on the streamer side or on the server side?

On Mon, Jun 25, 2012 at 1:34 PM, Ketan Gholap <ketangholap1...@gmail.com> wrote:
>
> ---------- Forwarded message ----------
> From: Ross Finlayson <finlay...@live555.com>
> Date: Sun, Jun 24, 2012 at 2:26 PM
> Subject: Re: [Live-devel] Missing sync byte!
> To: LIVE555 Streaming Media - development & use <live-de...@ns.live555.com>
>
> You haven't said anything about how your application is constructed, and
> what it's supposed to do, but this error message is quite specific:
>
>   MultiFramedRTPSource::doGetNextFrame1(): The total received frame size
>   exceeds the client's buffer size (6).
>
> This means that whatever object you're feeding your "MultiFramedRTPSource"
> (subclass) object into has a (much) too small buffer size.
>
> Ross Finlayson
> Live Networks, Inc.
> http://www.live555.com/
>
> _______________________________________________
> live-devel mailing list
> live-devel@lists.live555.com
> http://lists.live555.com/mailman/listinfo/live-devel