Hi, I'm facing this problem: my app, which grabs webcam screenshots and encodes them in H.263, works and feeds a live stream into a UDP socket (addr 127.0.0.1, port 1234). I don't know why the byte stream doesn't seem to be arriving at the live555 server. Could anyone help me or give any hints? I've put my server code below.
#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"
#include "GroupsockHelper.hh"

FramedSource* videoSource;
void afterPlaying(void*);

int main() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  // Create our sink variables...
  char const* destinationAddrStr = "192.168.55.102";
  const unsigned char ttl = 1;
  const Port rtpPort(8888);
  struct in_addr destinationAddress;
  destinationAddress.s_addr = our_inet_addr(destinationAddrStr);
  Groupsock rtpGroupsock(*env, destinationAddress, rtpPort, ttl);

  // Create our sink...
  //RTPSink* videoSink = MPEG1or2VideoRTPSink::createNew(*env, &rtpGroupsock);
  RTPSink* videoSink = H263plusVideoRTPSink::createNew(*env, &rtpGroupsock, 56, 0);
  if (videoSink == NULL) {
    *env << "Unable to create sink\n";
    exit(1);
  }
  *env << "RTPSink created\n";

  // Create our source variables...
  char const* srcAddrStr = "127.0.0.1";
  struct in_addr srcAddress;
  srcAddress.s_addr = our_inet_addr(srcAddrStr);
  const Port udpPort(1234);
  Groupsock udpGroupsock(*env, srcAddress, udpPort, ttl);

  // Create our source...
  FramedSource* input = BasicUDPSource::createNew(*env, &udpGroupsock);
  if (input == NULL) {
    *env << "Unable to open source\n";
    exit(1);
  }
  *env << "Input created\n";

  // Create our framer...
  videoSource = H263plusVideoStreamFramer::createNew(*env, input);
  *env << "framer created\n";

  // Start to stream the data...
  videoSink->startPlaying(*videoSource, afterPlaying, NULL);
  env->taskScheduler().doEventLoop();
  *env << "END";
  return 0;
}

void afterPlaying(void*) {
  Medium::close(videoSource);
}

Thanks a lot for the kind help!

On Thu, Jul 30, 2009 at 7:41 PM, Ross Finlayson <finlay...@live555.com> wrote:

>> I've an application which gets images from a webcam, then live encode them.
>> I need to live stream this to a mobile device via a live555 server.
>> How can I manage this input stream (based on socket comm between the two
>> apps) in live555? Could BasicUDPSource be the class which reads from the
>> buffer?
>
> It could, if your input is UDP packets. However, it'd be far better to
> have your input source be just an unstructured byte stream (i.e., a device
> file, a pipe, or a TCP connection). Then you could use
> "H263plusVideoStreamFramer".
>
>> May I use H263plusVideoStreamFramer as a framer for the H263 stream?
>
> If (and only if) your input is a byte stream. If, instead, it's a sequence
> of discrete frames, then you would need a new class
> "H263plusVideoStreamDiscreteFramer" (which you would need to write) instead.
> --
>
> Ross Finlayson
> Live Networks, Inc.
> http://www.live555.com/
> _______________________________________________
> live-devel mailing list
> live-devel@lists.live555.com
> http://lists.live555.com/mailman/listinfo/live-devel
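The distinction Ross draws matters here: a *stream* framer like H263plusVideoStreamFramer must discover frame boundaries itself by scanning the incoming bytes for H.263 picture start codes, while a *discrete* framer is handed exactly one complete frame per delivery. A rough sketch of what byte-aligned start-code scanning looks like (the real framer also handles non-byte-aligned codes and parses the picture header; these function names are illustrative, not live555 APIs):

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// An H.263 picture start code (PSC) is the 22-bit pattern
// 0000 0000 0000 0000 1000 00. When it happens to be byte-aligned,
// it appears as 0x00 0x00 followed by a byte whose top six bits
// are 100000, i.e. a byte in the range 0x80..0x83.
static bool isByteAlignedPSC(const uint8_t* p) {
  return p[0] == 0x00 && p[1] == 0x00 && (p[2] & 0xFC) == 0x80;
}

// Return the byte offsets of byte-aligned PSCs in a buffer: the
// candidate frame boundaries a stream framer would have to find
// on its own in an unstructured byte stream.
std::vector<size_t> findFrameBoundaries(const std::vector<uint8_t>& buf) {
  std::vector<size_t> offsets;
  if (buf.size() < 3) return offsets;
  for (size_t i = 0; i + 2 < buf.size(); ++i) {
    if (isByteAlignedPSC(&buf[i])) offsets.push_back(i);
  }
  return offsets;
}
```

With BasicUDPSource the input is *not* such a byte stream: each UDP packet arrives as one discrete unit, which is why the stream framer downstream can misbehave and why Ross suggests either a byte-stream input or a discrete framer.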
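For completeness, if the encoder side keeps feeding 127.0.0.1:1234 over UDP, it should send exactly one complete encoded frame per datagram, since BasicUDPSource hands each received packet downstream as one unit. A minimal POSIX sketch of that sender (the helper names and the frame bytes are placeholders for real encoder output, not part of live555):

```cpp
#include <arpa/inet.h>
#include <cstdint>
#include <cstring>
#include <sys/socket.h>
#include <unistd.h>

// Create a UDP socket and fill in the destination address used by
// the server's BasicUDPSource (127.0.0.1:1234, matching udpPort in
// the server code above). Returns the socket fd, or -1 on error.
int makeDest(sockaddr_in& dest) {
  std::memset(&dest, 0, sizeof dest);
  dest.sin_family = AF_INET;
  dest.sin_port = htons(1234);  // udpPort in the server code
  inet_pton(AF_INET, "127.0.0.1", &dest.sin_addr);
  return socket(AF_INET, SOCK_DGRAM, 0);
}

// Send one encoded frame as a single UDP datagram, one frame per
// sendto() call. Returns bytes sent, or -1 on error.
ssize_t sendFrame(int sock, const sockaddr_in& dest,
                  const uint8_t* frame, size_t len) {
  return sendto(sock, frame, len, 0,
                reinterpret_cast<const sockaddr*>(&dest), sizeof dest);
}
```

That said, per Ross's advice the more robust design is to drop UDP entirely and feed the server a byte stream (pipe, device file, or TCP connection) so the standard framer can do its job.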