Re: [Live-devel] Live555 Library build for iOS 5.1
Any reply on this?

- Original Message -
Sent: Thursday, April 19, 2012 10:17 PM
Subject: [Live-devel] Live555 Library build for iOS 5.1

Hi,

I am trying to build LIVE555 for iOS 5.1. I have updated the config.iphoneos file for iOS 5.1 (config.iphoneos attached), but I am getting the following make error:

/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/usr/bin/arm-apple-darwin10-llvm-g++-4.2 -c -Iinclude -I../UsageEnvironment/include -I../groupsock/include -I. -DBSD=1 -O2 -DSOCKLEN_T=socklen_t -DHAVE_SOCKADDR_LEN=1 -D_LARGEFILE_SOURCE=1 -D_FILE_OFFSET_BITS=64 -fPIC -arch armv7 --sysroot=/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS5.1.sdk -Wall Media.cpp
cc1plus: error: unrecognized command line option "-arch"
make[1]: *** [Media.o] Error 1
make: *** [all] Error 2

What may be the issue? Thanks in advance.
Re: [Live-devel] Live555 Library build for iOS 5.1
Hi,

I have updated config.iphoneos (attached) and am now able to build the static libraries for iPhoneOS.

Regards,
KP
Re: [Live-devel] Streaming over Wi-Fi with no receiver
Hi Ross,

We are using the LIVE555 library for H.264 video streaming over Wi-Fi. If the transmitter is disconnected or turned off, how will the receiver come to know? Are there any events/signals that provide that information? We used the testRTSPClient code as a reference on the receiver side.

Regards,
KP
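A receiver built on testRTSPClient typically learns of a vanished sender in one of two ways: via an RTCP "BYE" (if the sender shuts down cleanly), or by noticing that no data has arrived for a while. Below is a minimal sketch of both, assuming a testRTSPClient-style client; the names checkLiveness and sLastFrameTime, and the 5-second threshold, are illustrative choices, not LIVE555 APIs.

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"
#include "GroupsockHelper.hh" // provides gettimeofday() on all platforms

static struct timeval sLastFrameTime; // update this in your sink's afterGettingFrame()

// 1) React to an RTCP "BYE" when the sender tears the stream down cleanly:
static void subsessionByeHandler(void* clientData) {
  MediaSubsession* subsession = (MediaSubsession*)clientData;
  // treat this subsession as ended; tear down or reconnect as appropriate
}
// ...after SETUP, for each subsession:
//   if (subsession->rtcpInstance() != NULL)
//     subsession->rtcpInstance()->setByeHandler(subsessionByeHandler, subsession);

// 2) Poll for silence, to catch a sender that disappears without sending a BYE:
static void checkLiveness(void* clientData) {
  UsageEnvironment* env = (UsageEnvironment*)clientData;
  struct timeval now;
  gettimeofday(&now, NULL);
  if (now.tv_sec - sLastFrameTime.tv_sec > 5) { // no frames for ~5 seconds
    *env << "No data received recently; assuming the transmitter is gone\n";
    // shut down the stream here, as testRTSPClient's shutdownStream() does
  } else {
    env->taskScheduler().scheduleDelayedTask(1000000, checkLiveness, env);
  }
}
// kick this off once, after PLAY succeeds:
//   env->taskScheduler().scheduleDelayedTask(1000000, checkLiveness, env);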
[Live-devel] FramedSource::getNextFrame error while streaming PCM frames
Hi Ross,

I have problems streaming live PCM audio. The audio comes directly from a microphone (16-bit little-endian), sampling frequency 8 kHz, mono (1 channel). I receive a buffer in a separate thread and use an event trigger to signal my live555 thread. I've created a class based on DeviceSource that inherits from AudioInputDevice and delivers the frame on the trigger. I am using uLawFromPCMAudioSource to convert to 8-bit u-law audio.

I am getting the following error if I set the audio format to WA_PCM:

FramedSource::getNextFrame(): attempting to read more than once at the same time!

One thing I observed is that FramedSource::getNextFrame() is being called twice at a time (uLawFromPCMAudioSource calls it again). If I change the audio format to WA_PCMU, I am able to stream without any error (FramedSource::getNextFrame() is called only once at a time), and VLC is also able to play it, with some noise.

Where am I going wrong? Thanks in advance.

Regards,
Krishna
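For context, the pipeline being described follows the testWavAudioStreamer pattern, roughly as in the sketch below. This is illustrative only: the names wavSource and rtpGroupsock, and the CONVERT_TO_ULAW macro, mirror the WaveStreamer.cpp code posted later in this thread, and the poster's actual wiring may differ.

#include "liveMedia.hh"
#include "WAVSource.hh" // the poster's DeviceSource-based class (assumed header name)

static void afterPlaying(void* /*clientData*/) { /* the stream has ended */ }

static void startAudioStreaming(UsageEnvironment& env, WAVSource* wavSource,
                                Groupsock* rtpGroupsock) {
  FramedSource* source = wavSource; // DeviceSource-based: 16-bit PCM, 8 kHz, mono
#ifdef CONVERT_TO_ULAW
  // Insert the u-law conversion filter in front of the raw PCM source:
  source = uLawFromPCMAudioSource::createNew(env, source, 1 /*16-bit little-endian input*/);
#endif
  RTPSink* audioSink = SimpleRTPSink::createNew(env, rtpGroupsock,
                                                0 /*static RTP payload type for PCMU*/,
                                                8000 /*RTP timestamp frequency*/,
                                                "audio", "PCMU", 1 /*channel*/);
  audioSink->startPlaying(*source, afterPlaying, NULL);
}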
Re: [Live-devel] FramedSource::getNextFrame error while streaming PCM frames
Hi Ross,

I found that uLawFromPCMAudioSource's afterGettingFrame() is not being called when I use the DeviceSource-based design with the trigger concept. That is:

- If I call FramedSource::afterGetting(this) in doGetNextFrame() itself, it calls afterGettingFrame() in uLawFromPCMAudioSource, followed by afterGettingFrame() in MultiFramedRTPSink.
- If I call FramedSource::afterGetting(this) in deliverFrame() (which is called by the trigger event), it calls only afterGettingFrame() in MultiFramedRTPSink, and not uLawFromPCMAudioSource's afterGettingFrame().

That's why I am getting "FramedSource::getNextFrame(): attempting to read more than once at the same time."

Where am I going wrong? Can you please help with that? Thanks in advance.
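For reference, the call site being discussed is the one in the standard "DeviceSource" pattern that ships with LIVE555 (DeviceSource.cpp): FramedSource::afterGetting(this) is normally the last thing done in the delivery routine invoked by the triggered event. A rough sketch of that pattern, written here against the poster's WAVSource class; frameIsAvailable() is an illustrative helper, not a LIVE555 API, and the real class needs the usual declarations:

void WAVSource::doGetNextFrame() {
  // If a frame is already waiting, deliver it immediately; otherwise just return,
  // and delivery will happen when the event trigger fires deliverFrame0() below.
  if (frameIsAvailable()) deliverFrame();
}

void WAVSource::deliverFrame0(void* clientData) { // registered via triggerEvent()
  ((WAVSource*)clientData)->deliverFrame();
}

void WAVSource::deliverFrame() {
  if (!isCurrentlyAwaitingData()) return; // the downstream object hasn't asked for data yet
  // ...copy at most fMaxSize bytes of the new frame into fTo,
  //    and set fFrameSize, fPresentationTime (and fDurationInMicroseconds)...
  FramedSource::afterGetting(this); // must be the last statement executed here
}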
Re: [Live-devel] FramedSource::getNextFrame error while streaming PCM frames
Hi Ross,

I have attached:
1. my device source file, Wavsource.cpp
2. WaveStreamer.cpp (which took testWavAudioStreamer.cpp as a reference), where I have the thread that reads the samples and the code for initialization and starting the session.

Regards

From: Ross Finlayson
Sent: Thursday, October 24, 2013 6:23 PM
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-devel] FramedSource::getNextFrame error while streaming PCM frames

I can't tell what's wrong without seeing your code. Please post the code for your "OnDemandServerMediaSubsession" subclass, and for your "DeviceSource"-based class.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

WaveStreamer.cpp (as attached):

#include "liveMedia.hh"
#include "GroupsockHelper.hh"
#include "BasicUsageEnvironment.hh"
#include "pthread.h"
#include "WAVSource.hh"

// To convert 16-bit samples to 8-bit u-law ("u" is the Greek letter "mu")
// encoding, before streaming, uncomment the following line:
#define CONVERT_TO_ULAW 1

UsageEnvironment* env;
void play(); // forward

pthread_t ThreadID;
TaskScheduler* scheduler;
WAVSource* wavSource;
unsigned char audioFormat;
unsigned char bitsPerSample;
unsigned samplingFrequency;
unsigned char numChannels;
unsigned bitsPerSecond;
char const* mimeType = "PCMU";
unsigned char payloadFormatCode;
uLawFromPCMAudioSource* uLawsource;

struct sessionState_t {
  FramedSource* source;
  RTPSink* sink;
  RTCPInstance* rtcpInstance;
  Groupsock* rtpGroupsock;
  Groupsock* rtcpGroupsock;
  RTSPServer* rtspServer;
} sessionState;

void triggerLive555Scheduler(void) {
  scheduler->triggerEvent(WAVSource::s_frameReceivedTrigger, sessionState.source);
}

/** Thread function to read the PCM samples from the device & trigger the live555 thread. **/
void* StartStreaming(void* p) {
  static bool s_bFirstInstance = true;

  // Start the streaming:
  *env << "Beginning streaming...\n";
  //play();
  s_bFirstInstance = false;

  bool m_bStreamingEnable = true;
  while (true == m_bStreamingEnable) {
    /* get the encoded frames from the camera */
    if (true == wavSource->doReadFromDriver()) {
      /* if capturing a frame succeeded, trigger the live555 thread to initiate the transfer */
      triggerLive555Scheduler();
    } else {
      printf("Source Capture frame failed. Go back to Wait state\r\n");
      /* Disable streaming on error conditions */
      m_bStreamingEnable = false;
      /* Signal the main routine to re-initiate the session */
      break;
    }
    usleep(3);
  }
  return NULL; // required: this is a pthread start routine returning void*
}

int main(int argc, char** argv) {
  // Begin by setting up our usage environment:
  scheduler = BasicTaskScheduler::createNew();
  env = BasicUsageEnvironment::createNew(*scheduler);

  // Get attributes of the audio source:
  audioFormat = WAVSource::getAudioFormat();
  printf("audioFormat:%d", audioFormat);
  bitsPerSample = WAVSource::bitsPerSample();
  samplingFrequency = WAVSource::samplingFrequency();
  numChannels = WAVSource::numChannels();
  bitsPerSecond = samplingFrequency*bitsPerSample*numChannels/2;

  *env << "Audio source parameters:\n\t" << samplingFrequency << " Hz, ";
  *env << bitsPerSample << " bits-per-sample, ";
  *env << numChannels << " channels => ";
  *env << bitsPerSecond << " bits-per-second\n";

  payloadFormatCode = 96; // by default, unless a static RTP payload type can be used

  // Create (and start) a 'RTCP instance' for this RTP sink:
  const unsigned estimatedSessionBandwidth = (bitsPerSecond + 500)/1000; // in kbps; for RTCP b/w share
  const unsigned maxCNAMElen = 100;
  unsigned char CNAME[maxCNAMElen+1];
  gethostname((char*)CNAME, maxCNAMElen);
  CNAME[maxCNAMElen] = '\0'; // just in case

  struct in_addr destinationAddress;
  destinationAddress.s_addr
Re: [Live-devel] FramedSource::getNextFrame error while streaming PCM frames
Hi Ross,

Thanks. Now the afterGettingFrame() function is getting called.

Regards,

From: Ross Finlayson
Sent: Friday, October 25, 2013 3:31 AM
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-devel] FramedSource::getNextFrame error while streaming PCM frames

I think your problem is here:

void triggerLive555Scheduler(void) {
  scheduler->triggerEvent(WAVSource::s_frameReceivedTrigger, sessionState.source);
}

The problem with this is the second parameter to "triggerEvent()". It needs to be a pointer to a "WAVSource" object. If you are streaming raw PCM audio, and inserting a "uLawFromPCMAudioSource" filter object in front of it, then "sessionState.source" will point to that filter object, which is the wrong thing to be passing to "triggerEvent()".

So, you should change the second parameter to be "wavSource".

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
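In terms of the WaveStreamer.cpp code posted earlier in this thread, the change Ross describes is simply:

// Pass the WAVSource (the device source at the head of the chain) to triggerEvent(),
// not sessionState.source, which points at the uLawFromPCMAudioSource filter:
void triggerLive555Scheduler(void) {
  scheduler->triggerEvent(WAVSource::s_frameReceivedTrigger, wavSource);
}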
[Live-devel] Audio/Video sync - Video stops in between
Hi Ross,

I have a problem with audio/video sync. Video comes from a camera and is encoded to H.264, and audio comes directly from a microphone (PCM 16-bit LE, sampling frequency 8 kHz, mono (1 channel)). We are able to get audio/video sync with minimal delay. The problem is that sometimes the video stops for a while (for some 500 ms; I guess in order to stay in sync), which I need to avoid. How can I avoid that?

Regards,
Krishna
Re: [Live-devel] Audio/Video sync - Video stops in between
Hi Ross,

I confirmed that the video stopping is not caused by frame loss; I checked the statistics in VLC. Could it be a bandwidth problem?

The estimated bandwidth for audio is 64 kbps (derived from (bitsPerSecond + 500)/1000), and the estimated bandwidth for video is 500 kbps.

Thanks in advance,
Krishna
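For reference, those estimates are the "totSessionBW" figures passed to RTCPInstance::createNew() in the LIVE555 test programs (the WaveStreamer.cpp code above computes estimatedSessionBandwidth the same way); they size the RTCP report share rather than capping the media bitrate. A sketch using the numbers quoted above; the groupsock, CNAME and sink variables are assumed from the poster's own setup code:

// In the session setup, after the groupsocks, CNAME and sinks have been created:
void createRTCPInstances(UsageEnvironment& env) {
  const unsigned estimatedAudioBandwidthKbps = 64;  // (64000 + 500) / 1000, as above
  const unsigned estimatedVideoBandwidthKbps = 500; // as above
  RTCPInstance::createNew(env, rtcpGroupsockAudio, estimatedAudioBandwidthKbps,
                          CNAME, audioSink, NULL /*we're a server, not a client*/);
  RTCPInstance::createNew(env, rtcpGroupsockVideo, estimatedVideoBandwidthKbps,
                          CNAME, videoSink, NULL /*we're a server, not a client*/);
}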
[Live-devel] Vivotek digest authentication
Hi,

Live555 version 2012.02.29 is not compatible with Vivotek camera digest authentication. This is what a Vivotek IP8161 camera sends when it cannot authorize the client:

- Rtsp: RESPONSE, RTSP/1.0, Status Code = 401 - Unauthorized
  - Response:
    Status of response: Unauthorized
    ProtocolVersion: RTSP/1.0
    StatusCode: 401, Unauthorized
    Reason: Unauthorized
    CSeq: 3
    WWW-Authenticate: Digest qop="auth",realm="streaming_server",nonce="bfd4e04b78959b55aeb1167adfabcec5"
    HeaderEnd: CRLF

The WWW-Authenticate response seems to be different from what Axis or Sony cameras send.

Thanks,
Krishna.
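The visible difference is the qop="auth" directive in the challenge. Under RFC 2617 that directive changes how the client must compute the digest response, which is a plausible reason a client that only implements the qop-less form fails against this camera. A minimal illustration of the two computations follows; md5Hex is a hypothetical helper (returning the lowercase hex MD5 of its argument), not a LIVE555 API, and the nc/cnonce values are placeholders:

#include <string>
std::string md5Hex(std::string const& s); // assumed helper, defined elsewhere

std::string digestResponse(std::string const& user, std::string const& realm,
                           std::string const& password, std::string const& method,
                           std::string const& uri, std::string const& nonce,
                           bool challengeHasQopAuth) {
  std::string const ha1 = md5Hex(user + ":" + realm + ":" + password);
  std::string const ha2 = md5Hex(method + ":" + uri);
  if (!challengeHasQopAuth) {
    // Classic computation, used when the WWW-Authenticate header has no "qop":
    return md5Hex(ha1 + ":" + nonce + ":" + ha2);
  }
  // RFC 2617 computation required when the challenge contains qop="auth":
  std::string const nc = "00000001";     // nonce count (placeholder)
  std::string const cnonce = "0a4f113b"; // client-generated nonce (placeholder)
  return md5Hex(ha1 + ":" + nonce + ":" + nc + ":" + cnonce + ":auth:" + ha2);
}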
[Live-devel] 2013.04.21 crash bug is not fixed
Hi,

The crash bug introduced in version 2013.04.21 is not fixed in 2013.04.29. Occasionally, at session shutdown, a SocketDescriptor object still gets accessed after it has been deleted. Version 2013.04.16 does not have this problem. Here's some info:

Exception info (EXCEPTION_RECORD):
  ExceptionAddress: 101041db (RTSPRec!SocketDescriptor::tcpReadHandler+0x008b)
  ExceptionCode: c0000005 (Access violation)
  ExceptionFlags:
  NumberParameters: 2
  Parameter[0]:
  Parameter[1]:
  Attempt to read from address

Crash stack (ChildEBP RetAddr Args to Child):
  03a3f65c 1012525e 0478fc10 0002 88e18a61 RTSPRec!SocketDescriptor::tcpReadHandler+0x8b
  03a3fa08 1012759b 03a3fa28 047838f8 RTSPRec!BasicTaskScheduler::SingleStep+0x72e
  03a3fa1c 100ac67e 04783729 03a3fbe8 03a3fb0c RTSPRec!BasicTaskScheduler0::doEventLoop+0x3b
  03a3fb04 100a82f8 88e18b91 03a3fcf4 0050ca04 RTSPRec!CRTSPHandler::ShutdownSession+0xae
  03a3fbf8 100a1c6d 03a3fdf8 0050ca04 0050ca68 RTSPRec!CRTSPHandler::Uninit+0xa8
  03a3fcf4 1009ffcf 2000 0050ca04 0050ca68 RTSPRec!CRTSPReceiver::UninitRTSP+0x3d
  03a3fdf8 672724fa 04782ff0 0260 0050ce44 RTSPRec!CRTSPReceiver::Stop+0x6f

SocketDescriptor::tcpReadHandler local members:
  0:022:x86> dv
    socketDescriptor = 0x0478fc10
    mask = 0n2
    count = 0x7d0

SocketDescriptor class members:
  0:022:x86> ?? socketDescriptor
  class SocketDescriptor * 0x0478fc10
    +0x000 __VFN_table : 0x
    +0x004 fEnv : 0x UsageEnvironment
    +0x008 fOurSocketNum : 0n-572662307
    +0x00c fSubChannelHashTable : 0x HashTable
    +0x010 fServerRequestAlternativeByteHandler : 0x void +
    +0x014 fServerRequestAlternativeByteHandlerClientData : 0x Void
    +0x018 fStreamChannelId : 0xdd ''
    +0x019 fSizeByte1 : 0xdd ''
    +0x01a fReadErrorOccurred : ffdd
    +0x01b fDeleteNext : ffdd
    +0x01c fTCPReadingState : 0x (No matching name)

socketDescriptor is toast:

void SocketDescriptor::tcpReadHandler(SocketDescriptor* socketDescriptor, int mask) {
  // Call the read handler until it returns false, with a limit to avoid starving other sockets
  unsigned count = 2000;
  while (!socketDescriptor->fDeleteNext && socketDescriptor->tcpReadHandler1(mask) && --count > 0) {}
  if (socketDescriptor->fDeleteNext) delete socketDescriptor; // <- Crash here
}

Krishna.
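The failure mode described here (the loop touching socketDescriptor after a nested call has destroyed it) is a classic re-entrancy hazard. Purely as an illustration of the general "deletion flag on the caller's stack" guard for this class of bug, and not as the actual LIVE555 code or the fix that eventually shipped, a self-contained sketch:

#include <cstdio>

class Handler {
public:
  Handler() : fDeleteNext(false), fDeletedFlag(nullptr) {}
  ~Handler() { if (fDeletedFlag) *fDeletedFlag = true; } // tell any active caller we're gone

  // Stand-in for tcpReadHandler1(): returns false when there is nothing more to read,
  // and may (directly or indirectly) cause this object to be deleted.
  bool readOnce() { fDeleteNext = true; return false; }

  static void readLoop(Handler* h) {
    bool deleted = false;
    h->fDeletedFlag = &deleted;          // arm the guard before re-entering h
    unsigned count = 2000;
    while (!deleted && h->readOnce() && --count > 0) {}
    if (deleted) return;                 // h was destroyed underneath us; do not touch it
    h->fDeletedFlag = nullptr;
    if (h->fDeleteNext) delete h;        // safe: we know h still exists here
  }

  bool fDeleteNext;
  bool* fDeletedFlag;
};

int main() { Handler::readLoop(new Handler); std::printf("done\n"); return 0; }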
Re: [Live-devel] 2013.04.21 crash bug is not fixed
It's fixed. Thanks!

From: finlay...@live555.com
Date: Tue, 30 Apr 2013 15:18:36 -0700
To: live-de...@ns.live555.com
Subject: Re: [Live-devel] 2013.04.21 crash bug is not fixed

Crash bug introduced in version 2013.04.21 is not fixed in 2013.04.29.

Thanks for the report. Please download a new version, 2013.04.30, which, I hope, should fix this for real.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
Re: [Live-devel] H264 dropped P frames
I have no problems receiving 150+ KB key frames over UDP with LIVE555. What I do is increase the socket receive buffer and make sure I don't block CUMediaSink::continuePlaying() for too long. In fact, all my implementation of CUMediaSink::continuePlaying() does is move the received sample into my private queue; all sample processing is done on a separate thread.

Krishna.

From: jsha...@smartwire.com
To: live-de...@ns.live555.com
Date: Thu, 2 May 2013 21:37:30 +0000
Subject: Re: [Live-devel] H264 dropped P frames

Thanks. I do understand how the event model works (quite elegant, BTW). The fact that it throws away a complete NAL unit if one of its fragments is lost explains why it appears to be dropping NALs. Here is the only part I cannot figure out: why does it not lose any packets if I use openRTSP on the command line, and why does VLC have no problem with the live video? It is almost as if the sender is not following standard flow control and only a simple flat-out copy will grab it fast enough. This is not high resolution or large NALs: 21 KB for a key frame, 1-2 KB for the P-frames, 30 fps, with a GOP size that varies but averages 38. My class is almost identical to the playCommon.cpp + openRTSP.cpp code. I have a memory sink, a filter, and an async file-writing queue.

From: live-devel-boun...@ns.live555.com [mailto:live-devel-boun...@ns.live555.com] On Behalf Of Ross Finlayson
Sent: Thursday, May 02, 2013 4:04 PM
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-devel] H264 dropped P frames

Could it be that the packets are incorrect and live555 cannot find the beginning and end, so it doesn't clear out the buffer? Then it hits the max, dumps it and starts over.

No, but note that - as with all payload formats - if one frame (in this case, one H.264 NAL unit) is fragmented over multiple RTP packets, then if any one of these RTP packets gets lost, the entire NAL unit will get discarded. That's why - yet again - H.264 encoders should not create excessively large NAL units.

In any case, everyone (and especially you :-) needs to read and understand this sentence that's at the end of the FAQ entry:

** It's important to understand that because a LIVE555 Streaming Media application runs as a single thread (never writing to, or reading from, sockets concurrently), if packet loss occurs, then it MUST be happening either (i) on the network, or (ii) in the operating system of the sender or receiver. There's nothing in our code that can be 'losing' packets. **

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
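A sketch of the approach described at the top of this message, for a receiver derived from the openRTSP/playCommon code. It is illustrative only: CUMediaSink stands in for your own MediaSink subclass, the fQueue/Sample/fReceiveBuffer members and the 2 MB buffer size are assumptions, and the queue is expected to be drained by a separate application thread.

#include "liveMedia.hh"
#include "GroupsockHelper.hh" // increaseReceiveBufferTo()

// 1) Enlarge the OS receive buffer on each subsession's RTP socket after SETUP:
void enlargeReceiveBuffer(UsageEnvironment& env, MediaSubsession* subsession) {
  if (subsession->rtpSource() != NULL) {
    increaseReceiveBufferTo(env, subsession->rtpSource()->RTPgs()->socketNum(),
                            2000000 /*bytes; pick a size that suits your bitrate*/);
  }
}

// 2) Keep the per-frame callback cheap: copy the frame into a private queue and return,
//    so the single LIVE555 thread gets back to reading sockets as soon as possible.
void CUMediaSink::afterGettingFrame(unsigned frameSize, unsigned /*numTruncatedBytes*/,
                                    struct timeval presentationTime,
                                    unsigned /*durationInMicroseconds*/) {
  fQueue.push(Sample(fReceiveBuffer, frameSize, presentationTime)); // assumed thread-safe queue
  continuePlaying(); // immediately ask the upstream source for the next frame
}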
[Live-devel] Bad "Range:" header error introduced
Hi,

I switched from Live555 version 2013.09.08 to 2014.02.04, and a PLAY command sent to an Axis 213 camera now results in a "Bad "Range:" header" error returned by Live555. The camera returns "Range: npt=now-", which seems to get rejected. The camera is on-line and can be accessed via HTTP tunneling: rtsp://128.197.178.101/mpeg4/media.amp. Version 2013.09.08 worked just fine.

Thanks,
Krishna.
Re: [Live-devel] Bad "Range:" header error introduced
bnB0PTAuMDAwLQ0KDQo=
Received a complete PLAY response:
RTSP/1.0 200 OK
CSeq: 5
Session: 1767047643
Range: npt=now-
RTP-Info: url=trackID=1;seq=34609;rtptime=3603019675

Failed to start playing session: Bad "Range:" header

Sending request: TEARDOWN rtsp://128.197.178.101:554/mpeg4/media.amp/ RTSP/1.0
CSeq: 6
User-Agent: openRTSP (LIVE555 Streaming Media v2014.02.04)
Session: 1767047643

The request was base-64 encoded to: VEVBUkRPV04gcnRzcDovLzEyOC4xOTcuMTc4LjEwMTo1NTQvbXBlZzQvbWVkaWEuYW1wLyBSVFNQLzEuMA0KQ1NlcTogNg0KVXNlci1BZ2VudDogb3BlblJUU1AgKExJVkU1NTUgU3RyZWFtaW5nIE1lZGlhIHYyMDE0LjAyLjA0KQ0KU2Vzc2lvbjogMTc2NzA0NzY0Mw0KDQo=

Krishna.

From: finlay...@live555.com
Date: Sat, 8 Feb 2014 10:39:34 +1300
To: live-de...@ns.live555.com
Subject: Re: [Live-devel] Bad "Range:" header error introduced

That's odd. I'm not seeing this at all. Running "openRTSP -T 80" (to specify RTSP-over-HTTP tunneling) on this URL works just fine:

%openRTSP -T 80 rtsp://128.197.178.101/mpeg4/media.amp
Opening connection to 128.197.178.101, port 80...
...remote connection opened
Requesting RTSP-over-HTTP tunneling (on port 80)

Sending request: GET /mpeg4/media.amp HTTP/1.1
CSeq: 1
User-Agent: ./openRTSP (LIVE555 Streaming Media v2014.02.07)
Host: 128.197.178.101
x-sessioncookie: 11828aef671cfcf975c137d
Accept: application/x-rtsp-tunnelled
Pragma: no-cache
Cache-Control: no-cache

Received 63 new bytes of response data.
Received a complete GET response:
HTTP/1.0 200 OK
Content-Type: application/x-rtsp-tunnelled

Opening connection to 128.197.178.101, port 80...
...remote connection opened

Sending request: POST /mpeg4/media.amp HTTP/1.1
CSeq: 1
User-Agent: ./openRTSP (LIVE555 Streaming Media v2014.02.07)
Host: 128.197.178.101
x-sessioncookie: 11828aef671cfcf975c137d
Content-Type: application/x-rtsp-tunnelled
Pragma: no-cache
Cache-Control: no-cache
Content-Length: 32767
Expires: Sun, 9 Jan 1972 00:00:00 GMT

Sending request: OPTIONS rtsp://128.197.178.101/mpeg4/media.amp RTSP/1.0
CSeq: 2
User-Agent: ./openRTSP (LIVE555 Streaming Media v2014.02.07)

The request was base-64 encoded to: T1BUSU9OUyBydHNwOi8vMTI4LjE5Ny4xNzguMTAxL21wZWc0L21lZGlhLmFtcCBSVFNQLzEuMA0KQ1NlcTogMg0KVXNlci1BZ2VudDogLi9vcGVuUlRTUCAoTElWRTU1NSBTdHJlYW1pbmcgTWVkaWEgdjIwMTQuMDIuMDcpDQoNCg==

Received 91 new bytes of response data.
Received a complete OPTIONS response:
RTSP/1.0 200 OK
CSeq: 2
Public: DESCRIBE, GET_PARAMETER, PAUSE, PLAY, SETUP, TEARDOWN

Sending request: DESCRIBE rtsp://128.197.178.101/mpeg4/media.amp RTSP/1.0
CSeq: 3
User-Agent: ./openRTSP (LIVE555 Streaming Media v2014.02.07)
Accept: application/sdp

The request was base-64 encoded to: REVTQ1JJQkUgcnRzcDovLzEyOC4xOTcuMTc4LjEwMS9tcGVnNC9tZWRpYS5hbXAgUlRTUC8xLjANCkNTZXE6IDMNClVzZXItQWdlbnQ6IC4vb3BlblJUU1AgKExJVkU1NTUgU3RyZWFtaW5nIE1lZGlhIHYyMDE0LjAyLjA3KQ0KQWNjZXB0OiBhcHBsaWNhdGlvbi9zZHANCg0K

Received 823 new bytes of response data.
Received a complete DESCRIBE response:
RTSP/1.0 200 OK
CSeq: 3
Content-Base: rtsp://128.197.178.101:554/mpeg4/media.amp/
Content-Type: application/sdp
Content-Length: 684

v=0
o=- 1391790981113890 1391790981113897 IN IP4 128.197.178.101
s=Media Presentation
e=NONE
c=IN IP4 0.0.0.0
b=AS:8000
t=0 0
a=control:*
a=range:npt=now-
a=mpeg4-iod: "data:application/mpeg4-iod;base64,AoDUAE8BAf/1AQOAbwABQFBkYXRhOmFwcGxpY2F0aW9uL21wZWc0LW9kLWF1O2Jhc2U2NCxBUjBCR3dVZkF4Y0F5U1FBWlFRTklCRUVrK0FBZWhJQUFIb1NBQVlCQkE9PQQNAQUABAYJAQAAAzoAAkA2ZGF0YTphcHBsaWNhdGlvbi9tcGVnNC1iaWZzLWF1O2Jhc2U2NCx3QkFTWVFTSVVFVUZQd0E9BBICDQAAAgAABQMAAEAGCQEAAA=="
m=video 0 RTP/AVP 96
b=AS:8000
a=control:trackID=1
a=rtpmap:96 MP4V-ES/90000
a=fmtp:96 profile-level-id=245; config=01B0F501B50901000120008C4019285820F0A21F;
a=mpeg4-esid:201

Opened URL "rtsp://128.197.178.101/mpeg4/media.amp", returning a SDP description:
v=0
o=- 1391790981113890 1391790981113897 IN IP4 128.197.178.101
s=Media Presentation
e=NONE
c=IN IP4 0.0.0.0
b=AS:8000
t=0 0
a=control:*
a=range:npt=now-
a=mpeg4-iod: "data:application/mpeg4-iod;base64,AoDUAE8BAf/1AQOAbwABQFBkYXRhOmFwcGxpY2F0aW9uL21wZWc0LW9kLWF1O2Jhc2U2NCxBUjBCR3dVZkF4Y0F5U1FBWlFRTklCRUVrK0FBZWhJQUFIb1NBQVlCQkE9PQQNAQUABAYJAQAAAzoAAkA2ZGF0YTphcHBsaWNhdGlvbi9tcGVnNC1iaWZzLWF1O2Jhc2U2NCx3QkFTWVFTSVVFVUZQd0E9BBICDQAAAgAABQMAAEAGCQEAAA=="
m=video 0 RTP/AVP 96
b=AS:8000
a=control:trackID=1
a=rtpmap:96 MP4V-ES/90000
a=fmtp:96 profile-level-id=245; config=01B0F501B50901000120008C4019285820F0A21F;
a=mpeg4-esid:201
[Live-devel] Wowza Streaming Engine compatibility problem
Hi,

The Live555 client drops the connection after 90 seconds of receiving an RTSP stream from Wowza Streaming Engine 4.0.6 over the HTTP tunneling protocol. Socket operations fail with a WSAECONNABORTED error ("An established connection was aborted by the software in your host machine"). Receiving the stream over TCP works just fine. This behavior can be seen with openRTSP v2014.10.07:

openRTSP -T 80 rtsp://q1604.dnsdojo.com:1935/live/sys1.stream

Thanks,
Krishna.
[Live-devel] WebRTC
Dear Sir/Madam,

I would like to develop an application based on Live555 and WebRTC fundamentals. Please share the required material.

Regards,
Bala Krishna
Software Developer, Cubical Laboratories
New Delhi, India
[Live-devel] Help in live555 codebase execution
Hi,

I am trying to create a setup for remote streaming. My scenario is as below:

Device1 (Camera) ---> NAT (Router) ---> Internet ---> Device2

Device2 wants to initiate a streaming request to Device1 using the openRTSP client. This is not directly possible, since Device1 is in a private network. I googled it and found that it should be possible using the -R option of openRTSP together with a proxy-server setup. I found this link: http://lists.live555.com/pipermail/live-devel/2013-June/017112.html, and I am trying to use the registerRTSPStream / live555ProxyServer / openRTSP commands, but I have not been able to get this setup working. Can you please explain what commands I need to run to make this setup work?

Device1 -> ? (What program or command do I need to use?)
Device2 -> ? (What program or command do I need to use?)
If I need to use the proxy server, what is the command I need to use?

Thanks in advance.

Regards,
Krishna
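For readers following this thread, the general shape of the setup described in the linked post is roughly as follows. These command lines are illustrative, not tested against this exact scenario; the exact argument order for registerRTSPStream should be checked against its own usage message, and all addresses, ports and stream names below are placeholders.

On a publicly reachable host (this could be Device2 itself, if it has a public address), run a program that waits for an incoming RTSP "REGISTER" request, e.g.:

  live555ProxyServer -R
  (or, to receive directly: openRTSP -R <port-number>)

On Device1, behind the NAT, run its RTSP server as usual, and then ask it to register its stream with that public host, so that the TCP connection is initiated from inside the private network:

  registerRTSPStream <public-host-address> <public-host-port> rtsp://<device1-lan-address>:8554/<streamName>

Device2 then plays the resulting stream from the proxy (or, with openRTSP -R, receives it directly once the REGISTER request arrives).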
[Live-devel] Video Stream Rendering - Live555
Hi All,

I am currently testing the live555 code base for its streaming functionality. I have tried the live555MediaServer and openRTSP client executables, and I am able to receive the stream successfully; currently it is stored in a file (e.g. audio-MPA-1).

Is there any support in the live555 code base for rendering the video, or do we need to write our own code for rendering? Please give a suggestion.

Regards,
Krishna
[Live-devel] mp4 streaming in live555
Hi Ross,

I am able to stream mp3 files using live555MediaServer. Now I want to stream an mp4 file, so I tried the testMPEG4VideoStreamer executable, modifying inputFileName as below:

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"
#include "GroupsockHelper.hh"

UsageEnvironment* env;
char const* inputFileName = "myfilename.mp4";

On the client side, I am not able to receive the stream. Please let us know the right procedure to stream mp4 files.

Regards,
Krishna