Re: [Live-devel] streaming a big H264 I-frame out, image broken

2017-10-17 Thread Eric_Hsieh
Hi Ross, Yes, you are right: setting it to 0, the delay is gone. Thanks. One more question: when streaming at a high bitrate (4M or 8M), ffplay shows the following. Do you have any suggestions for the server side? Thanks a lot. [rtsp @ 0x7f8c15022200] max delay reached. need to consume packet [rtsp

[Live-devel] streaming a big H264 I-frame out, image broken

2017-10-13 Thread Eric_Hsieh
Hi Ross, We are working on a camera. It is a pure Ethernet camera running 1080p@30fps with a max bitrate of 8 Mbit/s or higher. Now we are facing a broken-image problem when the I-frame size exceeds 100,000 bytes. I found this: http://live-devel.live.narkive.com/LBpgaBLU/errors-when-streaming-hd-h-264-usi
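The usual remedy discussed on this list for truncated large frames (not confirmed in this truncated message, but the standard answer in the linked thread) is to raise live555's output-packet buffer limit before any RTPSink is created. A configuration sketch, assuming a standard live555 build where `OutPacketBuffer::maxSize` is the static member declared in `MediaSink.hh`:

```cpp
#include "liveMedia.hh"  // live555 headers (assumption: standard live555 build)

int main(int argc, char** argv) {
  // The default maxSize is far smaller than a 100,000-byte I-frame; any frame
  // larger than this limit is truncated, which the client renders as a broken
  // image. Set it before creating the TaskScheduler/RTSPServer/sinks, e.g.
  // near the top of main() in testOnDemandRTSPServer.
  OutPacketBuffer::maxSize = 300000;  // bytes; must exceed the largest expected frame
  // ... set up TaskScheduler, UsageEnvironment, RTSPServer as in the demo ...
  return 0;
}
```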

[Live-devel] May we have an opportunity to develop WebRTC with live555?

2017-05-25 Thread Eric_Hsieh
Hi Ross, Recently we got a powerful camera and we would like to put WebRTC into it. But after searching online, it seems live555 has not released WebRTC support officially. So, may we have the WebRTC source code and put it into our camera? Thanks a lot.

[Live-devel] How to respond with a custom RTSP header and limit the client number?

2017-04-23 Thread Eric_Hsieh
Hi Ross, We are working on a camera with little memory and a weak CPU. We would like to 1) limit the number of clients (do you have any suggestion about this?); 2) respond with a special custom RTSP header if the system CPU usage is very high, for example: c->s SETUP rtsp://127.0.0.1/profile1/trac

Re: [Live-devel] How to implement uploading multiple audio streams to an RTSP server?

2017-04-13 Thread Eric_Hsieh
Hi Ross, Thanks for your reply. After struggling for the past few days, I think we found the key point: the server side always uses the same port to handle RTP/RTCP packets. Transport: RTP/AVP;unicast;destination=127.0.0.1;source=127.0.0.1;client_port=46774-46775;server_port=6970-6971 Server alw

[Live-devel] How to limit the number of RTP packets received from MultiFramedRTPSource?

2017-04-12 Thread Eric_Hsieh
Hi Ross, We are working on a camera with a small amount of RAM, and sometimes its performance is not good. When a lot of data is sent to the camera and handled by MultiFramedRTPSource, we found that ReorderingPacketBuffer keeps calling createNewPacket to allocate memory and queue data. We want to limit createN
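One knob that bounds how long the ReorderingPacketBuffer holds packets while waiting for missing ones is the reordering threshold. A sketch, assuming a standard live555 build where `setPacketReorderingThresholdTime()` is a public member of `MultiFramedRTPSource`; note it shortens how long packets queue up, but it is not a hard cap on `createNewPacket` allocations:

```cpp
#include "liveMedia.hh"  // live555 headers (assumption: standard live555 build)

// Mitigation sketch for a RAM-constrained receiver: give up on missing
// packets sooner so queued packets are released earlier.
void tuneSourceForLowMemory(MultiFramedRTPSource* src) {
  src->setPacketReorderingThresholdTime(20000);  // 20 ms, instead of the 100 ms default
}
```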

Re: [Live-devel] How to implement uploading multiple audio streams to an RTSP server?

2017-04-06 Thread Eric_Hsieh
Hi Ross, We would like to ask a question about scheduling. Thanks. We use the following flow to receive data from a client: MultiFramedRTPSource —> WAVFileSink. If we run one client uploading an audio stream, we receive correct data. If we run two clients uploading audio streams, the first one will be blocking.

Re: [Live-devel] How to drop a client quickly when the client disconnects

2017-04-06 Thread Eric_Hsieh
Hi Ross, Got it. Thanks for your help. > Ross Finlayson wrote on 2017-04-06 at 16:42: > >> Using testOnDemandRTSPServer as rtsp server. >> Sometimes, client lose connection with server because of bad wifi. >> >> Now, we know server have a timeout 65 sec to keep this connection alive. >> But, we want to d

[Live-devel] How to drop a client quickly when the client disconnects

2017-04-06 Thread Eric_Hsieh
Hi Ross, Sorry to interrupt you again; I have a question that needs your hint. Thanks. We are working on a WiFi camera in a bad-WiFi-signal environment, using testOnDemandRTSPServer as the RTSP server. Sometimes a client loses its connection to the server because of bad WiFi. Now, we know the server has a timeout of 65 sec to
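The 65-second figure above is live555's session-reclamation timeout, and it can be lowered when the server is created. A sketch, assuming a recent live555 where the fourth argument of `RTSPServer::createNew` is that timeout (named `reclamationSeconds`, or `reclamationTestSeconds` in older versions):

```cpp
#include "liveMedia.hh"  // live555 headers (assumption: standard live555 build)

RTSPServer* createServerWithFastReclamation(UsageEnvironment& env) {
  // A dead client (e.g. lost WiFi) stops sending RTCP "RR" reports; after this
  // many seconds without liveness, the server tears the session down.
  return RTSPServer::createNew(env, 8554, NULL, 10 /* seconds, vs. 65 default */);
}
```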

Re: [Live-devel] How to implement uploading multiple audio streams to an RTSP server?

2017-04-05 Thread Eric_Hsieh
Hi Ross, Thanks for your reply. I will try to describe it more clearly. First, our system is a camera; before adding the upload-audio feature, we used live555 as our RTSP server. When a client wants to stream via RTSP, we create a video/audio source to capture data and send it out. The server is a normal r

[Live-devel] How to implement uploading multiple audio streams to an RTSP server?

2017-04-05 Thread Eric_Hsieh
Hi Ross, We are working on uploading audio streams to an RTSP server. The server receives the audio data via MultiFramedRTPSource and puts it into our WAVSpeakerSink. We combined WAVAudioFileSource and WAVSpeakerSink to do some tests, and it works well. (We referred to the sample code, MPEG2TransportStreamIndexe

Re: [Live-devel] How to update SDPLines info?

2016-04-26 Thread Eric_Hsieh
Hi Ross, Yes, it works. Thanks a lot. > Ross Finlayson wrote on Apr 26, 2016 at 15:49: > >> Run rtsp server based on OnDemandServerMediaSubsession class. >> We found the server will always return the same SDP into back to the rtsp >> client, even we update the NEW SDP info when call createNewRTPSink.

[Live-devel] How to update SDPLines info?

2016-04-26 Thread Eric_Hsieh
Hi Ross, We run an RTSP server based on the OnDemandServerMediaSubsession class. We found that the server always returns the same SDP to the RTSP client, even though we update the SDP info when createNewRTPSink is called. So my question is: how do we make the live555 library update the SDP info? Thanks a lot.

Re: [Live-devel] Why does VLC show a fixed fps when using H264VideoStreamDiscreteFramer to send H264 data?

2016-02-02 Thread Eric_Hsieh
Hi Ross, Thanks for your reply. Yes, you gave me exactly what I wanted: 1> H264or5VideoStreamFramer's fFrameRate is not the key point. 2> Make sure the rate from the encoder is correct. 3> Make sure the NAL units have the correct rate setting. Thanks again. regards, eric, 02/02 > On Feb 2, 2016, at 13:03, Ross Finlays

[Live-devel] Why does VLC show a fixed fps when using H264VideoStreamDiscreteFramer to send H264 data?

2016-02-02 Thread Eric_Hsieh
Dear Ross, I have a question about using H264VideoStreamDiscreteFramer to stream H264 data. Reading the code, H264or5VideoStreamFramer has an fFrameRate, and I am sure it is the correct fps (30). But when I use VLC to play the stream, VLC always shows a fixed fps(

Re: [Live-devel] Live h264 streaming in Android

2016-01-04 Thread Eric_Hsieh
Hi Danielya, What you are facing now is exactly what I faced before. Try checking your video and audio timestamps; it will help you out. Thanks a lot. regards, eric, 01/05 On Jan 3, 2016, at 19:53, Daniel Yacouboff <danie...@essence-grp.com> wrote: Hello there, I’ve sub-classed some of the

Re: [Live-devel] Does live555 support SRTP or RTP SAVP/SAVPF?

2015-12-21 Thread Eric_Hsieh
Hi Ross, Thanks for your quick reply; I will discuss it with my boss. regards, eric, 12/21 > On Dec 21, 2015, at 16:51, Ross Finlayson wrote: > >> It seems live555 does not support SRTP or RTP SAVP/SAVPF, right? > > Not yet. However, there is partial support for SRTP in our experimental > RTSP

[Live-devel] Does live555 support SRTP or RTP SAVP/SAVPF?

2015-12-20 Thread Eric_Hsieh
Hi Ross, I checked around on the Internet, and it seems live555 does not support SRTP or RTP SAVP/SAVPF, right? If we want live555 to support it, how should we proceed? Please advise. Thanks a lot. regards, eric, 12/21

[Live-devel] Re: H264 + ulaw streaming with audio vibration

2015-12-18 Thread Eric_Hsieh
, with the tolerance on that rate being 50 parts per million (ppm) -> should we resample the audio source from 16 KHz to 8 KHz, then encode the source as ulaw data? Thanks a lot. regards, eric, 12/18 From: Eric_Hsieh 謝國達 (HQ) Sent: December 17, 2015, 8:13 PM To: LIVE555 Stream
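Whether resampling is the right fix is not confirmed in this truncated thread, but the 16 kHz -> 8 kHz step being asked about can be sketched. A naive 2:1 decimator that averages each pair of 16 kHz PCM samples to produce 8 kHz PCM ready for µ-law encoding (an illustration only; a real resampler should low-pass filter first to avoid aliasing):

```cpp
#include <cstdint>
#include <vector>

// Naive 2:1 decimation: average each adjacent pair of 16 kHz samples to get
// one 8 kHz sample. Input is mono 16-bit PCM; the output can then be fed to
// a mu-law encoder.
std::vector<int16_t> downsample2to1(const std::vector<int16_t>& in16k) {
    std::vector<int16_t> out8k;
    out8k.reserve(in16k.size() / 2);
    for (std::size_t i = 0; i + 1 < in16k.size(); i += 2) {
        // Average the pair; the sum fits in int because operands are int16_t.
        out8k.push_back(static_cast<int16_t>((in16k[i] + in16k[i + 1]) / 2));
    }
    return out8k;
}
```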

[Live-devel] Re: H264 + ulaw streaming with audio vibration

2015-12-17 Thread Eric_Hsieh
Hi Ross, Thanks for your quick reply again. Yes, according to your suggestion: fDurationInMicroseconds = 256*1000000/16000 = 16000. That 16000 is the same as what I mentioned earlier, so I am very confused about the audio vibration. Thanks a lot. regards, eric, 12/17

Re: [Live-devel] H264 + ulaw streaming with audio vibration

2015-12-17 Thread Eric_Hsieh
Hi Ross, Thanks for your quick reply. Each time, we put 1024 bytes into the ulaw encoder and it produces 512 bytes of ulaw data. Then we copy the 512 bytes to fTo and set fFrameSize to 512. For fDurationInMicroseconds, we set 16000. I think it should be right, but I don’t know why 16K, stereo audio
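The arithmetic behind the fDurationInMicroseconds values discussed in this thread can be made explicit. Each µ-law byte encodes one sample, so bytes divided by channels gives sample frames, and frames divided by the sampling rate gives the duration (a sketch of the calculation, not live555 code):

```cpp
// Duration of one delivered chunk of mu-law audio, in microseconds.
// Each mu-law byte encodes one sample, so ulawBytes / numChannels = sample
// frames, and frames / samplingHz = seconds.
unsigned ulawDurationMicroseconds(unsigned ulawBytes, unsigned numChannels, unsigned samplingHz) {
    unsigned long long frames = ulawBytes / numChannels;
    return static_cast<unsigned>((frames * 1000000ULL) / samplingHz);
}
```

This reproduces both cases from the thread: 512 bytes of 8 kHz mono gives 64000 µs, and 512 bytes of 16 kHz stereo gives 16000 µs.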

Re: [Live-devel] H264 + ulaw streaming with audio vibration

2015-12-16 Thread Eric_Hsieh
Hi Ross, Thanks for your quick reply. We have an RTSP server on an embedded system; it streams out H264+ulaw. Now we face a problem: a VLC client connects to our RTSP server. If the audio is A. 8K, 16-bit, mono ulaw (encoded from PCM), we set fDurationInMicrosec

[Live-devel] H264 + ulaw streaming with audio vibration

2015-12-16 Thread Eric_Hsieh
Hi Ross, Now we face a weird situation with H264+ulaw streaming. Please listen to this: https://dl.dropboxusercontent.com/u/38773854/vibrating_voice.mp4 H264 with 8K, mono, 16-bit ulaw streaming is OK; we set fDurationInMicroseconds to 64000. But H264 with 16K, stereo, 16-bit ulaw streaming

[Live-devel] Re: Live video and audio streaming using one RTSPServer

2015-11-04 Thread Eric_Hsieh
Hi Daniel, First, you should ensure the time between audio and video is in sync. Then just put a struct timeval into fPresentationTime (it needs sec and usec). Second, ensure your audio duration time is OK; you could refer to the ADTS source code. I hope this helps. Thanks. regards, eric, 11/04

Re: [Live-devel] How to sync H264 and AAC timestamp from live streaming

2015-10-21 Thread Eric_Hsieh
Dear Ross, From your previous suggestions, if we want to deliver a live stream via live555, we should use H264VideoStreamDiscreteFramer and set fDurationInMicroseconds to 0. Now our RTSP server streaming H264+AAC works well with one client. If there is a second client connecting to our serv

Re: [Live-devel] [Spam Mail] How to sync H264 and AAC timestamp from live streaming

2015-10-14 Thread Eric_Hsieh
Dear Deanna, Now we can get the response from the RR report and drop P-frames according to it. But there are some issues with it. First of all, we should explain some things so you understand what we faced: 1. We stream H264/AAC at the same time and get those frames from a ring buffer (shared memory). 2. A

Re: [Live-devel] How to sync H264 and AAC timestamp from live streaming

2015-10-07 Thread Eric_Hsieh
Dear Deanna Earley, Maybe that is a good idea to meet what we need. Let me try it first. Thanks a lot. regards, eric, 10/07 On Oct 7, 2015, at 16:06, Deanna Earley <dee.ear...@icode.co.uk> wrote: From: live-devel [mailto:live-devel-boun...@ns.live555.com] On Behalf Of Ross Finlayson Sen

Re: [Live-devel] How to sync H264 and AAC timestamp from live streaming

2015-10-06 Thread Eric_Hsieh
Hi Ross, Thanks for the quick response. 1. We are very sure the fPresentationTime values are accurate. We guess it is a video-blocking issue: it happens with a heavy video bitrate on low upload bandwidth. So, does live555 support flow control? 2. Yesterday, we set up VGA@1fps MJPEG streaming. It

Re: [Live-devel] How to sync H264 and AAC timestamp from live streaming

2015-10-05 Thread Eric_Hsieh
Dear Ross, Thanks for your kind help. Now we use H264VideoStreamDiscreteFramer to do live streaming; it is better than before. And we are sure the timestamp is correct and synced with “wall clock time”. But we still have some questions and need your help. 1. Now, H264+AAC is working well. After

[Live-devel] How to sync H264 and AAC timestamp from live streaming

2015-10-02 Thread Eric_Hsieh
Dear Ross, I am new to live555 and am porting it to our platform. Now we are coding an RTSP server based on the testOnDemandRTSPServer sample code. We created 4 new classes to read H264 and AAC frames from our ring buffer instead of a file. Each time doGetNextFrame() is called, the class will deliver a “