Hi Ross,
Yes, you are right; after setting it to 0, the delay is gone. Thanks.
One more question: when streaming at a high bitrate (4M or 8M), ffplay shows the
following.
Do you have any suggestions for what we can do on the server side? Thanks a lot.
[rtsp @ 0x7f8c15022200] max delay reached. need to consume packet
[rtsp
Hi Ross,
We are working on a camera. It is a pure Ethernet camera streaming 1080p@30fps, and
its max bitrate is 8 Mbit/s or higher. Now, we are facing a broken-image problem
when the I-frame size exceeds 100,000 bytes. I found this,
http://live-devel.live.narkive.com/LBpgaBLU/errors-when-streaming-hd-h-264-usi
Hi Ross,
Recently, we got a powerful camera, and we would like to put WebRTC into that
camera.
But, after searching for information on the internet, it seems live555 has not
released WebRTC support officially.
So, may we have the WebRTC source code so we can put it into our camera?
Thanks a lot.
Hi Ross,
We are working on a camera with little memory and a weak CPU.
We would like to:
1) limit the number of clients; do you have any suggestions about this?
2) if the system CPU usage is very high, respond with a special custom RTSP
header.
for example:
c->s
SETUP rtsp://127.0.0.1/profile1/trac
Hi Ross
Thanks for your reply.
After struggling for the past few days, I think we found the key point.
The key point is that the server side always uses the same port to handle RTP/RTCP packets.
Transport:
RTP/AVP;unicast;destination=127.0.0.1;source=127.0.0.1;client_port=46774-46775;server_port=6970-6971
Server alw
Hi Ross
We are working on a camera with a small RAM size.
Sometimes, the performance of the camera is not good.
When we send a lot of data to the camera, it is handled by MultiFramedRTPSource.
We found that ReorderingPacketBuffer keeps calling createNewPacket to allocate
memory and queue data.
We want to limit createN
Hi Ross
We would like to ask a question about scheduling. Thanks.
We use the following flow to receive data from a client:
MultiFramedRTPSource -> WAVFileSink
If we run one client uploading an audio stream, we receive correct data.
If we run two clients uploading audio streams, the first one will block.
Hi Ross
Got it. Thanks for your help.
> On Apr 6, 2017, at 16:42, Ross Finlayson wrote:
>
>> We are using testOnDemandRTSPServer as the rtsp server.
>> Sometimes, a client loses its connection with the server because of bad wifi.
>>
>> Now, we know the server has a 65-second timeout to keep this connection alive.
>> But, we want to d
Hi Ross
Sorry to interrupt you again. I have a question that needs your hint. Thanks.
We are working on a wifi camera in an environment with a bad wifi signal.
We are using testOnDemandRTSPServer as the rtsp server.
Sometimes, a client loses its connection with the server because of bad wifi.
Now, we know the server has a 65-second timeout to
Hi Ross
Thanks for your reply. I will try to describe it more clearly.
First, our system is a camera; this is how it worked before we added the audio-upload feature.
We use live555 as our rtsp server. When a client wants to stream rtsp,
we create a video/audio source to capture data and send it out.
The server is normal r
Hi Ross,
We are working on uploading an audio stream to an rtsp server.
The server receives the audio data via MultiFramedRTPSource and puts the data into our
WAVSpeakerSink.
We combined WAVAudioFileSource and WAVSpeakerSink to do some tests, and it works well.
(We referred to the sample code, MPEG2TransportStreamIndexe
Hi Ross,
Yes, it works. Thanks a lot.
> On Apr 26, 2016, at 15:49, Ross Finlayson wrote:
>
>> We run an rtsp server based on the OnDemandServerMediaSubsession class.
>> We found that the server always returns the same SDP back to the rtsp
>> client, even when we update the SDP info in createNewRTPSink.
Hi Ross,
We run an rtsp server based on the OnDemandServerMediaSubsession class.
We found that the server always returns the same SDP back to the rtsp
client, even when we update the SDP info in createNewRTPSink. So, my
question is: how do we get the live555 library to update the SDP info? Thanks a lot.
T
Hi Ross,
Thanks for your reply.
Yes, you gave me what I wanted.
1> H264or5VideoStreamFramer's fFrameRate is not the key point.
2> Make sure the rate is correct from the encoder.
3> Make sure the NAL units have the correct rate settings.
Thanks again.
regards, eric, 02/02
> On Feb 2, 2016, at 13:03, Ross Finlays
Dear Ross,
I have a question about using the H264VideoStreamDiscreteFramer to stream H264 data.
While using H264VideoStreamDiscreteFramer, I looked at the code.
In H264or5VideoStreamFramer, there is an fFrameRate; I am sure it is set to the
correct fps (30).
But when I use vlc to play the stream, VLC always shows a fixed fps(
Hi Danielya,
What you are facing now is exactly what I faced before.
Try to check your video and audio timestamps.
That should help you out.
Thanks a lot.
regards, eric, 01/05
On Jan 3, 2016, at 19:53, Daniel Yacouboff <danie...@essence-grp.com> wrote:
Hello there,
I’ve sub-classed some of the
Hi Ross,
Thanks for your quick reply; I will discuss it with my boss.
regards, eric, 12/21
> On Dec 21, 2015, at 16:51, Ross Finlayson wrote:
>
>> It seems live555 does not support SRTP or RTP SAVP/SAVPF, right?
>
> Not yet. However, there is partial support for SRTP in our experimental
> RTSP
Hi Ross,
I checked some information from the Internet.
It seems live555 does not support SRTP or RTP SAVP/SAVPF, right?
If we want live555 to support it, how should we proceed?
Please advise. Thanks a lot.
regards, eric, 12/21
, with the tolerance on that rate 50 parts per
million (ppm)
-> Should we resample the audio source from 16 kHz to 8 kHz, and then encode the
source into ulaw data?
Thanks a lot.
regards, eric, 12/18
From: Eric_Hsieh 謝國達 (HQ)
Sent: December 17, 2015, 8:13 PM
To: LIVE555 Stream
Hi Ross,
Thanks for your quick reply again.
Yes, following your suggestion:
fDurationInMicroseconds = 256 * 1000000 / 16000 = 16000
That 16000 is the same as what I mentioned earlier.
So, I am very confused about the audio vibration.
Thanks a lot.
regards, eric, 12/17
Hi Ross,
Thanks for your quick reply.
Each time, we put 1024 bytes through the ulaw encoder, and it produces 512 bytes of
ulaw data.
Then we copy the 512 bytes to fTo and set fFrameSize to 512.
For fDurationInMicroseconds, we set 16000.
I think it should be right. But I don't know why 16K, stereo audio
Hi Ross,
Thanks for your quick reply.
We have an rtsp server on an embedded system; the rtsp server streams out H264+ulaw.
Now, we face a problem. There is a vlc client connecting to our rtsp server.
If the audio is
A. under 8K, 16bit, mono ulaw (encoded from PCM), we set fDurationInMicrosec
Hi Ross,
Now, we face a weird situation with H264+ulaw streaming.
Please listen to this:
https://dl.dropboxusercontent.com/u/38773854/vibrating_voice.mp4
H264 with 8K, mono, 16bit ulaw streaming is OK. We set fDurationInMicroseconds
to 64000.
But, H264 with 16K, stereo, 16bit ulaw streaming
Hi Daniel,
First, you should ensure the timestamps between your audio and video are in sync.
Then, just put a struct timeval into fPresentationTime (it needs both sec and usec).
Second, ensure your audio duration time is OK; you could refer to the ADTS source code.
I hope this may help you. Thanks.
regards, eric, 11/04
Dear Ross,
From your previous suggestions, if we want to deliver live streaming via
live555,
we should use H264VideoStreamDiscreteFramer and set fDurationInMicroseconds
to 0.
Now, our RTSP server streams H264+AAC and works well with one client.
If there is a second client connecting to our serv
Dear Deanna,
Now, we can get a response from the RR report and drop P frames according to it.
But, there are some issues with it.
First of all, we should explain some things so you understand what we faced:
1. We stream H264/AAC at the same time and get those frames from a ring
buffer (shared memory).
2. A
Dear Deanna Earley,
Maybe that is a good idea to meet what we need.
Let me try it first. Thanks a lot.
regards, eric, 10/07
On Oct 7, 2015, at 16:06, Deanna Earley <dee.ear...@icode.co.uk> wrote:
From: live-devel [mailto:live-devel-boun...@ns.live555.com] On Behalf Of Ross
Finlayson
Sen
Hi Ross,
Thanks for the quick response.
1. We are very sure the fPresentationTime values are accurate.
We suspect a video-blocking issue. It happens with a heavy video bitrate on low
upload bandwidth.
So, does live555 support flow control?
2. Yesterday, we set up a VGA@1fps MJPEG stream to test.
It
Dear Ross,
Thanks for your kind help.
Now, we use H264VideoStreamDiscreteFramer to do live streaming; it is better
than before.
And we are sure the timestamp is correct and in sync with “wall clock time”.
But, we still have some questions and need your help.
1. Now, H264+AAC is working well.
After
Dear Ross,
I am new to live555 and am porting it to our platform.
Now, we are writing an RTSP server based on the testOnDemandRTSPServer sample code.
We created 4 new classes to read H264 and AAC frames from our ring buffer instead
of a file.
Each time doGetNextFrame() is called, the class will deliver a “