Read the RTP/RTCP spec. (specifically the RTCP part). Then look at the
Live555 server code.
Matt S.
On Friday, August 19, 2011 9:09:32 AM, Ivan Maldonado Zambrano wrote:
Hi all,
In a previous mail I got this answer:
Live555 can't reduce the content, which is what determines the bit rate;
Live555 is just the mechanism for sending the content, so it can't
do anything to change the bit rate.
Since Live555 implements RTSP and RTP, and those protocols have some
mechanisms to detect and report bandwidth problems,
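As background on that feedback channel: the loss statistics a sender can react to arrive in RTCP Receiver Reports. The sketch below only illustrates the report-block layout from RFC 3550 (section 6.4.2); it is not Live555 code, and the example bytes are fabricated.

// Reading the loss fields from one 24-byte RTCP Receiver Report block
// (RFC 3550, section 6.4.2). This is what gives an RTP sender its
// bandwidth feedback: "fraction lost" and "cumulative packets lost".
#include <cstdint>
#include <cstdio>

struct ReportBlockStats {
    uint32_t ssrc;            // SSRC of the source this block reports on
    uint8_t  fractionLost;    // lost/expected * 256 since the last report
    uint32_t cumulativeLost;  // 24-bit cumulative packet-loss count
    uint32_t jitter;          // interarrival jitter, in RTP timestamp units
};

// 'block' must point to at least 24 bytes in network byte order.
ReportBlockStats parseReportBlock(const uint8_t* block) {
    ReportBlockStats s;
    s.ssrc = (uint32_t(block[0]) << 24) | (uint32_t(block[1]) << 16) |
             (uint32_t(block[2]) << 8)  |  uint32_t(block[3]);
    s.fractionLost   = block[4];
    s.cumulativeLost = (uint32_t(block[5]) << 16) | (uint32_t(block[6]) << 8) | block[7];
    s.jitter = (uint32_t(block[12]) << 24) | (uint32_t(block[13]) << 16) |
               (uint32_t(block[14]) << 8)  |  uint32_t(block[15]);
    return s;
}

int main() {
    // Fabricated block: 12.5% loss (32/256), 40 packets lost cumulatively.
    uint8_t block[24] = {0xDE,0xAD,0xBE,0xEF, 32, 0,0,40,
                         0,0,0,0, 0,0,0,7, 0,0,0,0, 0,0,0,0};
    ReportBlockStats s = parseReportBlock(block);
    std::printf("fraction lost: %.1f%%, cumulative lost: %u, jitter: %u\n",
                s.fractionLost * 100.0 / 256, s.cumulativeLost, s.jitter);
    return 0;
}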
FYI,
I don't know what he has, but it is possible that the data is encoded
in a separate stream in SMPTE KLV format. This data format is the
standard used by MISB (the Motion Imagery Standards Board), which is an
offshoot of military UAV programs.
They have a standard f
Ok, thanks for the input.
Matt S.
On Wednesday, August 03, 2011 6:51:37 PM, Ross Finlayson wrote:
Yes. Now, the SPS and PPS NAL units are assumed to be in the input
NAL unit stream (and are extracted from there).
Is that a safe assumption? Isn't it optional to include the SPS and PPS
NAL uni
On Wednesday, August 03, 2011 6:14:13 AM, Ross Finlayson wrote:
On Aug 2, 2011, at 6:45 PM, Matt Schuckmannn wrote:
I'm working on upgrading our use of the Live555 RTSP server code to the
latest version of the library; our old version was at least a couple
of years old.
Good heavens;
I'm working on upgrading our use of the Live555 RTSP server code to the
latest version of the library; our old version was at least a couple of
years old.
In the new code it appears that the default behavior is to obtain the
SPS, PPS, etc. from the H.264 fragmenter (see auxSDPLine() for
H264Video
Check out this thread:
http://lists.live555.com/pipermail/live-devel/2011-June/013470.html
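For background on what auxSDPLine() has to end up producing for H.264: per RFC 6184, the "sprop-parameter-sets" attribute is just the SPS and PPS NAL units, base64-encoded and comma-separated, and "profile-level-id" is the hex of SPS bytes 1..3. The sketch below is not Live555 code; the NAL unit bytes and payload type 96 are placeholders.

// Building an RFC 6184-style "a=fmtp" line from SPS/PPS NAL units (sketch).
#include <cstdint>
#include <cstdio>
#include <string>
#include <vector>

static std::string base64(const std::vector<uint8_t>& in) {
    static const char tbl[] =
        "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";
    std::string out;
    for (size_t i = 0; i < in.size(); i += 3) {
        uint32_t v = uint32_t(in[i]) << 16;
        if (i + 1 < in.size()) v |= uint32_t(in[i + 1]) << 8;
        if (i + 2 < in.size()) v |= in[i + 2];
        out += tbl[(v >> 18) & 0x3F];
        out += tbl[(v >> 12) & 0x3F];
        out += (i + 1 < in.size()) ? tbl[(v >> 6) & 0x3F] : '=';
        out += (i + 2 < in.size()) ? tbl[v & 0x3F] : '=';
    }
    return out;
}

int main() {
    // Placeholder SPS/PPS NAL units (including their NAL header bytes).
    std::vector<uint8_t> sps = {0x67, 0x42, 0xC0, 0x1E, 0xDA, 0x02, 0x80, 0xF6, 0x80};
    std::vector<uint8_t> pps = {0x68, 0xCE, 0x38, 0x80};

    char profileLevelId[7];  // profile_idc, constraint flags, level_idc as hex
    std::snprintf(profileLevelId, sizeof profileLevelId, "%02X%02X%02X",
                  sps[1], sps[2], sps[3]);

    std::printf("a=fmtp:96 packetization-mode=1;profile-level-id=%s;"
                "sprop-parameter-sets=%s,%s\n",
                profileLevelId, base64(sps).c_str(), base64(pps).c_str());
    return 0;
}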
I was trying to do something similar to what you are doing, and I think
I also started down the path you are headed; ultimately Ross showed
me a much, much easier way to do what I wanted.
Matt S.
On
Thank you, I will test as soon as I can.
Matt S.
On 7/8/2011 1:48 AM, Ross Finlayson wrote:
Yes, the server code needs to be checking for the "Content-Length"
header, if present (which it will be for "GET_PARAMETER" and
"SET_PARAMETER"). A fix will be coming...
FYI, I have now installed a n
Yes it works fine.
Thanks,
Matt S.
On 6/29/2011 11:17 PM, Ross Finlayson wrote:
The first time BasicTaskScheduler::SingleStep() is called, the fReadSet,
fWriteSet, and fExceptionSet are all empty, select() returns an
error (WSAEINVAL), and the code starting on line 86 of
BasicTaskScheduler.cpp
and I'll make
public any suggestions I have, but I assume you, Ross, will want to come up
with your own solution, and hopefully you'll be quicker than me.
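For anyone hitting the same thing: the Windows-specific part is that Winsock's select() fails with WSAEINVAL when it is given no sockets at all, unlike POSIX select(), which will happily just sleep for the timeout. The sketch below only illustrates a guard for that case; it is not the library's actual fix.

#ifdef _WIN32
#include <winsock2.h>
#include <windows.h>

// Like select(), but copes with all three fd_sets being empty by simply
// sleeping for the requested timeout instead of failing with WSAEINVAL.
int safeSelect(int maxNumSockets, fd_set* readSet, fd_set* writeSet,
               fd_set* exceptSet, timeval* timeout) {
    bool haveAnySocket =
        (readSet   && readSet->fd_count   > 0) ||
        (writeSet  && writeSet->fd_count  > 0) ||
        (exceptSet && exceptSet->fd_count > 0);
    if (!haveAnySocket) {
        if (timeout) Sleep(timeout->tv_sec * 1000 + timeout->tv_usec / 1000);
        return 0;  // nothing was ready because nothing was being waited on
    }
    return select(maxNumSockets, readSet, writeSet, exceptSet, timeout);
}
#endif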
Thanks,
Matt S.
On 6/29/2011 3:14 PM, Matt Schuckmannn wrote:
Looks like there is a bug in RTSPClientSession::handleRequestBytes()
Looks like there is a bug in RTSPClientSession::handleRequestBytes()
when SET_PARAMETER is sent to the server while streaming RTP over TCP
and after the PLAY command has been issued. The problem is that the code
starting on or about line 422 of RTSPServer.cpp assumes that the
message is complete
that openRTSP()
but I don't think I'm doing anything wrong, and my code has worked just
fine for the last 2 years. Is this a bug in the library (maybe the
dummySocket shouldn't be added to the fWriteSet), or should I change the
way my code works?
Thanks,
Matt S.
On 6/29/2011 1
I just updated my code to use the latest Live555 code and I immediately
noticed that my RTSP client application is spinning on the processor,
taking up 100% of the processor it's running on. I tried running
openRTSP against the same server and I don't see the problem. The main
difference between
Worked like a charm, thank you Ross.
Matt S.
On 6/14/2011 2:45 PM, Matt Schuckmannn wrote:
Ah, I think I see what you're saying. I took your words
"ServerMediaSubsession (subclass)" to mean that you were suggesting
that I create an entirely new subclass type for each possible
comb
Ah, I think I see what you're saying. I took your words
"ServerMediaSubsession (subclass)" to mean that you were suggesting that
I create an entirely new subclass type for each possible combination of
parameters, but you meant a new instance of a subclass of
ServerMediaSubsession for each connectio
On 6/14/2011 11:59 AM, Jeff Shanab wrote:
By RTSP you mean as a client?
No, I'm talking about modifying the RTSPServer, which a client will
connect to.
The system often connects and is told by the server what it supports:
DESCRIBE, SETUP, PLAY.
I am talking to security cameras and I am expecti
For example (to use your example string),
"live_video?height=320&width=400&kbps=300&fps=15" would use a
completely separate "ServerMediaSubsession" (subclass) object than
"live_video?height=320&width=400&kbps=300&fps=30".
This seems infeasible, as I want to be able to support any number of
co
Has anyone succeeded in modifying the LiveMedia classes to support
accepting parameters in the URI, something like the following to request
a specific height and width, bitrate, and framerate:
rtsp://media.server.com/live_video?height=320&width=400&kbps=300&fps=15
With an H.264 stream changing
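A sketch of the plumbing such a feature would need, assuming the server can get at the full stream name including its query string; the names below ("StreamParams", "parseStreamName") are hypothetical, not Live555 API.

#include <map>
#include <string>

struct StreamParams {
    std::string baseName;                      // e.g. "live_video"
    std::map<std::string, std::string> query;  // e.g. {"height":"320", ...}
};

// Splits "live_video?height=320&width=400&kbps=300&fps=15" into a base
// name plus key=value pairs that a ServerMediaSubsession could be
// configured from.
StreamParams parseStreamName(const std::string& streamName) {
    StreamParams p;
    size_t q = streamName.find('?');
    p.baseName = streamName.substr(0, q);
    if (q == std::string::npos) return p;

    std::string rest = streamName.substr(q + 1);
    size_t pos = 0;
    while (pos < rest.size()) {
        size_t amp = rest.find('&', pos);
        if (amp == std::string::npos) amp = rest.size();
        std::string pair = rest.substr(pos, amp - pos);
        size_t eq = pair.find('=');
        if (eq != std::string::npos) {
            p.query[pair.substr(0, eq)] = pair.substr(eq + 1);
        }
        pos = amp + 1;
    }
    return p;
}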
Hi, I've had Live555 working on an iPhone as an RTSP client for some time.
I have noticed that sometimes, on some iPhones on the cell
network, I cannot connect to the server. When this happens I've always
been able to shut down my app, launch Mail or Safari to kick the
connection int
FYI, it is possible.
I've had a demo application using Live555 running on the iPhone and iPad
for quite some time. Sadly it's just been a pet project and I haven't
had time to release it or even document how I got it all going.
As I recall it wasn't that hard; I think I ended up using Objective
Dude, he didn't ask you to modify your server; he asked if anybody out
there has successfully made the necessary changes to make it work,
perhaps in their own branch of the project. Since you have made this an
open source project, there is a good chance somebody in the community of
users has don
My approach for this kind of stuff has been to put them in separate
threads (I'd probably create a thread for Live555) and then just use
some sort of thread-safe communication for passing events and data
between the two, e.g. a socket or a thread-safe queue, etc.
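A minimal sketch of that pattern: the Live555 event loop runs on its own thread, and other threads exchange work with it through a thread-safe queue. The Event type and queue here are illustrative; only the doEventLoop()/watch-variable usage mentioned in the comment is standard Live555 practice.

#include <condition_variable>
#include <cstdio>
#include <mutex>
#include <queue>
#include <string>
#include <thread>

struct Event { std::string name; };  // whatever the two sides need to exchange

class ThreadSafeQueue {
public:
    void push(Event e) {
        { std::lock_guard<std::mutex> lk(m_); q_.push(std::move(e)); }
        cv_.notify_one();
    }
    Event pop() {                     // blocks until an event is available
        std::unique_lock<std::mutex> lk(m_);
        cv_.wait(lk, [this] { return !q_.empty(); });
        Event e = std::move(q_.front()); q_.pop();
        return e;
    }
private:
    std::mutex m_;
    std::condition_variable cv_;
    std::queue<Event> q_;
};

int main() {
    ThreadSafeQueue toApp;

    std::thread live555Thread([&] {
        // In a real application this thread would set up the usage
        // environment and then run:
        //   env->taskScheduler().doEventLoop(&stopEventLoop);
        // where stopEventLoop is a watch variable the other thread sets
        // to make the loop return. Here we just simulate one event.
        toApp.push({"frame-received"});
    });

    Event e = toApp.pop();            // application thread consumes events
    std::printf("got event: %s\n", e.name.c_str());
    live555Thread.join();
    return 0;
}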
Matt S.
On 3/22/2010 2:27 AM,
SR() 0x83AA7E80 is always
subtracted from the NTP seconds value, and this caused the reported
presentation time value to wrap around and report something way out in
the future (around 2038).
I really don't know if there is anything better that you could do here,
probably not.
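For reference, the arithmetic involved: 0x83AA7E80 (2,208,988,800) is the number of seconds between the NTP epoch (1900) and the Unix epoch (1970). The sketch below is not Live555 code; it just shows why NTP seconds smaller than that offset wrap around to a date decades in the future.

#include <cstdint>
#include <cstdio>
#include <ctime>

uint32_t ntpSecondsToUnixSeconds(uint32_t ntpSeconds) {
    const uint32_t kNtpToUnixOffset = 0x83AA7E80u;  // seconds from 1900 to 1970
    return ntpSeconds - kNtpToUnixOffset;           // wraps if ntpSeconds < offset
}

int main() {
    uint32_t sane   = ntpSecondsToUnixSeconds(0xD0000000u);  // a plausible NTP time
    uint32_t broken = ntpSecondsToUnixSeconds(1000u);        // a clock that started near zero
    std::time_t t1 = sane, t2 = broken;
    std::printf("sane:   %s", std::ctime(&t1));
    std::printf("broken: %s", std::ctime(&t2));  // prints a date decades in the future
    return 0;
}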
Thanks for listeni
I'm using the liveMedia library to receive an MPEG-4 stream from a FLIR
Thermocam A320 infrared camera.
The problem I'm seeing is that the presentation times coming from the
afterGettingFrame callback start out reasonable, then jump way ahead to
sometime around 2038 after the first SR packet is received
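A client can at least tell when this switch happens: Live555's RTPSource::hasBeenSynchronizedUsingRTCP() reports whether presentation times are still based on the receiver's own clock or have been re-based on the sender's RTCP SR (NTP) clock. The helper below is hypothetical; only the rtpSource() and hasBeenSynchronizedUsingRTCP() calls are library API.

#include <sys/time.h>
#include <cstdio>
#include "liveMedia.hh"   // MediaSubsession, RTPSource

// Sketch: called from a sink's afterGettingFrame handler to label each
// presentation time as pre- or post-RTCP synchronization.
void logPresentationTime(MediaSubsession& subsession,
                         struct timeval presentationTime) {
    bool synced = subsession.rtpSource() != nullptr &&
                  subsession.rtpSource()->hasBeenSynchronizedUsingRTCP();
    std::printf("pts=%ld.%06ld (%s)\n",
                (long)presentationTime.tv_sec, (long)presentationTime.tv_usec,
                synced ? "RTCP-synchronized" : "not yet synchronized");
}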