Thanks,
I understand about the Windows thing; FYI, I tried it as a typedef and it
seemed fine.
Matt S.
On 5/9/2013 6:42 PM, Ross Finlayson wrote:
1. config.armlinux has carriage return line feed line endings and
none of the other config files have this.
OK, this will be fixed in the next re
I just thought I'd bring up a couple of minor code issues that I
recently ran into.
1. config.armlinux has carriage return line feed line endings and none
of the other config files have this. Normally I wouldn't care but my
automated build system uses quilt and patch to add -D defines to the
There is one really good reason to support shared libraries, and that is
to make it easier to fulfill the obligations of the LGPL.
The way it is now, if you have a closed-source application using this
library and someone else wants to build their own version of liveMedia
to use with your applica
Very cool, thanks for keeping up on this. Bet it feels good that the
industry is finally moving in the direction that you've been working on
for so long.
Matt S.
On Thursday, November 08, 2012 2:29:30 PM, Ross Finlayson wrote:
Believe me, I've been keeping *very* close track of the WebRTC wor
Are you sure you're calling sendTeardownCommand() from the liveMedia
thread?
Typically I've seen random crashes like this when calling a liveMedia
method from the wrong thread, usually inadvertently.
Matt S.
On Thursday, November 08, 2012 10:26:16 AM, Erlandsson, Claes P
(CERLANDS) wrote:
I'
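For reference, a minimal sketch of one way to follow that advice, assuming the
event-trigger mechanism in TaskScheduler: the trigger is created on the LIVE555
thread, any other thread only calls triggerEvent(), and sendTeardownCommand()
itself runs inside the event loop. The global names and the session handling are
my own assumptions, not code from this thread.

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

static RTSPClient* gRtspClient = NULL;      // created and used only on the LIVE555 thread
static MediaSession* gSession = NULL;       // likewise
static EventTriggerId gTeardownTrigger = 0;

static void teardownFromEventLoop(void* /*clientData*/) {
  // Runs inside the event loop, so calling the RTSPClient here is safe.
  if (gRtspClient != NULL && gSession != NULL) {
    gRtspClient->sendTeardownCommand(*gSession, NULL /*no response handler*/);
  }
}

// Call once on the LIVE555 thread during setup:
void registerTeardownTrigger(TaskScheduler& scheduler) {
  gTeardownTrigger = scheduler.createEventTrigger(teardownFromEventLoop);
}

// Safe to call from any other thread:
void requestTeardown(TaskScheduler& scheduler) {
  scheduler.triggerEvent(gTeardownTrigger, NULL);
}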
Hi Ross,
I was wondering if you had been watching the developments with WebRTC
and its acceptance by a few of the major browser players (Chrome,
Firefox, and Opera).
Based on my first pass looking at it, it looks like the transport is RTP
with another signalling protocol called ROAP instead o
Hi Ross,
We are putting a rather large custom calibration structure (XML text) in
the SDP for our media sessions, and we ran into a limit on the server side.
In the RTSPServer code, fResponseBuffer and fRequestBuffer are both
limited by the #define RTSP_BUFFER_SIZE, which is set to 1.
There are more advanced options, depending on your H.264 encoder. For
example, the TI H.264 encoder supports a couple of features: Temporal
Scalability and Chain Free P Frame encoding.
http://processors.wiki.ti.com/index.php/H.264_DM36x_Ver_2.0_Codec#Scalable_video_coding_.E2.80.93_Temporal
http
Ok cool thanks.
Matt S.
On Friday, September 28, 2012 2:04:43 PM, Ross Finlayson wrote:
I'm not quite sure how it works if fReuseFirstSource is set and one
client requests a TCP connection and another a UDP connection.
Aha! There's actually a bug in the current code that prevents this
from working prop
Hi,
I noticed that you updated the changelog for this release but the
release has not been posted.
Is this an oversight or are you still working on testing, etc?
Thanks,
Matt S.
On Thursday, September 27, 2012 7:12:52 AM, Ross Finlayson wrote:
FYI - the latest release (2012.09.27) of the "LIV
On Wednesday, September 26, 2012 8:04:38 PM, Ross Finlayson wrote:
As part of upgrading our version of LiveMedia, I'm reminded of a
modification that we put in our code and suggested to you but that
apparently never made it into the code.
We put in a way to set an upper limit on the ports available
On 9/27/2012 7:06 AM, Ross Finlayson wrote:
Note that, in contrast, an "OnDemandServerMediaSubsession" object is
the *wrong* thing to be inspecting, because that class represents a
piece of media that can be streamed, possibly several different times
(sequentially or concurrently), to many dif
In looking at our old code, we were looking at fDestinationsHashTable to
determine whether a stream was RTP over TCP or RTP over UDP (for a server
session status widget).
This was done before you put the StreamState and Destinations classes
in the header, and the guy who did it back then added a
Get
Ross,
As part of upgrading our version of LiveMedia, I'm reminded of a
modification that we put in our code and suggested to you but that
apparently never made it into the code.
We put in a way to set an upper limit on the ports available for
RTP/RTCP use. We did this for servers that are behind a
Thank you.
While you're at it, I noticed that the member fDestinationsHashTable of
OnDemandServerMediaSubsession is private, while the class Destinations
is declared in the header with a comment about it being accessible by
subclasses. In addition, many of the methods that interact with
fDestinatio
I'm working on upgrading our code to the latest version of LiveMedia
(our existing version is quite old) and we had subclassed RTSPClient
and implemented a sendSetParametersCommand that sent a SET_PARAMETER
command with multiple parameters set in the body to avoid multiple TCP
round trips.
I
In the 21st century, people use things like smart pointers and the STL,
and memory leaks rarely occur.
Matt S.
On Thursday, September 20, 2012 6:55:36 AM, Звягинцев Антон wrote:
Ross Finlayson writes:
But the best/easiest way to reclaim all of the memory for all of the
LIVE555-allocated objec
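To make that concrete, a small sketch assuming a C++11 compiler; wrapping a
LIVE555 object in a smart pointer with a custom deleter is my own illustration,
not something the library provides.

#include <memory>
#include "liveMedia.hh"

// Deleter that releases a LIVE555 object the way the library expects.
struct MediumCloser {
  void operator()(Medium* m) const { if (m != NULL) Medium::close(m); }
};

// Example usage (the URL and application name are placeholders):
//   std::unique_ptr<RTSPClient, MediumCloser>
//       client(RTSPClient::createNew(*env, "rtsp://example/stream", 1, "demo"));
// When "client" goes out of scope, Medium::close() runs automatically.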
I actually ran into this very same problem a while back. We had a
customer trying to stream live low-frame-rate/low-bit-rate video over a
slow military 3G cellular network and things weren't working. I can't
remember all the details now, but either the client or the server was
hanging up because
Shlomi,
I'm doing pretty much the same thing you are on the same processor
(except with my own app and a live 3Mpix camera source) and I've seen
the same thing, and in my experience the CPU utilization goes up in
direct relationship to the streaming bit rate. For me, anything over
about 20 Mbps
long as part of
the stream with the rest of the TS video information through the Live555
Dynamic RTSP server and onto the client to handle Demuxing the KLV?
Thank you for your help,
AJ
-Original Message-
From: Matt Schuckmann [mailto:m...@schuckmannacres.com]
Sent: Monday, June 04, 2012
I believe the answer is it won't handle it at all.
Last time I checked, the KLV profile for RTP (in IETF draft status) had not
been accepted as a standard, and I don't think it's all that interesting
outside of the MISB standards body, so no effort has been made to make
it part of this library.
Yo
I can actually see using this feature as well.
Matt S.
On Thursday, May 31, 2012 7:21:06 AM, Wiser, Tyson wrote:
More seriously, we need to seek in absolute mode because we would like
to query video according to the time we recorded it (and not a time
relative to the beginning of the RTSP session).
Basically, create an Xcode project for each of the Live555 libraries
yourself.
I did it a while back and it worked fine.
Matt S.
On 4/20/2010 7:42 AM, Denis Z wrote:
Hi!
We would like to use the live555 RTSP client on the iPhone, but we've run
into a problem: we don't know how to build live555 for the iPhone
On 4/1/2010 7:06 PM, Ross Finlayson wrote:
I've found out that when you run a few different instances of RTSP
clients in separate threads, the CSeq number is not increased by one with
each consecutive request.
It's because the CSeq number is a static variable in RTSPClient.
This is a perfect illustrat
Ross, I have to disagree with you a little here.
Shouldn't it be the server's responsibility to provide the best possible
experience to the client, given the network conditions, the content
delivery method, and the content requested by the client?
And if the server can detect that the network is to
We use MJPEG over RTP to send on-demand, high-resolution still images
alongside our video, which is lower-res H.264.
Is there an extension to the MJPEG standard that allows JPEG over 2k x
2k? If so, we'd be interested in implementing it.
Matt S.
Cristiano Belloni wrote:
Ross Finlayson wrote:
Som
Doesn't the watchVariable solution generally require you to set up some
sort of scheduled periodic event (via scheduleDelayedTask()) so that the
event loop is guaranteed to wake up and check it every so often?
For consuming data from a separate thread, I've always preferred skipping
the watch variable a
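For context, a minimal sketch of the pattern being asked about, with a producer
thread that only sets a flag; the names and the 10 ms interval are my own
choices, not anything from the library.

#include "BasicUsageEnvironment.hh"

static char gStopEventLoop = 0;          // watch variable handed to doEventLoop()
static volatile bool gDataReady = false; // set by the producer thread

static void pollForData(void* clientData) {
  TaskScheduler* scheduler = (TaskScheduler*)clientData;
  if (gDataReady) {
    gDataReady = false;
    // ... hand the new data to the appropriate FramedSource here ...
  }
  // Re-arm the poll so the loop is guaranteed to wake up again (every 10 ms here).
  scheduler->scheduleDelayedTask(10000 /*microseconds*/, pollForData, clientData);
}

int main() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);
  scheduler->scheduleDelayedTask(0, pollForData, scheduler);
  env->taskScheduler().doEventLoop(&gStopEventLoop); // runs until the flag is set
  return 0;
}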
I was just trying to import your latest changes into our copy of
liveMedia (actually all your changes from January of this year) and I
found what I think is a typo in BasicTaskScheduler.cpp on line 78.
I think you should change: err = 0;
to: err = EINTR;
The way it is now, the if test on line 88 will
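For readers following along, an illustrative sketch (not the library's actual
code) of the select()/EINTR handling in question: if an interrupted call were
recorded as err = 0, a later test for EINTR could never succeed.

#include <errno.h>
#include <sys/select.h>

// Returns the number of ready descriptors, 0 on timeout or harmless interruption,
// -1 on a genuine error.
int waitForActivity(int maxFdPlus1, fd_set* readSet, struct timeval* timeout) {
  int result = select(maxFdPlus1, readSet, NULL, NULL, timeout);
  if (result < 0) {
    int err = errno;            // remembering the reason matters here
    if (err == EINTR) return 0; // interrupted by a signal: not a fatal error
    return -1;
  }
  return result;
}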
I would never make any assumptions about a key frame every 12 frames. I
don't know exactly what these atoms are but I regularly use FFMPEG and
other encoders to create H.264 streams with very different key frame
intervals.
Matt S.
Ross Finlayson wrote:
OK, I've now released a new version (20
tFrame() is called.
If Pat and I are missing something, could you please fill us in?
Thanks
Matt Schuckmann
iMove Inc.
msch...@imoveinc.com
Georges Côté wrote:
Thank you all for your help.
It looks like the problem is on my side. One engineer, before he left,
started implementing Forward E
Chances are your socket receive buffers in the OS are too small.
Try increasing them with calls to setReceiveBufferTo() or
increaseReceiveBufferTo(); I think you can find examples of this in the
OpenRTSP example, and I think there are some references to this in the
FAQ. Check out the FAQ becaus
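A minimal sketch of the kind of call being suggested, using the helpers declared
in GroupsockHelper.hh and assuming the subsession has already been initiated;
the 2,000,000-byte figure is just an example value, not a recommendation from
the message.

#include "liveMedia.hh"
#include "GroupsockHelper.hh"

void enlargeReceiveBuffer(UsageEnvironment& env, MediaSubsession& subsession) {
  if (subsession.rtpSource() != NULL) {
    int socketNum = subsession.rtpSource()->RTPgs()->socketNum();
    unsigned newSize = increaseReceiveBufferTo(env, socketNum, 2000000);
    env << "RTP receive buffer is now " << newSize << " bytes\n";
  }
}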
Matt Schuckmann wrote:
Furthermore, I'd be very suspicious of even doing what is suggested in
the FAQ.
I know of at least one place in the code where things could go very
wrong:
The RTPInterface.cpp code uses 2 static hash tables when using RTP
over TCP. Should you run 2 separate th
I'm highly suspicious of running live555 in multiple threads, even if you
do follow the FAQ.
The FAQ basically suggests that you can run 2 independent copies of
Live555 in separate threads, and those 2 copies can *NOT* interact except
via global variables. I don't know for sure, but I don't think t
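For context, a minimal sketch of what "two independent copies" means in practice
as I read the FAQ: each thread builds its own TaskScheduler and UsageEnvironment
and never touches the other thread's objects (pthreads are used purely for
illustration).

#include <pthread.h>
#include "BasicUsageEnvironment.hh"

static void* live555Thread(void* /*arg*/) {
  // Each thread owns its own scheduler and environment.
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);
  // ... create clients/servers, sources and sinks using *this* env only ...
  env->taskScheduler().doEventLoop(); // runs forever in this thread
  return NULL;
}

int main() {
  pthread_t t1, t2;
  pthread_create(&t1, NULL, live555Thread, NULL);
  pthread_create(&t2, NULL, live555Thread, NULL);
  pthread_join(t1, NULL); // never returns in this sketch
  pthread_join(t2, NULL);
  return 0;
}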
You have inadvertently pointed out something we missed, and that is how
you *intended* the server to force a shutdown of an active session
or a live feed. We had been simply destroying the RTSPClientSession,
which is what happens when a TEARDOWN message has been received. And
this is how I di
Did you actually read what I said? It's in the line you quoted below.
When the *TEARDOWN* message is sent by the client to the server and the
server is shutting down the session, the server does not successfully
send a BYE message, and it appears that it should. This *is* the behavior
regardles
I understand that it's hard to test bugs on modified code. I'd submit my
modifications to the project, but you've already told me that you won't
accept some of them (I understand your reasons), and I'm not ready to
submit the others. I'm only trying to help you out by reporting what I
see; I hav
thing to do.
The shutdown code for the streams is very confusing, and I'm not sure
how to proceed to get things working the way I'd like; can you make
any suggestions?
Thanks,
Matt S.
Matt Schuckmann wrote:
I've been trying to determine why my Live555 based RTSP client is
never seei
e for the streams is very confusing, and I'm not sure how
to proceed to get things working the way I'd like; can you make any
suggestions?
Thanks,
Matt S.
Matt Schuckmann wrote:
I've been trying to determine why my Live555 based RTSP client is
never seeing the RTCP BYE messages
I've been trying to determine why my Live555-based RTSP client is never
seeing the RTCP BYE messages from the LIVE555 server object (i.e. my BYE
handler is never getting called).
In reading the code, it looks like the server RTCP code always combines
the BYE packet with the SR packet, and it look
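For reference, a minimal sketch of how the client registers the BYE handler in
question; the handler name and the choice of clientData are assumptions.

#include "liveMedia.hh"

static void subsessionByeHandler(void* clientData) {
  MediaSubsession* subsession = (MediaSubsession*)clientData;
  // The server has said BYE for this subsession; schedule teardown/cleanup here.
  (void)subsession;
}

void watchForBye(MediaSubsession& subsession) {
  if (subsession.rtcpInstance() != NULL) {
    subsession.rtcpInstance()->setByeHandler(subsessionByeHandler, &subsession);
  }
}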
We have seen similar problems, i.e. the same port numbers being assigned
to 2 different streams.
Basically, our client would work for several successive instantiations
and shutdowns of the client app, and each time the OS (or whatever it is
that hands out the next available port number) would in
throw away
anything that's not an alpha character (as determined by the isAlpha()
function). Sound reasonable?
Thanks
Matt S.
Matt Schuckmann wrote:
Thanks for the info.
I'm working on writing an RTSPRequest handler class to handle receiving
requests across multiple transport pa
Thanks for the info.
I'm working on writing an RTSPRequest handler class to handle receiving
requests across multiple transport packets and deal with the Content-length
header, and I'll try to roll the ParseRTSPRequestString() code into it.
I'm also going to add some minimal code to ignore interlea
This is sort of related to my last message, mostly because I found the 2
problems at the same time and around the same place in the code.
I'm testing RTP over TCP (client and server are both based on LiveMedia)
and occasionally (perhaps 50% of the time) the server is responding to
the PLAY com
I see a problem with the while loop for detecting the end of an RTSP
command in RTSPServer::RTSPClientSession::incomingRequestHandler1() when
the command is SET_PARAMETER or GET_PARAMETER with actual parameters to
set.
Basically, the while loop looks for and then determines
that
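An illustrative sketch (not the library's parser) of the point being made: a
request that carries a body, such as SET_PARAMETER, is only complete once
Content-Length bytes have arrived after the blank header line, so looking for
the end of the headers alone is not enough. This assumes a NUL-terminated
buffer and that exact header spelling.

#include <string.h>
#include <stdlib.h>

// "buf" holds everything received so far and is NUL-terminated.
bool rtspRequestIsComplete(char const* buf, unsigned bytesReceived) {
  char const* headerEnd = strstr(buf, "\r\n\r\n");
  if (headerEnd == NULL) return false; // headers not finished yet

  unsigned contentLength = 0;
  char const* cl = strstr(buf, "Content-Length:");
  if (cl != NULL && cl < headerEnd) contentLength = (unsigned)atoi(cl + 15);

  // Everything after the blank line is body; wait until all of it has arrived.
  unsigned bodyReceived = bytesReceived - (unsigned)(headerEnd + 4 - buf);
  return bodyReceived >= contentLength;
}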
I noticed that the following lines appear near the end of
RTSPClient::PlayMediaSession but not in RTSPClient::PlayMediaSubsession().
if (fTCPStreamIdCount == 0) { // we're not receiving RTP-over-TCP
// Arrange to handle incoming requests sent by the server
envir().taskScheduler().turnO
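For context, a generic sketch of what that call family arranges, as I understand
it: the RTSP socket is registered with the event loop so a handler runs whenever
the server sends something. The names here are assumptions, not the client's
real internals.

#include "BasicUsageEnvironment.hh"

static void onRtspSocketReadable(void* /*clientData*/, int /*mask*/) {
  // Read and parse the incoming RTSP request from the socket here.
}

void watchRtspSocket(UsageEnvironment& env, int rtspSocketNum) {
  env.taskScheduler().turnOnBackgroundReadHandling(
      rtspSocketNum, onRtspSocketReadable, NULL);
}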
Right, thanks for the feedback.
We'll look into making the changes to our subclasses.
Ross Finlayson wrote:
I've now released a new version (2009.02.13) that includes some, but
not all, of your suggested changes.
A synopsis of the changes is listed below.
1. Modified BasicUsageEnvironment0
Ross Finlayson wrote:
So while I debate the merits of changing RTSPClient, I went ahead
with my original idea of creating a second TCP connection once the
first is taken over for streaming, and by golly it works (with the
server mods I mentioned before). I don't have all the kinks and
corn
Ah yes, now I see how the routing to the correct stream handlers works;
that's a piece of work. It took me a while to see the statics in
RTPInterface.cpp and see how all the routing works.
So once I figured that out, I started to plumb in a callback so that the
SocketDescriptor could notify a non-RT
Matt Schuckmann wrote:
I need to make RTSP commands work when streaming RTP over TCP because
I need to be able to use the SET_PARAMETER and GET_PARAMETER commands
to control my live camera during the session.
The way I thought I understood the problem with the current
implementation is that
In looking at openConnectionFromURL(), I think I see a potential problem
with fBaseURL.
Basically, the first thing this method does is free fBaseURL and then
strDup() the new URL.
Next, this method parses the URL and then checks to see if an input
socket is open and, if not, opens one using the URL.
I need to make RTSP commands work when streaming RTP over TCP because I
need to be able to use the SET_PARAMETER and GET_PARAMETER commands to
control my live camera during the session.
The way I thought I understood the problem with the current
implementation is that once the PLAY command is
ssion id.
The other option would have been to move the definition of StreamState
to a header file and expose that structure to the rest of the code.
We felt that the path we chose was less invasive and more in line with
how the code works now.
Matt S.
Gabriele De Luca wrote:
In answer to Mat
ary and all the support.
Matt Schuckmann
A synopsis of the changes is listed below.
1. Modified BasicUsageEnvironment0::reportBackgroundError() to get the
errorResultMsgBuffer contents via a call to getResultMsg() instead of
accessing the private member directly so that it will work with deri
Ross Finlayson wrote:
I'm wondering what I can do when fNumTruncatedBytes is > 0 on either
the client or server side.
I don't really understand this question. "fNumTruncatedBytes" is a
variable that is *set* (not queried) by each "FramedSource" subclass
(including any new "FramedSource" su
I too have similar requirements, in that a user may need to reconfigure
the server (different port, different sessions, etc.) without shutting
down the main process.
Renato MAURO (Libero) wrote:
Hi Ross.
Right now I'm not sure there's a clean way to delete RTSP sessions
unless they are actu
We had the same problem. We made the RTSPClientSession class protected
and added a protected virtual factory method for creating the
RTSPClientSession objects, and made IncomingRequestHandler1() call it.
I then made my own versions of RTSPServer and RTSPCli
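An illustrative sketch, deliberately not using the library's real classes, of the
factory-method shape described above: the base server calls a protected virtual
creator instead of new-ing the session class directly, so a subclass can
substitute its own session type.

#include <iostream>

// Illustrative classes only; not the library's real RTSPServer/RTSPClientSession.
class ClientSession {
public:
  virtual ~ClientSession() {}
  virtual void handleRequest() { std::cout << "base session handling a request\n"; }
};

class Server {
public:
  virtual ~Server() {}
  void onIncomingConnection() {
    ClientSession* session = createNewClientSession(); // the virtual hook
    session->handleRequest();
    delete session;
  }
protected:
  // Subclasses override this to return their own session subclass.
  virtual ClientSession* createNewClientSession() { return new ClientSession; }
};

class MyClientSession : public ClientSession {
public:
  virtual void handleRequest() { std::cout << "custom session handling a request\n"; }
};

class MyServer : public Server {
protected:
  virtual ClientSession* createNewClientSession() { return new MyClientSession; }
};

int main() {
  MyServer server;
  server.onIncomingConnection(); // prints "custom session handling a request"
  return 0;
}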
I'm wondering what I can do when fNumTruncatedBytes is > 0 on either the
client or server side.
On the server side, the problem arises when my NAL unit is larger than
the buffer allocated when the StreamFramer object is created, and we
don't know how big a NAL might come along.
On the clien
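For the server side, a minimal sketch of the fNumTruncatedBytes contract inside a
hypothetical FramedSource subclass: when the downstream buffer (fMaxSize) is
smaller than the NAL unit, deliver what fits and record how much was dropped.

#include <string.h>
#include "FramedSource.hh"

// Hypothetical source class, shown only to illustrate how the truncation fields
// are meant to be set; a real class would also manage presentation times, etc.
class MyH264Source : public FramedSource {
public:
  MyH264Source(UsageEnvironment& env) : FramedSource(env) {}

  // Assumes getNextFrame() has already been called, so fTo/fMaxSize are valid.
  void deliverNAL(unsigned char const* nal, unsigned nalSize) {
    if (nalSize > fMaxSize) {
      // Downstream buffer is too small: deliver what fits and record the loss.
      fFrameSize = fMaxSize;
      fNumTruncatedBytes = nalSize - fMaxSize;
    } else {
      fFrameSize = nalSize;
      fNumTruncatedBytes = 0;
    }
    memmove(fTo, nal, fFrameSize);
    FramedSource::afterGetting(this); // hand the data to the downstream object
  }

private:
  virtual void doGetNextFrame() { /* wait for the encoder to call deliverNAL() */ }
};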
Is there a way (a method or property in the RTSP server class or
something) to restrict the ports the server can use for RTP/RTCP?
I want to do this because I've got a server sitting behind a
firewall/NAT, and I can set up a block of ports to allow bidirectional UDP
traffic for this purpose. I
Ross Finlayson wrote:
After doing a bit more digging in the code and on the web, I've
discovered that the problem of not receiving RTSP commands after the
PLAY command when using RTP-over-TCP streaming is a known problem
(although it is usually associated with the keep-alive feature
(cough h
I agree the first is very good; I haven't read the others.
Matt S.
Renato MAURO (Libero) wrote:
Hi Ross.
Maybe I'm off topic, but I'd like to know your opinion about these
books:
1)"RTP, Audio and Video for the Internet", Colin Perkins, 2003,
ISBN: 0-672-32249-8
2) "Timecode A Use
). Has any work ever been done in this
direction?
Thanks for listening to my rambling.
Matt S.
Matt Schuckmann wrote:
After doing a bit more digging in the code and on the web I've
discovered that the problem of not receiving RTSP commands after the
play command when using RTP-over-TCP stre
s and discussion are welcome.
I'm still working on figuring out problem 1 from my original post.
Thanks
Matt S.
Matt Schuckmann wrote:
Ok I've got a little more information.
With respect to problem #2 I've discovered that the server is not
receiving or responding to any RTSP
ction it's streaming over to send and receive
further RTSP commands and responses, or does it open up new connections
in the same way as it would if streaming over UDP?
It seems like the latter would work and be easier to implement, but I'm
getting the feeling it's the former.
Thanks,
have would be appreciated.
Any suggestions you might have would be most appreciated; I was hoping
that maybe there is a flag or something that I'm not respecting
(particularly with problem #1). I understand I'm sort of on the hairy
edge here, so I understand if the only clue you
That's pretty much what I thought; you seem pretty knowledgeable in this
area, and I wanted to see if anything popped out at you.
I haven't gotten much of a response from the FFMPEG list, so I think I'm
pretty much on my own.
Thanks
Matt S.
Ross Finlayson wrote:
8. Here is where the 80% comes in. Mos
I'm looking for suggestions on the proper way to signal to the client
that the frame size has changed in an H264 stream.
What I have actually works about 80% of the time.
Here is what I have:
1. The server is encoding camera data in real time using X264 and serving
the data via the Live555 RTSP server
Probably your best bet is to create a managed-to-unmanaged C++ interop
library that exposes the functionality that you need in managed
classes. Your VB.NET application will then be able to import and use
your managed interfaces. This is the way I have approached it.
Matt S.
Jack A Short wrot
Ross Finlayson wrote:
Ross, you had suggested that I perhaps start by implementing a new
virtual method on RTSPServer for SET_PARAMETER.
I looked into doing this in the code, and I realized that RTSPServer
doesn't actually handle most of the commands for each session
instance; it's RTSPServer
or were you
thinking of something else?
Thanks for your input, and the great library
Matt Schuckmann
What's wrong with putting your metadata in another stream that is
time-synchronized with the H264 stream? I believe that is the intended
way to do such a thing in RTP.
Just curious,
Matt S.
Yedidia Amit wrote:
I mean to RTP Header Extens
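For what it's worth, a minimal sketch of that arrangement on the server side: the
video and the metadata become two subsessions of one ServerMediaSession, so they
share the session's presentation-time base. The names are placeholders, the
file-based H.264 subsession is only for illustration, and the metadata
subsession class is hypothetical.

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

ServerMediaSession* buildSessionWithMetadata(UsageEnvironment& env) {
  ServerMediaSession* sms = ServerMediaSession::createNew(
      env, "camera", NULL, "H.264 video plus time-synchronized metadata");

  // The video subsession (file-based here purely for illustration):
  sms->addSubsession(
      H264VideoFileServerMediaSubsession::createNew(env, "video.264", False));

  // The metadata would be added the same way, as a second subsession of the
  // same session; "MyMetadataSubsession" is hypothetical:
  // sms->addSubsession(MyMetadataSubsession::createNew(env));

  return sms;
}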
Thanks for the input; what would be the proper way to go about
implementing this? Do I make the modifications and then submit a patch
back to you, or just implement it in my own build, or pay you to do it,
or...? I didn't see any developer guidelines or how to get to your
CVS/Subversion/whatever
ork.
I've tried to look around to see if anybody else has gone down any of these
routes, but I haven't found much; the RTSP RFC does mention something
about using the SET_PARAMETER method for PTZ control, but it doesn't go
into any great detail. Any comments or suggestions from the expert
measure of "how late" the picture was
> received relative to when VLC calculated it should be displayed. I
> believe the resolution is microseconds, so 25 is .25 seconds late.
> Seems like you might have a problem with your presentation times?
> Unfortunately, that's abo
I'm working on prototyping a live H.264 streaming application using the
LiveMedia RTSP/RTP library and the X264 encoder.
I've written my h264Framer class and integrated it into
testOnDemandRTSPServer, and I can successfully stream the data to the
OpenRTSP test application.
However, when I try