I went with one thread per RTSP client and regret it from a scaling
perspective. The stack memory needed, plus the overhead of having a thread,
gets to be too much when you get into a large number of streams.
The library is _very_ well written to be non-blocking, and given the speed of
your slowes
On 7/27/2012 5:07 PM, Ross Finlayson wrote:
I'm
puzzled by why so many programmers these days seem afraid (or unaware of
how) to do this.
I think it's a legacy of 1990s Windows and Mac OS: poor IPC, poor
process management, etc. But they had threads, which avoids the need
for either.
Now we've g
> This is fascinating but I can't picture it. What would the design look like
> for multiple processes (one per stream as you describe)?
>
> Are there any examples of this that I can take a look at?
Sure. For a (very) simple example, imagine a shell script like the following:
#! /bin/
Ross,
This is fascinating but I can't picture it. What would the design look
like for multiple processes (one per stream as you describe)?
Are there any examples of this that I can take a look at?
Thanks,
Tim
On Fri, Jul 27, 2012 at 1:23 PM, Ross Finlayson wrote:
> When implementing liveMedia using multiple streams in one process I see two
> choices:
>
> 1. Each stream is kept totally separate, i.e. each stream has its own
> TaskScheduler, UsageEnvironment, and eventLoopWatchVariable, and each
> doEventLoop() runs in a separate thread.
>
> 2. The rtsp
When implementing liveMedia using multiple streams in one process I see two
choices:
1. Each stream is kept totally separate, i.e. each stream has its own
TaskScheduler, UsageEnvironment, and eventLoopWatchVariable, and each
doEventLoop() runs in a separate thread.
2. The rtspClients share
> I am having difficulty understanding the concept for two reasons. First, as
> I said, only the MediaSink class has access to the newest incoming frame.
> How do you envision transferring this new frame to, say, a GUI for
> display?
Because I don't know anything about the GUI software
Hi Ross,
Thank you for your response.
I used the FAQ often when I first started with LiveMedia and the
single-threaded event loop is well described there. Given your response,
does that imply that ultimately I should dispatch new frames to the rest of
the program using the TaskScheduler, Us
I'm not sure I totally understand your question, but it's important to
understand that the "LIVE555 Streaming Media" code is designed to be used
within applications that are structured as a single-threaded event loop.
Therefore, if you're using the LIVE555 code within a separate application, the
Hi Ross,
I was hoping to get your thoughts on how you envision client programs
should use the livemedia library. I have set up my entire system so far
using your examples and an OO C++ approach. My incoming stream is an H.264
stream, so I've subclassed the MediaSink object, calling it H264Packe
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-devel] Design
My problem is that I don't see any relation between
"ServerMediaSubsession" and "MediaSubsession".
That's because there isn't one. "MediaS(ubs)ession" is used by RTP
receivers (clients); "ServerMediaS(ubs)ession" is used by RTSP
servers.
If you want to implement the RTSP "GET_PARAMETER" ope
n NPT, etc.
Should I use the Medium::lookUpByName function or something like that?
Regards,
Ruud
-----Original Message-----
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of Ross Finlayson
Sent: Thursday, 12 June 2008 0:54
To: LIVE555 Streaming Media - development & use
Subject:
I want to add some parameters to the GET_PARAMETER call, like
position. Now I found a function MediaSubsession::getNormalPlayTime
which I could use (I guess), but I have no clue how to access a
MediaSubsession object from the GET_PARAMETER call handler.
You would do this (from the "RTSPClientS
Hi,
I'm just a newbie, so sorry for a stupid question.
I want to add some parameters to the GET_PARAMETER call, like position.
Now I found a function MediaSubsession::getNormalPlayTime which I could
use (I guess), but I have no clue how to access a MediaSubsession object
from the GET_PARAMETER
Thanks for the help and the quick response Ross,
I'll post my findings once I've figured it out,
Ralf
>How would I transfer control of the sending of messages to the
>DirectShow pipeline, i.e. every time a frame arrives in my DirectShow
>filter I want to send an RTP packet, instead of letting the
>aforementioned event loop of the live library schedule events?
You can't - you must use the even
Hi Ross,
Thanks for your previous feedback, it's helped me get up and running quickly ;)
Following your advice I've managed to do the following:
I've written an RTP server and client based on the testMP3Streamer examples in
which I'm just streaming a bunch of random data from the server to the cl