Thanks.
Happy New Year!
From: live-devel-boun...@ns.live555.com
[mailto:live-devel-boun...@ns.live555.com] On Behalf Of Ross Finlayson
Sent: Thursday, December 31, 2009 1:07 PM
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-devel] Sending lots
But then, in the case of multiple NALs, the framer or the fragmenter will
try to fragment the data by themselves, which I don't want to happen. I
want to use the NAL size as received from my hardware encoder.
Is there a way, from within a single call to doDeliverFrame, to send all
the NALs (send more t
Let's say that I read all available data from the socket on the first
call to doDeliverFrame, which was triggered by the data arriving at the
socket.
But if I feed the framer only one NAL on the first call to
doDeliverFrame, what will trigger the call to the next doDeliverFrame
(since the s
Thanks Ross,
What do you mean by "treat the data as a set of (>=1) NAL units"? Do
you mean that I should read it and use fNumTruncatedBytes if
needed?
My hardware is already generating NALs with the correct size.
What is the correct way if I want to work with discrete NALs? How can I
force
Hi All,
I am using LIVE555 to stream an H.264 bitstream.
My source produces NALs with a limited size, so each frame may contain
more than one NAL (I receive all of them together from the HW).
After I receive them I create small descriptors describing each
NAL (pointer, size) and send ALL of th
The timestamp clock for video in most cases (it depends on the SDP) is
90,000 ticks per second;
for audio it follows the sample rate: 8000/16000/44100, etc.
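As a worked example (not from the original mail; the 30 fps frame rate and 20 ms audio packets are illustrative assumptions):

  // Video: 90 kHz RTP clock. At an assumed 30 fps, each frame advances the
  // RTP timestamp by 90000 / 30 = 3000 ticks.
  unsigned const videoClock = 90000;
  unsigned const fps = 30;
  unsigned const videoIncrementPerFrame = videoClock / fps;         // 3000

  // Audio: the clock equals the sample rate. At 8000 Hz with assumed 20 ms
  // packets, each packet advances the timestamp by 8000 * 0.020 = 160 ticks.
  unsigned const audioClock = 8000;
  unsigned const audioIncrementPerPacket = audioClock * 20 / 1000;  // 160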
From: live-devel-boun...@ns.live555.com
[mailto:live-devel-boun...@ns.live555.com] On Behalf Of Lee Victor
Sent: Tuesday
1. Not really, but you can use it for that purpose (like I did).
2. I am adding a few template functions for MyVideoRTPSink, which is
derived from H264VideoRTPSink but with extension support.
3. Make sure that your extension header conforms to the standard.
4. Make sure that you are working with n
Hi,
I think that a better solution is to disable the liveness check.
This can easily be done by calling the RTSPServer constructor with
reclamationTestSeconds = 0:
fRTSPServer = RTSPServer::createNew(*fEnv, serverPort,
authDB, reclamationTestSeconds);
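A minimal sketch of that call in context, assuming fEnv, serverPort and authDB are set up elsewhere as in the surrounding server code (the names are taken from the mail):

  unsigned reclamationTestSeconds = 0;   // 0 disables the client-liveness (RTCP-RR) timeout
  fRTSPServer = RTSPServer::createNew(*fEnv, serverPort, authDB, reclamationTestSeconds);
  if (fRTSPServer == NULL) {
    *fEnv << "Failed to create RTSP server: " << fEnv->getResultMsg() << "\n";
  }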
Amit Yedidia.
Hi,
I recall such a problem from my previous workplace.
I am not sure, but I think that the solution was to send the SPS & PPS
in-band.
Amit Yedidia.
From: live-devel-boun...@ns.live555.com
[mailto:live-devel-boun...@ns.live555.com] On Behalf Of Dan DuFeu
Sent: Tue
The best way is called out-of-band:
during the negotiation (RTSP/SIP), the sender sends an SDP description. This
SDP contains the sprop-parameter-sets.
The receiver should get it, decode it (from Base64), and save it at the
beginning of the file.
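A minimal sketch of that receiver-side step, assuming the sprop-parameter-sets string has already been pulled out of the SDP; it uses LIVE555's parseSPropParameterSets() helper to Base64-decode the records and writes them, with Annex-B start codes, to the start of the output file (the function and variable names are illustrative):

  #include <cstdio>
  #include "H264VideoRTPSource.hh"   // declares SPropRecord and parseSPropParameterSets()

  void writeSpsPpsToFileStart(char const* sPropParameterSets, FILE* outFile) {
    unsigned numRecords;
    SPropRecord* records = parseSPropParameterSets(sPropParameterSets, numRecords);
    unsigned char const startCode[4] = { 0x00, 0x00, 0x00, 0x01 };
    for (unsigned i = 0; i < numRecords; ++i) {
      fwrite(startCode, 1, sizeof startCode, outFile);                    // Annex-B start code
      fwrite(records[i].sPropBytes, 1, records[i].sPropLength, outFile);  // SPS or PPS NAL unit
    }
    delete[] records;
  }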
ample do I have to delete rtp_sock & rtcp_sock? Can I
safely delete the rtcp pointer after the server is stopped?
From: live-devel-boun...@ns.live555.com
[mailto:live-devel-boun...@ns.live555.com] On Behalf Of Yedidia Amit
Sent: Monday, August 24, 2009 12:47 PM
To: LIVE555 Streaming Media - developmen
From: live-devel-boun...@ns.live555.com
[mailto:live-devel-boun...@ns.live555.com] On Behalf Of Yedidia Amit
Sent: Monday, August 24, 2009 11:57 AM
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-devel] live object delete sequence
you should call reclaim() on the UsageEnvironment, and then delete the ta
You should call reclaim() on the UsageEnvironment, and then delete the
TaskScheduler:
pEnv->reclaim();
delete pTaskScheduler;
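A sketch of the full shutdown order, assuming the Medium-derived objects are closed first; the variable names here are illustrative:

  // 1. Close the Medium-derived objects (server, sinks, sources) first.
  Medium::close(rtspServer);
  Medium::close(videoSink);
  Medium::close(videoSource);

  // 2. Only then reclaim the environment and delete the scheduler.
  pEnv->reclaim();  pEnv = NULL;
  delete pTaskScheduler;  pTaskScheduler = NULL;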
Amit Yedidia
From: live-devel-boun...@ns.live555.com
[mailto:live-devel-boun...@ns.live555.com] On Behalf Of Stas Desyatnl
Hi all,
Is it possible to disable the behavior whereby the streamer waits for an
RTCP-RR from the receiver and, if none is received within a period of time,
terminates the session?
Regards,
Amit Yedidia
Elbit System Ltd.
Email: amit.yedi...@elbitsystems.com
Tel: 972-4-8318905
---
The problem is that special headers called SPS and PPS (sequence
parameter set and picture parameter set) are not included in the file.
Those headers may be carried in-band (which is probably not your case)
or in the SDP.
My guess is that your source sent them in the SDP and not in-band (in
the RT
Try renaming it to have the extension .264 ("hellovideo-H264-2.264").
That way the VLC player will know to treat it as an Annex-B H.264 stream.
Regards,
Amit Yedidia
Elbit System Ltd.
Email: amit.yedi...@elbitsystems.com
Tel: 972-4-8318905
--
> Hi all,
>
> I know this is not exactly the right place to ask this, so feel free to
> ignore me.
>
> I want to use the task scheduler provided in live555 for (among
> other things) waiting for data on a serial port.
> On Linux it is very easy (it's always easy in Linux...) since the COM
>
Hi all,
I know this is not exactly the right place to ask this, so feel free to
ignore me.
I want to use the task scheduler provided in live555 for (among
other things) waiting for data on a serial port.
On Linux it is very easy (it's always easy in Linux...) since the COM
port is a file descr
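A sketch of the Linux case described above, assuming the serial port is already configured; turnOnBackgroundReadHandling() is normally used for sockets, but on Linux a serial file descriptor can be handed to it the same way (the device name is illustrative):

  #include <fcntl.h>
  #include <unistd.h>
  #include <stdint.h>
  #include "BasicUsageEnvironment.hh"

  static void serialReadHandler(void* clientData, int /*mask*/) {
    // Called from the event loop whenever the serial fd becomes readable.
    int fd = (int)(intptr_t)clientData;
    char buf[256];
    ssize_t n = read(fd, buf, sizeof buf);
    (void)n;  // ... hand the bytes off to the rest of the pipeline ...
  }

  void watchSerialPort(TaskScheduler& scheduler, char const* device) {
    int fd = open(device, O_RDONLY | O_NONBLOCK);   // e.g. "/dev/ttyS0"
    scheduler.turnOnBackgroundReadHandling(fd, serialReadHandler, (void*)(intptr_t)fd);
  }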
Yes.
You can use OnDemandServerMediaSubsession, setting reuseFirstSource = True.
In addition, there is support for multi-unicast.
This will generate an RTP stream toward each client from the same source
(duplicating the packets).
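A sketch of where that flag goes, assuming a recent LIVE555 with H264VideoFileServerMediaSubsession and an illustrative file name; the same reuseFirstSource argument exists on the other OnDemandServerMediaSubsession subclasses:

  Boolean const reuseFirstSource = True;   // all clients share one input source
  ServerMediaSession* sms = ServerMediaSession::createNew(*env, "cam", "cam", "live stream");
  sms->addSubsession(H264VideoFileServerMediaSubsession::createNew(*env, "test.264", reuseFirstSource));
  rtspServer->addServerMediaSession(sms);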
Regards,
Amit Yedidia
Elbit System Ltd.
Email: amit.yedi...@elbitsystems.com
Tel:
thanks :)
Regards,
Amit Yedidia
Elbit System Ltd.
Email: amit.yedi...@elbitsystems.com
Tel: 972-4-8318905
--
Hi All,
I am using the fWatchVariable to notify the event loop to exit (as it
should be).
However, if no connection is available (nothing is waking the loop), this
variable will not be tested and the event loop is blocked.
Is there some "refresh" command to the loop that will force it to wake,
so
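One common workaround (a sketch, not from this thread): schedule a periodic no-op task so the scheduler wakes up regularly and re-tests the watch variable; the 100 ms period and the variable names are arbitrary choices, and env is assumed to be the UsageEnvironment* from main():

  char volatile stopFlag = 0;

  static void wakeUp(void* clientData) {
    TaskScheduler* scheduler = (TaskScheduler*)clientData;
    // Re-arm; the wake-up alone lets doEventLoop() test its watch variable again.
    scheduler->scheduleDelayedTask(100000 /*usec*/, wakeUp, scheduler);
  }

  // ... in main(), before entering the loop:
  wakeUp(&env->taskScheduler());                 // start the periodic wake-ups
  env->taskScheduler().doEventLoop(&stopFlag);   // returns once stopFlag becomes non-zero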
Hi,
I am trying to extend RTSPClientSession to support "SET_PARAMETER".
However, the class is declared as private within the RTSPServer class.
In addition, incomingRequestHandler1, which I should override, is also
declared as private within the RTSPClientSession class.
Should I change both of
The first one is very good.
I didn't read the others.
Regards,
Amit Yedidia
Elbit System Ltd.
Email: amit.yedi...@elbitsystems.com
Tel: 972-4-8318905
--
Hi All,
How can I control the shutdown sequence triggered by CTRL-C when exiting
the executable?
Regards,
Amit Yedidia
Elbit System Ltd.
Email: amit.yedi...@elbitsystems.com
Tel: 972-4-8318905
--
Yes, using MPEG-2 TS (Transport Stream).
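A sketch of how that can look with LIVE555's Transport Stream classes, assuming an existing H.264 elementary-stream framer (h264Source) and an RTP groupsock set up elsewhere; this follows the pattern of the testH264VideoToTransportStream and testMPEG2TransportStreamer demos:

  // Wrap the H.264 elementary stream into an MPEG-2 Transport Stream.
  MPEG2TransportStreamFromESSource* tsStream = MPEG2TransportStreamFromESSource::createNew(*env);
  tsStream->addNewVideoSource(h264Source, 5 /*mpegVersion: 5 => H.264*/);

  // Send the TS over RTP: payload type 33 ("MP2T"), 90 kHz clock.
  RTPSink* tsSink = SimpleRTPSink::createNew(*env, &rtpGroupsock, 33, 90000,
                                             "video", "MP2T", 1, True, False /*no 'M' bit*/);
  tsSink->startPlaying(*tsStream, NULL, NULL);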
Regards,
Amit Yedidia
Elbit System Ltd.
Email: amit.yedi...@elbitsystems.com
Tel: 972-4-8318905
--
Thanks
Zhu yunbin
zhuyunbin
2008-12-15
________
From: Yedidia Amit
Sent: 2008-12-14 23:26:13
To: LIVE555 Streaming Media - development & use
Cc:
Subject: Re: [Live-devel] How to us
First, you should implement an H.264 framer (you can search for the tutorial
republished a few days ago).
Second, you should write your own DeviceSource which will make use of
your proprietary API.
Regards,
Amit Yedidia
Elbit System Ltd.
Email: amit.yedi...@elbitsystems.com
Tel: 972-4-8318905
A few things you must notice:
1. doGetNextFrame() should NOT call deliverFrame() directly; it should only
schedule deliverFrame() in the TaskScheduler, waiting on some socket or the
watch variable.
2. You call startPlaying() only once!!! (when you want to start playing);
it should not be used to notify new data.
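A minimal sketch of that pattern, loosely following the DeviceSource template that ships with LIVE555; the encoder file descriptor and the NAL-copying step are placeholders:

  #include "FramedSource.hh"

  class MyDeviceSource : public FramedSource {
  public:
    MyDeviceSource(UsageEnvironment& env, int encoderFd)
      : FramedSource(env), fEncoderFd(encoderFd) {}

  protected:
    virtual void doGetNextFrame() {
      // Rule 1: don't deliver here; just arrange to be woken when data arrives.
      envir().taskScheduler().turnOnBackgroundReadHandling(fEncoderFd, deliverFrame0, this);
    }

  private:
    static void deliverFrame0(void* clientData, int /*mask*/) {
      ((MyDeviceSource*)clientData)->deliverFrame();
    }

    void deliverFrame() {
      if (!isCurrentlyAwaitingData()) return;   // downstream hasn't asked for a frame yet
      // ... read one NAL from fEncoderFd into fTo (at most fMaxSize bytes),
      //     set fFrameSize, fNumTruncatedBytes and fPresentationTime ...
      FramedSource::afterGetting(this);         // hand the frame to the downstream object
    }

    int fEncoderFd;
  };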
Thanks Ross.
I was thinking the same way but have a problem with the "DESCRIBE". When
sdpLines() is called, no parameter is passed, which makes it very
difficult to distinguish between different clients.
I'm using multicast sessions, which may cause some clients not to send
"SETUP".
any ide
Hi,
Does anybody know how to play the RTP packets? I used live555 to receive
the RTP packets.
Thanks in advance.
On 2008-12-01, "Yedidia Amit" <[EMAIL PROTECTED]> wrote:
Hi,
Following my previous questions I want to creat
Hi,
Following my previous questions, I want to create a situation where one
source is sent to two different multicast addresses.
For that I am using PassiveServerMediaSubsession, where I add both destinations
to the sink.
I want to generate 2 sessions so that whoever asks "DESCRIBE" for the first
will receive
If I write a new SetUpOurSocket() that creates a UDP socket (instead of
TCP), and the RTSP server listens on this socket,
will it work?
Are there any other changes that must be made?
Regards,
Amit Yedidia
Elbit System Ltd.
Email: [EMAIL PROTECTED]
Tel: 972-4-8318905
--
It seems that no one can find this tutorial :)
You can ask what you want and we will try to answer.
Regards,
Amit Yedidia
Elbit System Ltd.
Email: [EMAIL PROTECTED]
Tel: 972-4-8318905
--
Hi All,
Does Live support RTSP over UDP? ("rtspu://.")?
Regards,
Amit Yedidia
Elbit System Ltd.
Email: [EMAIL PROTECTED]
Tel: 972-4-8318905
--
I know. Currently I am working "non-standard". RTP-over-TCP does not fit
due to the overhead. I have very, very strict limitations on my
bandwidth.
Regards,
Amit Yedidia
Elbit System Ltd.
Email: [EMAIL PROTECTED]
Tel: 972-4-8318905
--
Thanks Ross.
1. Does SLIP itself require a full-duplex channel (I don't use RTCP RR nor
TCP)?
2. Yes, I do expect moderate data loss and bit errors. Any ideas?
Regards,
Amit Yedidia
Elbit System Ltd.
Email: [EMAIL PROTECTED]
Tel: 972-4-8318905
---
ay, November 20, 2008 2:33 PM
> To: LIVE555 Streaming Media - development & use
> Subject: Re: [Live-devel] RTP sink to serial
>
> Yedidia Amit wrote:
> > But I am limited to a 1006-byte datagram, which increases the UDP/RTP
> > header overhead.
> >
>
Hi All (again),
I am planning to use LIVE555 to forward RTP packets.
I know I need to build a new subsession for this job, which will connect
a UDPSource with a UDPSink.
My question is: can I use the same groupsock for both receiving and
sending (receive from IP1 and send to IP2),
or do I need 2
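For what it's worth, this is essentially what LIVE555's testRelay demo does, using two separate groupsocks rather than one; a sketch with illustrative addresses and ports, assuming env is the usual UsageEnvironment*:

  // Input side: a groupsock on the address/port we receive from.
  struct in_addr inputAddress; inputAddress.s_addr = our_inet_addr("239.255.43.43");
  Groupsock inputGroupsock(*env, inputAddress, Port(8888), 0 /*ttl: receive only*/);
  FramedSource* source = BasicUDPSource::createNew(*env, &inputGroupsock);

  // Output side: a second groupsock aimed at the forwarding destination.
  struct in_addr outputAddress; outputAddress.s_addr = our_inet_addr("239.255.42.42");
  Groupsock outputGroupsock(*env, outputAddress, Port(4444), 255 /*ttl*/);
  MediaSink* sink = BasicUDPSink::createNew(*env, &outputGroupsock);

  sink->startPlaying(*source, NULL, NULL);   // relays each received datagram unchanged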
But I am limited to a 1006-byte datagram, which increases the UDP/RTP
header overhead.
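If the sink is a MultiFramedRTPSink subclass (the H.264 sink is), its packet size can be capped to respect that limit; a sketch assuming the 1006 bytes cover the whole IP datagram, so IP and UDP headers are subtracted, and with an illustrative sink variable name:

  // Keep each RTP packet small enough that IP(20) + UDP(8) + RTP packet <= 1006 bytes.
  unsigned const maxDatagram = 1006;
  unsigned const ipUdpHeaders = 20 + 8;
  unsigned const maxRtpPacketSize = maxDatagram - ipUdpHeaders;   // 978
  videoRTPSink->setPacketSizes(maxRtpPacketSize, maxRtpPacketSize);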
Regards,
Amit Yedidia
Elbit System Ltd.
Email: [EMAIL PROTECTED]
Tel: 972-4-8318905
--
& use
Subject: Re: [Live-devel] RTP sink to serial
Yedidia Amit wrote:
I want to send RTP packets (RTP header + payload
after packetization) to a serial port.
From my experience with Live I figure that using
socket_pair wil
I think that you should at least attach some errors, so members will be
able to give you more specific answers.
By the way, did you first try to compile it for Windows/Linux (just to get
some experience)? You might discover that your problem is not related to
the DM642.
Regards,
Amit Yedidia
Elbit Sys
Thanks,
Regarding your last comment - I thought I could use 127.0.0.1 or the real
machine IP. Am I missing something?
Regards,
Amit Yedidia
Elbit System Ltd.
Email: [EMAIL PROTECTED]
Tel: 972-4-8318905
--
Hi All,
Is it possible to add more than one multicast address to an RTPSink as
destinations?
Which means that the packet will be transmitted once for each
multicast address?
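One way is to add a second destination to the Groupsock that the RTPSink writes to, before startPlaying(); the addDestination() signature has changed across LIVE555 releases, so treat this as the older-API form, with illustrative addresses and env assumed from the surrounding code:

  struct in_addr dest1; dest1.s_addr = our_inet_addr("239.255.1.1");
  struct in_addr dest2; dest2.s_addr = our_inet_addr("239.255.2.2");

  Groupsock rtpGroupsock(*env, dest1, Port(18888), 255 /*ttl*/);
  rtpGroupsock.addDestination(dest2, Port(18888));   // older API; newer releases also take a session-id argument

  // The RTPSink created on rtpGroupsock then writes each packet once per destination.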
Regards,
Amit Yedidia
Elbit System Ltd.
Email: [EMAIL PROTECTED]
Tel: 972-4-8318905
Hi,
I want to send RTP packets (RTP header + payload after packetization)
to a serial port.
From my experience with Live I figure that using socket_pair will be a
good solution.
In that way I will be able to use the current code to send the data into
the socket, and on the other side I will w
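A sketch of that socket_pair idea on Linux; the serial-port write itself is only indicated, and the fd names are illustrative:

  #include <sys/socket.h>
  #include <unistd.h>

  int sv[2];
  // SOCK_DGRAM preserves RTP packet boundaries across the pair.
  if (socketpair(AF_UNIX, SOCK_DGRAM, 0, sv) < 0) { /* handle error */ }

  int live555Fd = sv[0];  // give this fd to the LIVE555 side as its "network" socket
  int relayFd   = sv[1];  // drain this end and write each packet to the serial port

  unsigned char pkt[2048];
  ssize_t n = recv(relayFd, pkt, sizeof pkt, 0);   // one RTP packet per datagram
  if (n > 0) {
    // write(serialFd, pkt, (size_t)n);   // serialFd: the opened serial device (assumed elsewhere)
  }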
thanks
Regards,
Amit Yedidia
Elbit System Ltd.
Email: [EMAIL PROTECTED]
Tel: 972-4-8318905
--
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of Ross Finlayson
Sent: M
Hi Ross, All.
I want to implement an H264VideoStreamFramer to deliver H.264 NALs.
I also want the same framer to be used whether the source is a
ByteStreamFileSource or a DeviceSource.
I implemented my device source as a readable socket into which the encoder
sends NAL after NAL. (I am receivin
Hi Ross, All
1. Why do the parsers in the Live library work with try and catch?
2. If I am implementing a DeviceSource which delivers H.264 NALs through a
socket (as you recommended), how should I know which NAL is the last NAL of
the access unit (the last NAL in the frame)?
Regards,
Amit Yedidia
Elbit Sys
I mean the RTP header extension, and thank you for your answer.
What I am trying to do is add metadata per frame to an H.264 RFC 3984
stream.
Another way which crossed my mind is to put this metadata in a NAL with an
unspecified NAL type (24-31).
But I need it to interoperate with clients which don't support
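For reference, a sketch of the RFC 3550 header-extension block that would carry such per-frame metadata; the choice of "defined by profile" value and the metadata layout are up to the application (anything here beyond RFC 3550 itself is an assumption):

  #include <stdint.h>

  // RFC 3550 section 5.3.1: when the X bit in the RTP header is set, this block
  // follows the fixed header (and any CSRC list). "length" counts 32-bit words.
  struct RtpHeaderExtension {
    uint16_t definedByProfile;   // profile-specific identifier for the extension format
    uint16_t length;             // number of 32-bit words of extension data that follow
    // uint32_t data[];          // the per-frame metadata itself
  };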
Hi,
What is the proper way to implement adding an RTP extension in Live555?
Thanks,
Amit Yedidia
Hi Ross, All
I noticed that when a framer uses a parser for reading from a
ByteStreamFileSource, an afterGetting call is scheduled immediately on the
TaskScheduler (scheduleDelayedTask) for the same function that initially
called the parser, and an exception is used to return to the framer's
scope.
Why not use an RTP extension?
Regards,
Amit Yedidia
Elbit System Ltd.
Email: [EMAIL PROTECTED]
Tel: 972-4-8318905
--
> -Original Message-
> From: [EMAIL PROTECTED]
> [mailto:[EMAIL PROTECTED] On Behalf Of
> Sigismondo Boschi
>
I think it's because both are trying to use the same port (8554).
Regards,
Amit Yedidia
Elbit System Ltd.
Email: [EMAIL PROTECTED]
Tel: 972-4-8318905
--
Ross,
I am still trying to understand the live capture issue.
I implemented the framer (H264LiveVideoStreamFramer) and the source
(ImxDeviceSource).
Now, when the following chain is called:
H264FUAFragmenter::doGetNextFrame() ->
H264LiveVideoStreamFramer::doGetNextFrame() ->
ImxDeviceSource:
Hi Ross,
I am trying to implement an H264VideoStreamFramer but have some difficulties.
The first time that H264VideoStreamFramer::continueReadProcessing()
runs, it calls fParser->parse(frameDuration).
This call returns with an exception, which according to other examples is
expected.
However
Hi Mike,
I would like to know if there is an updated version of the
H264VideoStreamFramer.cpp you published a few months ago?
Thank you very much.
Regards,
Amit Yedidia
Thanks Ross,
It helps a lot.
So deliverFrame() should handle whatever is needed to get the encoded
data from my encoder,
and doGetNextFrame() should call deliverFrame().
1. Should deliverFrame() block until it acquires new frame data?
2. Who initiates the call:
does the encoder notify th
Hi,
I am implementing a live RTSP video streaming server which captures video
from a live camera, encodes it in hardware, and outputs NAL units.
I noticed the DeviceSource template and have some questions about using
it.
1. Should my flow be
DeviceSource::doGetNextFrame -> H264Frame