xpert at this, so maybe someone can shed more light on the
subject...
From: live-devel-boun...@ns.live555.com
[mailto:live-devel-boun...@ns.live555.com] On Behalf Of Stas Desyatnlkov
Sent: Thursday, October 01, 2009 11:52 AM
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-d
Oops, I actually use
SimpleRTPSink* sink = SimpleRTPSink::createNew(*env, rtpGroupsock, 33, 9,
"audio", "mp2t", 1, True, False);
I see the packets transmitted in Wireshark; the TS PID is 0xC0. VLC can't play the
resulting stream. In addition, the PTS of the RTP packets is always 0.
If I use VLC to st
Hi,
I'd like to use the sample wis-streamer to encode the wav file into AAC and
stream in TS over RTP.
Does the following sequence make sense:
WavFileSrc* pcm = new WavFileSrc(*env, filename);
//Create AAC encoder filter
audioSource = AACAudioEncoder::createNew(*env, pcm, 1, 80
: live-devel-boun...@ns.live555.com
[mailto:live-devel-boun...@ns.live555.com] On Behalf Of Jeremy Noring
Sent: Tuesday, September 29, 2009 11:24 PM
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-devel] H264 multicast streaming question
On Tue, Sep 29, 2009 at 10:43 AM,
e
Subject: Re: [Live-devel] H264 multicast streaming question
On Thu, Sep 24, 2009 at 12:46 AM, Stas Desyatnlkov
<s...@tech-mer.com> wrote:
Hi,
It's obvious that loss of the SPS or PPS results in a lot of grief in H.264
land. The question is what choice do we have in case of no
Hi,
I'd like to add telemetry data along with A/V streams to my TS.
I planned to subclass MPEG2TransportStreamFromESSource and add a method like
addNewTelemetrySource. The telemetry data arrives via serial port on my device
and my source will feed the data to the TS as soon as it's available. Does thi
Hi,
It's obvious that loss of the SPS or PPS results in a lot of grief in H.264
land. The question is what choice do we have in case of no other means of
communication besides RTP?
What if the h264 stream is packed inside TS and receiver is not aware of
anything else?
I guess in case of LAN st
Hi,
I'm having the same problem (trying to stream NAL packets in TS).
So, it looks like there is no way to calculate the timestamps for the MPEG TS
once the AVC NALs are saved into a file. It's probably possible to read the SEI
and SPS NALs and get an idea of the approximate bitrate of the AVC stream a
That's true, the packaging of the NAL units into an MPEG-2 TS looks like a waste of
bandwidth. But this is what the client application expects.
They want us to add video, audio and some custom data into TS and stream it
over to a standard player (Elecard).
There is a standard way to put NAL byte strea
Hi all,
I have a file reader for raw H.264 NAL samples, saved as received from the
encoder. I'd like to stream it (with the correct timestamps) inside the MPEG-2 TS
over RTP.
What is the best way to do it and how do I set the correct timestamps of the TS
packets?
d the streamer correctly or is it?
The question is: will VLC and MPlayer be able to play this correctly once such a
FramedSource sibling is created?
From: live-devel-boun...@ns.live555.com
[mailto:live-devel-boun...@ns.live555.com] On Behalf Of Stas Desyatnlkov
Sent: Monday, August 31, 2009 2:49 PM
To: LI
The problem is the lack of info: I still don't know how to send NAL packets in a
TS correctly. Is there a spec or RFC that can help?
Can I create my own FramedFileSource that reads the NAL packets correctly
and feeds the data to MPEG2TransportStreamFromESSource?
Or should I strip the NAL he
Hi All,
I'm trying to stream (unicast) H.264 from an elementary stream file. The final
goal is to stream NAL packets from a hardware encoder, but this should be easily
adapted once the file streamer works.
The ES file was created by remux software and is playable by MPlayer.
The problem is my s
Media - development & use
Subject: Re: [Live-devel] live object delete sequence
the answer is yes for both ...
From: live-devel-boun...@ns.live555.com
[mailto:live-devel-boun...@ns.live555.com] On Behalf Of Stas Desyatnlkov
Sent: Monday, August 24, 2009 2:20 PM
To
:live-devel-boun...@ns.live555.com] On Behalf Of Stas Desyatnlkov
Sent: Monday, August 24, 2009 12:26 PM
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-devel] live object delete sequence
That is what I'm doing.
Will the rest of the objects get disposed too?
From: live-d
e555.com] On Behalf Of Stas Desyatnlkov
Sent: Monday, August 24, 2009 11:45 AM
To: live-de...@ns.live555.com
Subject: [Live-devel] live object delete sequence
Hi,
In my app I have to create and delete live objects (scheduler, usage
environment, rtp sink, rtcp, video source, streamer...) multiple times.
Hi,
In my app I have to create and delete live objects (scheduler, usage
environment, RTP sink, RTCP, video source, streamer...) multiple times.
What should be the proper delete sequence for the above? Does deleting the
usage environment also destroy the scheduler?
Do I have to delete RTP and RTCP s
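Based on how the live555 test programs shut down, here is a pseudocode-level sketch of one teardown order. The variable names are placeholders from a typical setup; the key points are that `reclaim()` frees the environment but not the scheduler, and that media objects are closed with `Medium::close()`, never `delete`:

```
rtpSink->stopPlaying();        // stop the event-loop streaming first
Medium::close(rtcpInstance);   // RTCP first, so it can send a BYE
Medium::close(rtpSink);
Medium::close(videoSource);    // typically also tears down the framer chain
delete rtpGroupsock;           // groupsocks are plain objects
env->reclaim();                // deletes the usage environment itself...
delete scheduler;              // ...but the scheduler must be deleted separately
```

So no, deleting the usage environment does not destroy the scheduler; they are released in two separate steps.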
Hi All,
I need to find a way to emulate NAL packets in order to send them in a TS. My
hardware encoder will produce NAL units, but as it won't be ready for a while, I'd
like to write my streamer with some file as a source.
I have a TS file with h.264 video and AAC audio.
How do I get the NAL packets from
ct use of 'select' to avoid packet loss in
Linux+BSD; correct use of WSAGetLastError and codes
On Thursday 16 April 2009, 09:41:53, Stas Desyatnlkov wrote:
> [...] But on both systems it
> consumes too much of the CPU. I tried using live555 for the reception of
> 100 audio stream
select() is implemented differently on Windows, but on both systems it consumes
too much CPU. I tried using live555 for the reception of 100 audio
streams (G.711); it consumed about 10% CPU on my 3 GHz machine.
I ended up rewriting it with a simple set of asynchronous sockets; the CPU
consumpt
I will. It's just that there is no clear indication as to how this should be done. On
the other hand, there are commercially available solutions (e.g. the Elecard
demuxer) that receive and handle the AVC + AAC TS.
These solutions are available on Windows and there is no way of telling it that
my implemen
That is exactly the question - how should I modify the
MPEG2TransportStreamMultiplexor code to put the NAL packets correctly into the
resulting TS?
-Original Message-
From: live-devel-boun...@ns.live555.com
[mailto:live-devel-boun...@ns.live555.com] On Behalf Of Ross Finlayson
Sent: Tue
devel-boun...@ns.live555.com
[mailto:live-devel-boun...@ns.live555.com] On Behalf Of Stas Desyatnlkov
Sent: Monday, April 06, 2009 8:39 AM
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-devel] H.264 + AAC + data in TS
Whatever, as long as it's standard. So, any links?
-Or
Whatever, as long as it's standard. So, any links?
-Original Message-
From: live-devel-boun...@ns.live555.com
[mailto:live-devel-boun...@ns.live555.com] On Behalf Of Ross Finlayson
Sent: Monday, April 06, 2009 1:32 AM
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-devel
Hi Ross,
Thanx for the quick response. Before I start reading code, can someone point me
to an RFC or other documentation describing how the NAL units should be packed
into the TS?
I understand that the better way is to just send the NAL units without TS, but
my client has an application that expe
Hi All,
I need to create a streamer for my embedded solution running Linux. The
streamer receives H.264 video and AAC audio packets from hardware encoders and
also a stream of some telemetry data. It should pack all of it into MPEG2 TS
and send it over LAN (no QoS needed).
Questions:
1) I
Hi All,
I need to stream H.264 video from a Linux device to a PC. I wrote both the sender
and receiver parts but would like to comply with the standard way of streaming it,
so:
How should I packetize the video? Should it be NAL packets, or should I put it
into a transport stream?
Regards,
Stas
Hi All,
I have a G.729 sink class based on MediaSink.
1) How can I receive the sender's IP address (I need to distinguish between the
senders to the same multicast address)?
2) I'd also like to receive the sequence number and CSRC identifier at the
sink level.
Any ideas?