> I can't find any example of how to use "ByteStreamMultiFileSource"; can you help me
> please!
"ByteStreamMultiFileSource::createNew()" takes a NULL-terminated array of file
names. E.g.,
char const** ourFileNames = new char const*[3+1];
ourFileNames[0] = "ourFile0.ts";
ourFileNames[1] = "ourFile1.ts"; // example file names
ourFileNames[2] = "ourFile2.ts";
ourFileNames[3] = NULL; // the array must be NULL-terminated
> I have used the two "MPEG2TransportStreamIndexer" and
> "testMPEG2TransportStreamTrickPlay" test programs to index a TS file
> ("testrec.ts"), and then trickplay it with 4x speed:
> MPEG2TransportStreamIndexer testrec.ts
> The index file "testrec.tsx" is created (This file is uploaded here for
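(For reference, the full two-step invocation would look roughly like the following; the 4x output file name and the exact argument order are assumptions to check against testMPEG2TransportStreamTrickPlay's own usage message, not details from the original post:)

MPEG2TransportStreamIndexer testrec.ts
testMPEG2TransportStreamTrickPlay testrec.ts 4 testrec-4x.ts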
Thank you Ross and Chris. It works!! :)
From: live-devel [mailto:live-devel-boun...@ns.live555.com] On Behalf Of
Ross Finlayson
Sent: Thursday, May 01, 2014 10:57 PM
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-devel] Frames are corrupted
On May 1, 2014, at 9:58 AM,
It sounds to me like the OP is trying to write a DirectShow source filter,
which she failed to mention.
If that is the case, I'm not sure if reading from stdin would work.
Rather, she should run the live555 code in its own thread and then pass the
incoming media samples on to the DirectShow source filter.
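(A rough sketch of that "own thread" approach, assuming a std::thread wrapper and a watch variable are acceptable in the surrounding code; the hand-off to the filter is only indicated in comments and is not LIVE555 or DirectShow API:)

#include <thread>
#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

static char volatile stopLive555 = 0; // set to non-zero to make doEventLoop() return

static void live555Worker() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  // ... create the RTSP/RTP receiving chain here; in each "after getting frame"
  //     callback, copy the sample into a thread-safe queue that the DirectShow
  //     source filter's worker thread drains ...

  env->taskScheduler().doEventLoop(&stopLive555); // blocks on this worker thread

  env->reclaim();
  delete scheduler;
}

// From the filter: std::thread t(live555Worker); ... stopLive555 = 1; t.join();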
On May 1, 2014, at 9:58 AM, Vikram Singh wrote:
> Hi Ross,
> I am not able to get SPS and PPS units from the encoder.
> I am using CUDA Video Encode library which has a function NVGetSPSPPS().
[...]
>>
>> Sorry I am not getting the detailed documentation for the function
>> NVGetSPSPPS(). Ple
Hello,
> 00 24 67 4d 40 1e f6 04 00 83 7f e0 00 80 00 62 00 00 07 d2
> 00 01 d4 c1 c0 00 00 27 a1 20 00 02 62 5a 17 79 70 50 00 04 68 ee 3c 80
> Total of 44 bytes.
It looks like the buffer is formatted with the 16-bit length of the NAL
unit, then the NAL unit data. So you can separate the SPS and PPS by reading the
2-byte length, taking that many bytes as the NAL unit, and then repeating for the
next one.
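(A hedged sketch of that parsing step, assuming the layout really is a 16-bit big-endian length followed by the NAL unit, repeated; the function and variable names are illustrative only:)

#include <cstdint>
#include <cstdio>

// Walks a buffer laid out as: [16-bit length][NAL unit][16-bit length][NAL unit]...
// For the 44-byte dump above this yields a 36-byte SPS (type 7, first byte 0x67)
// and a 4-byte PPS (type 8, first byte 0x68).
static void splitLengthPrefixedNALs(uint8_t const* buf, size_t len) {
  size_t pos = 0;
  while (pos + 2 <= len) {
    size_t nalSize = (buf[pos] << 8) | buf[pos + 1]; // 16-bit big-endian length
    pos += 2;
    if (nalSize == 0 || pos + nalSize > len) break;  // malformed or truncated
    unsigned nalType = buf[pos] & 0x1F;              // low 5 bits of the first NAL byte
    std::printf("NAL type %u, %zu bytes\n", nalType, nalSize);
    // nalType 7 => SPS, nalType 8 => PPS; copy buf+pos .. buf+pos+nalSize as needed
    pos += nalSize;
  }
}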
Hi Ross,
I am not able to get SPS and PPS units from the encoder.
I am using CUDA Video Encode library which has a function NVGetSPSPPS().
NVGetSPSPPS() returns a buffer containing the SPS and PPS.
The problem is that I don't know the formatting of this buffer, so I can't
separate the SPS and PPS units from it.
> I don't really understand the role of this function:
> env->taskScheduler().doEventLoop(); ?
This is explained in the FAQ
http://www.live555.com/liveMedia/faq.html#control-flow
LIVE555-based applications are event-driven, using an event loop.
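(To make the control flow concrete, a minimal sketch of such an event loop; the watch variable shown here is optional, and the rest of the setup is assumed rather than taken from the original message:)

#include "BasicUsageEnvironment.hh"

char volatile stopEventLoop = 0; // set non-zero (e.g. from a scheduled task) to leave the loop

int main() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  // ... create sources/sinks and call startPlaying() here; nothing actually
  //     happens until the event loop below starts dispatching events ...

  // doEventLoop() blocks: all network reads, frame deliveries and delayed tasks
  // run from inside this call. It returns only when the watch variable becomes non-zero.
  env->taskScheduler().doEventLoop(&stopEventLoop);

  env->reclaim();
  delete scheduler;
  return 0;
}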
> So, to put the stream in my filter at t
>
>> Actually, I want to do something like the testMPEG2TransportReceiver that I
>> find in the "LIVE555 Streaming Media" testProgs. This code reads an MPEG
>> Transport/RTP stream (from the same multicast group/port), and outputs the
>> reconstituted MPEG Transport Stream to "stdout". I