I was aware of it by description only. The "explicitly do not segment" part, and 
having to create the index after the file is completed, were game killers for me 
:( .  I needed to segment and build the index in real time so I could restream a 
live source and have pieces time out and go away. Maybe I should have looked at 
it more closely instead.  But, perhaps misguided, when I dug into the multiplexor 
code I found that the PAT and PMT were being added on a fixed interval that had 
nothing to do with segment size or relative keyframe position, something I 
thought was essential for iOS (it is "strongly suggested", LOL). The cleanest 
method for me was to write a muxer for iOS that produces a TS stream in which 
segmenting and indexing are almost trivial.
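
To illustrate the idea (this is only a rough sketch with made-up names, not 
LIVE555 code and not my actual implementation): emit the PAT and PMT right 
before each keyframe that starts a new segment, so every segment is 
independently decodable and the segment boundaries fall out for free.

    // Sketch only: hypothetical helper names, not the LIVE555 API.
    #include <cstdint>
    #include <vector>

    struct TsPacket { uint8_t bytes[188]; };

    class SegmentFriendlyMuxer {
    public:
      explicit SegmentFriendlyMuxer(double targetSegmentSeconds)
        : fTargetSegmentSeconds(targetSegmentSeconds) {}

      // Called once per encoded access unit.
      void onAccessUnit(bool isIDR, double ptsSeconds,
                        std::vector<TsPacket>& out) {
        bool startSegment = fFirstUnit ||
            (isIDR && ptsSeconds - fSegmentStartPts >= fTargetSegmentSeconds);
        if (startSegment) {
          out.push_back(makePAT());  // PSI written at the segment boundary,
          out.push_back(makePMT());  // not on an unrelated timer
          fSegmentStartPts = ptsSeconds;
          fFirstUnit = false;
        }
        packetizePES(isIDR, ptsSeconds, out);
      }

    private:
      // Stubs standing in for real PSI/PES packetization.
      TsPacket makePAT() { return TsPacket{}; }
      TsPacket makePMT() { return TsPacket{}; }
      void packetizePES(bool, double, std::vector<TsPacket>&) {}

      double fTargetSegmentSeconds;
      double fSegmentStartPts = 0.0;
      bool fFirstUnit = true;
    };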

I am afraid I do not know what the trick play index looks like; I thought the 
index you mentioned was the .m3u8 being generated after the fact.  I believe it 
is an external index, and unfortunately I already have an external index and my 
own file format, which is now our "legacy" format for our archives. Can the 
trick play index be created and maintained on the fly, or does it require a 
completed file?

For our live stream, there are no files on disk; it is all rolling virtual 
files in memory. What I gathered from the documentation was that the existing 
implementation was 1) serving on its own port and 2) file based.

In my server, if a user fetches the .m3u8 index and it contains three 5-second 
entries, i.e. segment1.ts, segment2.ts, segment3.ts, and they come back 5 
seconds later for the same index, they get an updated one containing 
segment2.ts, segment3.ts, segment4.ts. It implements a "sliding window", to 
quote the Apple docs.

From: live-devel-boun...@ns.live555.com 
[mailto:live-devel-boun...@ns.live555.com] On Behalf Of Ross Finlayson
Sent: Tuesday, December 27, 2011 8:49 AM
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-devel] invalid ts stream

Sigh...  I think you might not be familiar with the HTTP Live Streaming server 
implementation that we *already* have (and have had for almost 6 months now).  
See:
            http://www.live555.com/mediaServer/#http-live-streaming

Our server can *already* stream a H.264-encoded Transport Stream file via HTTP 
Live Streaming.  We explicitly *do not* segment the file in any way.  Instead, 
we create (automatically, in our server) a playlist to serve to clients.  Each 
entry in the playlist refers (using time-based parameters in the URL) to a 
specific region of the file, but we do not actually segment the file.  Instead, 
our server delivers the appropriate portion of the file automatically.  Our 
only requirement is that the file be indexed (using our normal 'trick mode' 
indexing mechanism) beforehand.

And this works just fine - e.g., with a file made up from Apple's "bipbopgear1" 
Transport Stream example.

However, it *did not* work with a Transport Stream file that I created - using 
our "testH264VideoToTransportStream" demo application - from the 
"CaptureH264.es" file that you provided - ***even after*** I modified our 
"MPEG2TransportStreamMultiplexor" and "MPEG2TransportStreamFromESSource" 
implementations to exactly match what you did in your code.

So, as I said before, until I know for sure what changes, if any, are necessary 
to the "MPEG2TransportStreamMultiplexor" and "MPEG2TransportStreamFromESSource" 
code in order for HTTP Live Streaming to work, I'm going to hold off - at least 
for now - on making any changes to this code.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
