Hi there,
I'm using ffmpeg to generate MPEG-DASH and HLS content backed by a single
fragmented MP4 asset, and I've noticed a few inconsistencies in ffmpeg's output
that I wasn't sure about.
For reference, I'm running the following commands:
Generate an fMP4-backed HLS stream:
ffmpeg -i "${FILE}.mp4" -codec copy -hls_time 0.975238095238095
-hls_segment_type fmp4 -hls_flags single_file+append_list -hls_playlist_type
vod "${FILE}_1sec_v7.m3u8"
Generate an fMP4-backed MPEG-DASH/HLS stream:
ffmpeg -i "${FILE}.mp4" -codec copy -f dash -single_file_name
"${FILE}_1sec.m4s" -min_seg_duration 975238.095238095 -hls_playlist 1
"${FILE}_1sec.mpd"
My questions are as follows:
1. The first command above generates an HLS manifest declaring version 7, while
the second generates one declaring version 6 (as in the excerpts above). Why do
they differ when both playlists use the exact same set of features from the HLS
protocol definition? Shouldn't both be HLSv7, the first version in which Apple
supports fMP4-backed segments?
2. Is it possible to use an fMP4 together with a SegmentTemplate/SegmentTimeline
and byte ranges? A single fMP4 has inherent benefits over numerous separate
.m4s files, but today we pay the price of a very long manifest. Why can't ffmpeg
template the byte ranges as well? (See the sketch below for what I mean.)
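As far as I can tell, the single-file MPD currently ends up with one explicit
entry per segment, something like this (ranges and attribute values are
illustrative):

<Representation id="0" ...>
  <BaseURL>${FILE}_1sec.m4s</BaseURL>
  <SegmentList timescale="1000" duration="975">
    <Initialization range="0-1233"/>
    <SegmentURL mediaRange="1234-56788"/>
    <SegmentURL mediaRange="56789-112344"/>
    ... one line per segment ...
  </SegmentList>
</Representation>

With ~1 second segments on a long asset that adds up quickly, whereas a
SegmentTemplate/SegmentTimeline form would stay a few lines long if the byte
ranges could be templated too.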
3. Is ffmpeg planning to support HLSv8 soon? That would allow us to use variable
substitution in HLS, much like we already can with MPEG-DASH templates (a rough
sketch of what I mean is below). Is there an ETA for when this would land in
ffmpeg's trunk?
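For context, this is the sort of thing I have in mind (a sketch of HLS variable
substitution via EXT-X-DEFINE as I read the spec; the variable and file names
are made up):

#EXT-X-DEFINE:NAME="seg",VALUE="video_1sec.m4s"
#EXTINF:0.975238,
#EXT-X-BYTERANGE:56789@1234
{$seg}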
Thanks,
Ronak