On 10.01.2018 09:11, Дмитрий Гуменюк wrote:
Hi,
In my opinion JSON is more flexible, so the envelope can be rendered in a web 
app or a mobile app.
This filter is useful when you'd like to generate a visual representation of 
the audio while transcoding.

I see, but my point is: if you do the image rendering externally anyway, why not use an existing envelope dumping format? Or alternatively connect the rendering application via the av* library interface in C?

Filters in the codebase should be as generic as possible to allow reuse in combination with other filters.


On 10 Jan 2018, at 09:00, Tobias Rapp <[email protected]> wrote:

On 08.01.2018 01:36, [email protected] wrote:
From: Dmytro Humeniuk <[email protected]>
Signed-off-by: Dmytro Humeniuk <[email protected]>
---
  Changelog                      |   1 +
  libavfilter/Makefile           |   1 +
  libavfilter/af_dumpwave.c      | 273 +++++++++++++++++++++++++++++++++++++++++
  libavfilter/allfilters.c       |   1 +
  libavfilter/version.h          |   4 +-
  tests/fate/filter-audio.mak    |   5 +
  tests/ref/fate/filter-dumpwave |   1 +
  7 files changed, 284 insertions(+), 2 deletions(-)
  create mode 100644 libavfilter/af_dumpwave.c
  create mode 100644 tests/ref/fate/filter-dumpwave
[...]

As far as I can see the filter reads audio and writes an envelope curve in 
JSON format. The JSON data is then rendered into an image outside of 
FFmpeg.

In my opinion it would be better to allow connecting this filter to other 
filters by rendering the image directly within FFmpeg. The filter should then 
have a generic video output instead of the JSON output, similar to the 
existing "showvolume" filter.
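For reference, "showvolume" already produces a generic video stream from audio; a minimal command-line sketch (the input and output file names are placeholders, not from the patch):

```shell
# Render per-channel volume bars as a regular video stream; any downstream
# filter or encoder can consume it. "input.wav"/"volume.mp4" are placeholders.
ffmpeg -i input.wav -filter_complex "showvolume=w=480:h=40" \
       -c:v libx264 -y volume.mp4
```
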

If you just want to get the raw envelope data out of FFmpeg, I suggest taking a 
look at the "write_peak" option of the WAV muxer.
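A hedged sketch of that option (file names are placeholders; "only" writes the peak-envelope chunk without the audio data, while "on" keeps both):

```shell
# Dump BWF peak-envelope data via the WAV muxer's write_peak option;
# "input.wav"/"peaks.wav" are placeholder names.
ffmpeg -i input.wav -write_peak only -f wav -y peaks.wav
```
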

Regards,
Tobias



BTW: Top-posting is unpopular on this list.

Regards,
Tobias

_______________________________________________
ffmpeg-devel mailing list
[email protected]
http://ffmpeg.org/mailman/listinfo/ffmpeg-devel
