Converted to CMake.

Add this to your CMakeLists.txt to make FluidSynth available to your project:

# configure import libs
set(distribution_DIR ${CMAKE_SOURCE_DIR}/../../../../distribution)

### INCLUDE FluidSynth LIBRARY ###

# Get the absolute path to the FluidSynth library directory
get_filename_component(FLUIDSYNTH_DIR
                       ${distribution_DIR}/fluidsynth-android
                       ABSOLUTE)

# Add the FluidSynth library as a subproject. Since FluidSynth is an
# out-of-tree source library we must also specify a binary directory
add_subdirectory(${FLUIDSYNTH_DIR}/android ./fluidsynth)
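
You also still need to link your own native library against FluidSynth. A
rough sketch, assuming your module is called "native-lib" and that the
subproject above exposes a "fluidsynth" target (adjust both names, and the
include path, to match your project):

# Make the FluidSynth headers visible to your sources
# (the include path here is an assumption; point it at the folder that
# contains fluidsynth.h)
target_include_directories(native-lib PRIVATE ${FLUIDSYNTH_DIR}/include)

# Link the FluidSynth target produced by add_subdirectory() above
target_link_libraries(native-lib fluidsynth)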


On Thu, Jan 18, 2018 at 12:58 PM, Philippe Simons <simons.phili...@gmail.com> wrote:

> https://github.com/googlesamples/android-audio-high-performance/tree/master/SimpleSynth
>
> This is a simple synth which generates a sine tone.
>
> Just replace the implementation of int Synthesizer::render with a call to
> one of the fluid_synth_write_*() functions.
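>
> A rough sketch of what that replacement could look like, assuming the
> render() signature from the SimpleSynth sample and a fluid_synth_t *synth_
> member created elsewhere with new_fluid_synth() (adjust the names to your
> code; fluid_synth_write_float() itself is the real FluidSynth call):
>
> #include <fluidsynth.h>
> #include <cstdint>
>
> // Render numFrames frames of interleaved stereo audio into audioData.
> // Assumes channelCount == 2, matching FluidSynth's stereo output.
> void Synthesizer::render(float *audioData, int32_t channelCount,
>                          int32_t numFrames) {
>     (void) channelCount;  // sketch: stereo only
>     fluid_synth_write_float(synth_, numFrames,
>                             audioData, 0, 2,   // left:  offset 0, stride 2
>                             audioData, 1, 2);  // right: offset 1, stride 2
> }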
>
> This project uses CMake to build the Android native libraries, but you
> should be able to use the libfluidsynth.so you built together with the
> headers in the /include folder.
>
> I'll add the CMakeLists.txt to my fork for easier integration...
> eventually.
>
> Philippe
>
> On Thu, Jan 18, 2018 at 12:44 PM, Phil Blandford <philip.blandf...@gmail.com> wrote:
>
>> OK, thanks - it's certainly easier to build, but I'm really not clear
>> where I should be calling fluid_synth_write_* - in a driver I've written
>> myself? Are there any examples I can follow?
>>
>>
>> On 18 January 2018 at 09:14, Philippe Simons <simons.phili...@gmail.com> wrote:
>>
>>> The issue I have with an audio driver on Android is that it will never
>>> cover all the use cases that a dev might need.
>>>
>>> For example:
>>> OpenSL is great for real-time MIDI rendering because it allows you to work
>>> with very small buffers (around 15 ms).
>>> But for a MIDI player, working with small buffers is not an option (if the
>>> device goes to sleep, it won't be able to catch up with the speed required
>>> to fill the buffer), and you need to work with buffers in the 500 ms range
>>> or more with an AudioTrack object.
>>> OpenSL is also old tech on Android; while not deprecated, it's recommended
>>> to use AAudio starting with Android 8.0.
>>>
>>> So my idea was to leave that to the dev; it's really no big deal to call
>>> one of the fluid_synth_write_*() functions to pull the audio, and it also
>>> allows you to modify it before rendering (mixing, filtering, effects, ...).
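>>>
>>> For instance, something along these lines (the function and variable
>>> names here are only illustrative; fluid_synth_write_s16() is the actual
>>> FluidSynth API):
>>>
>>> #include <fluidsynth.h>
>>> #include <cstdint>
>>> #include <vector>
>>>
>>> // Pull `frames` frames of interleaved stereo audio from the synth and
>>> // apply a simple gain before handing the buffer to your output of choice
>>> // (AudioTrack, OpenSL, AAudio, ...).
>>> std::vector<int16_t> pullAudio(fluid_synth_t *synth, int frames, float gain)
>>> {
>>>     std::vector<int16_t> buffer(frames * 2);      // L/R interleaved
>>>     fluid_synth_write_s16(synth, frames,
>>>                           buffer.data(), 0, 2,    // left channel
>>>                           buffer.data(), 1, 2);   // right channel
>>>     for (auto &sample : buffer)                   // post-processing hook
>>>         sample = static_cast<int16_t>(sample * gain);
>>>     return buffer;
>>> }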
>>>
>>> Being free of the glib dependency, my fork is also very light (+/- 250 kB
>>> per CPU arch), and almost up-to-date with
>>> https://github.com/FluidSynth/fluidsynth
>>>
>>> Now it's up to you to use whatever solution fits your needs.
>>> Just my two cents.
>>>
>>> Philippe
>>>
>>> On Thu, Jan 18, 2018 at 5:29 AM, Phil Blandford <philip.blandf...@gmail.com> wrote:
>>>
>>>>
>>>> OK, further progress, once I sorted out my own dumb JNI bugs...
>>>>
>>>> I can load a soundfont, play a MIDI file, but it rarely gets to the end
>>>> - it stops, and won't start again until I restart the test app. The logcat
>>>> is full of:
>>>>
>>>> [ 01-17 22:50:13.890  2651: 2717 D/         ] PlayerBase::stop() from IPlayer
>>>>
>>>> once every millisecond (give or take), regardless of whether a file is
>>>> playing or not.
>>>>
>>>> I'm slowly getting up to speed on both the fluidsynth architecture and the
>>>> Android audio subsystem, but it may be that someone who knows more about
>>>> either or both could make better progress.
>>>>
>>>> The git log on the fluid_opensles.c file seems to indicate it hasn't
>>>> been touched in 2 years. But it does seem tantalisingly close, just a few
>>>> fixes away from being a real boon to Android developers who want to use
>>>> soundfonts in their apps and have more flexible MIDI playback than the
>>>> native MediaPlayer gives them.
>>>>
>>>>
>>>>
>>>>
>>>> > Hi, this is also my first post to the fluidsynth list, so apologies
>>>> in advance for any inadvertent breaches of etiquette!
>>>>
>>>> > I've been struggling to build that same fork for a few days, and did
>>>> manage it in the end. The problems came down to:
>>>>
>>>> <snip>
>>>>
>>>> > Anyway, I've managed to create a basic JNI wrapper and got it working
>>>> in an app - sort of. I can load a soundfont and play a note with
>>>> fluid_synth_noteon/noteoff, but the sound is rather distorted. I can tell
>>>> it to play a file - the logcat shows something is happening, but no sound
>>>> is produced. I'm a bit stuck now as I don't know enough about low-level
>>>> audio to debug.
>>>>
>>>> > I'd really like to hear if anyone has got this up and running.
>>>>
>>>>
_______________________________________________
fluid-dev mailing list
fluid-dev@nongnu.org
https://lists.nongnu.org/mailman/listinfo/fluid-dev
