Hi,

I have an application that generates real-time audio, and in a system like this, 
an interruption may cause audio glitches.

Nowadays, on iOS, audio apps can be loaded by other apps. This transition takes 
time and probably consumes some resources too.

When I played my app with the built-in arpeggiator and then switched to 
another app, the notes would stop playing for a while, creating an undesired 
break in the flow.

The curious thing is that if a single note was playing, the sound did not 
break up.

Since the sound rendering engine is the same in both cases, I couldn’t 
understand what was going on.

Then I ran an experiment. The arpeggiator was using a signal to send noteOn 
events to the audio render engine. I decided to call the function directly on 
the synth render engine instead. Guess what: the problem disappeared.

Is the signal/slot mechanism tied to the event loop? How long does it take for 
a signal/slot to be delivered? It might not be significant on a desktop 
computer, but it certainly is on an ARM device, especially in this kind of 
performance-sensitive context.

After this, I’m considering changing crucial functions over to direct 
callbacks to improve performance. Is there any advice you could give me?

Thanks and regards,

Nuno

_______________________________________________
Interest mailing list
Interest@qt-project.org
http://lists.qt-project.org/mailman/listinfo/interest