On 11/25/2014 10:28 AM, Bobby Holley wrote:
On Sun, Nov 23, 2014 at 4:43 PM, Mark Hammond wrote:
* If it is supposed to be used with a normal return, is changing the code so
that GetPendingResult() is called the correct approach to take? (i.e., should I
open a bug with that as the patch?)
At a high level
On Tue, Nov 25, 2014 at 10:50:45PM -0800, Andreas Gal wrote:
>
> Would it make sense to check in some of the libraries we build that we
> very rarely change, and that don’t have a lot of configure
> dependencies people twiddle with? (icu, pixman, cairo, vp8, vp9). This
> could speed up build times
Would it make sense to check in some of the libraries we build that we very
rarely change, and that don’t have a lot of configure dependencies people
twiddle with? (icu, pixman, cairo, vp8, vp9). This could speed up build times
in our infrastructure and for developers. This doesn’t have to be i
The new XULRunner 34 fails with a "Couldn't load XPCOM" error when running
on Mac OS X. It works fine on Win32, and Firefox 34 works fine too. Does anyone
have any idea what caused this? XULRunner 33 also works fine on Mac OS X. I'm
using OS X Yosemite.
On 11/25/2014 05:45 PM, Reuben Morais wrote:
On Nov 25, 2014, at 13:22, Gijs Kruitbosch wrote:
On 25/11/2014 14:22, rayna...@gmail.com wrote:
I need to get the audio sample data and do some math on it, then play it
through the speaker, with minimal latency (around 20 ms).
Only the WASAPI d
On Sun, Nov 23, 2014 at 4:43 PM, Mark Hammond wrote:
> * Is Components.returnCode expected to be used when the code throws (as
> SessionStore.jsm does) or when the code returns without an exception? (Or
> maybe both?)
>
It doesn't look like it's used much in the tree, but it seems like the one
u
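For concreteness, the pattern in question looks roughly like this in a JS-implemented component (the component and method names below are made up; only Components.returnCode and Components.results are real):

  const Cr = Components.results;

  // Hypothetical JS-implemented XPCOM method called through XPConnect.
  var FooService = {
    doWork: function () {
      if (!this._initialized) {
        // Report a specific nsresult to the caller without throwing a JS
        // exception; XPConnect hands this back as the method's return code.
        Components.returnCode = Cr.NS_ERROR_NOT_INITIALIZED;
        return;
      }
      // ... normal path ...
    },
  };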
Gecko should be able to make 10-40ms audio round trips, including
processing.
It is of course using WASAPI behind the scenes, and the latency will
depend on the audio hardware. Gecko will try to use the lowest latency
possible in any case (both on input and output).
Then again, if you need to wri
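To put rough numbers on that, per-buffer latency is just frames / sampleRate; the buffer sizes below are illustrative, not Gecko's actual defaults:

  // Per-buffer latency in milliseconds: frames / sampleRate * 1000.
  var sampleRate = 48000; // Hz, a common device rate
  [128, 480, 1024].forEach(function (frames) {
    console.log(frames + " frames ≈ " + (frames / sampleRate * 1000).toFixed(1) + " ms");
  });
  // 128 ≈ 2.7 ms, 480 = 10 ms, 1024 ≈ 21.3 ms per buffer; a round trip stacks
  // input buffering, processing and output buffering, hence the 10-40 ms range.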
> On Nov 25, 2014, at 13:22, Gijs Kruitbosch wrote:
>
> On 25/11/2014 14:22, rayna...@gmail.com wrote:
>> I need to get the audio sample data and do some math on it, then play it
>> through the speaker, with minimal latency (around 20 ms).
>>
>> Only the WASAPI driver could allow this.
>
> H
Sorry, the service fell over on the weekend, so I'm missing data for
Sat-Mon. I just posted a report up to and including last Friday, though:
http://brasstacks.mozilla.com/testreports/weekly/2014-11-21.informant-report.html
Daily reports should be uploaded again starting tomorrow:
http://brasstacks
Test Informant report for 2014-11-21.
State of test manifests at revision 5ba06e4f49e8.
Using revision a52bf59965a0 as a baseline for comparisons.
Showing tests enabled or disabled between 2014-11-15 and 2014-11-21.
87% of tests across all suites and configurations are enabled.
Summary
---
On 25/11/2014 14:22, rayna...@gmail.com wrote:
I need to get the audio sample data and do some math on it, then play it
through the speaker, with minimal latency (around 20 ms).
Only the WASAPI driver could allow this.
Have you actually tried using getUserMedia/Web Audio for this? Or are
y
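In case it helps, this is roughly what that suggestion looks like with the standard APIs; the buffer size and the tanh "effect" are placeholders, and a 2014 build would use the prefixed callback form of getUserMedia rather than navigator.mediaDevices:

  navigator.mediaDevices.getUserMedia({ audio: true }).then(function (stream) {
    var ctx = new AudioContext();
    var source = ctx.createMediaStreamSource(stream);
    // 256 frames per callback ≈ 5 ms at 48 kHz; ScriptProcessorNode is the
    // main way to run custom sample math from content JS today.
    var processor = ctx.createScriptProcessor(256, 1, 1);
    processor.onaudioprocess = function (e) {
      var input = e.inputBuffer.getChannelData(0);
      var output = e.outputBuffer.getChannelData(0);
      for (var i = 0; i < input.length; i++) {
        output[i] = Math.tanh(4 * input[i]); // placeholder voice-effect math
      }
    };
    source.connect(processor);
    processor.connect(ctx.destination);
  });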
Hi Anne,
On Tue, Nov 25, 2014 at 9:13 AM, Anne van Kesteren wrote:
>
> > They are doing this with opportunistic encryption (via the
> > Alternate-Protocol response header) for http:// over QUIC from Chrome. In
> >
>
> Or are you saying that
> because Google experiments with OE in QUIC, inclu
> Why do you need to access those drivers/what are you using them for?
Hi,
I need to get the audio sample data and do some math on it, then play it
through the speaker, with minimal latency (around 20 ms).
Only the WASAPI driver could allow this.
The computation on the samples depends on the web p
On Fri, Nov 21, 2014 at 5:44 PM, Patrick McManus wrote:
> On Fri, Nov 21, 2014 at 10:09 AM, Anne van Kesteren
> wrote:
>> Why would they be allowed to use OE?
>
> The reasons why any individual resource has to be http:// and may (or may
> not) be able to run OE vary by resource. Of course only th
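For anyone not following the QUIC side: the advertisement is just a response header on the plain http:// resource, roughly like the following (values illustrative):

  Alternate-Protocol: 443:quic      (the older Google/Chrome header)
  Alt-Svc: h2=":443"; ma=86400      (the standardized Alt-Svc equivalent)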
On Tue, Nov 25, 2014 at 2:28 PM, wrote:
> If I read correctly, asm.js executes code in a sandbox. I don't think a
> sandbox lets me access the WASAPI drivers of the sound card.
Why do you need to access those drivers/what are you using them for?
Cheers,
Dirkjan
Hi,
If I read correctly, asm.js executes code in a sandbox. I don't think a sandbox
lets me access the WASAPI drivers of the sound card.
Best regards,
Philippe
On Tuesday, November 25, 2014 1:37:40 PM UTC+1, Dirkjan Ochtman wrote:
> I think the future-proof solution would be to compile your
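To make the asm.js suggestion concrete: the sandboxed code never touches WASAPI itself; it only does the math, and the audio I/O stays with Web Audio. A rough sketch, assuming a C++ function void processBlock(float*, int) compiled with Emscripten and a ScriptProcessorNode named processor as in the Web Audio snippet elsewhere in this thread (all names are illustrative):

  // cwrap, _malloc and HEAPF32 are standard Emscripten glue; "processBlock"
  // is the hypothetical exported C++ DSP routine.
  var processBlock = Module.cwrap("processBlock", null, ["number", "number"]);

  var FRAMES = 256;
  var ptr = Module._malloc(FRAMES * 4); // 4 bytes per float32 sample
  var heap = new Float32Array(Module.HEAPF32.buffer, ptr, FRAMES);

  processor.onaudioprocess = function (e) {
    heap.set(e.inputBuffer.getChannelData(0));  // copy mic samples into the heap
    processBlock(ptr, FRAMES);                  // run the compiled C++ math
    e.outputBuffer.getChannelData(0).set(heap); // copy the processed block back
  };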
On Tue, Nov 25, 2014 at 11:20 AM, wrote:
> My plugin is a link between a web page and sound processing. I record the
> voice, apply some math, and inject it in real time into the headset.
> It's like a guitar effect, but for the voice.
>
> Do you know how I could interface C++ and the Web?
> On Internet E
On 25/11/2014 10:40, Kan-Ru Chen (陳侃如) wrote:
Hi,
Currently we have many tests that are skipped for various reasons. Do we
have data on which tests run on which platforms? For example, if a test
is accidentally skipped on all platforms, could we identify it?
Kanru
A tool called "Test Informan
Hi,
Currently we have many tests that are skipped for various reasons. Do we
have data on which tests run on which platforms? For example, if a test
is accidentally skipped on all platforms, could we identify it?
Kanru
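For concreteness, the skips in question are manifest annotations like the following (test name and condition invented for illustration); a condition that happens to match every platform is exactly the kind of thing that's easy to miss:

  [test_example.html]
  skip-if = os == "android" || os == "b2g"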
Hi,
I maintain a plugin using NPAPI and I'm looking for a durable solution.
My plugin is a link between a web page and sound processing. I record the voice,
apply some math, and inject it in real time into the headset.
It's like a guitar effect, but for the voice.
Do you know how I could interface C++