On 2/3/14, 9:13 AM, Boris Zbarsky wrote:
> On 2/3/14 11:48 AM, Gregory Szorc wrote:
>> what's the impact of this on performance?

> It's hard to say without an example of the sort of code whose
> performance we're worried about.

>> because promises involve multiple function calls instead of a single
>> function call.

> Note that in general not all function calls are created equal....
>
> What's generally slow-ish is calls from C++ into JS, whether those
> happen in the DOM or in the JS engine.  And single-callee calls from JS
> into C++ are a bit slower than single-callee calls from JS into JS,
> since the latter can get inlined.
>
> Now whether you're in a single-callee situation, whether your code is
> getting inlined, and whether your code is even getting JIT-compiled
> (e.g. your typical JSM, in many cases, is not) are all a bit unclear
> for the cases we're talking about here.  Again, concrete code examples
> may make this clear.  Or may not.  :(
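
> To make that concrete, here's roughly what the three kinds of call
> look like (somePromise and the specific functions are just stand-ins):
>
>   // JS -> JS, single callee: a JIT-compiled caller can often
>   // inline this away entirely.
>   function double(x) { return x * 2; }
>   var y = double(21);
>
>   // JS -> C++ (an engine or DOM binding): still fast, but not
>   // inlinable in the same way.
>   var n = document.childNodes.length;
>
>   // C++ -> JS: the slow direction; e.g. a C++-implemented promise
>   // invoking its reaction callback.
>   somePromise.then(function (v) { /* entered from C++ */ });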

>> What swayed me towards accepting promises was a) generator magic (via
>> Task.jsm) leading to much more comprehensible and maintainable code

> I should note that if you're using generators, you're clearly not
> _that_ worried about performance, in today's world.  At least for the
> code inside the generator function, since that's not JIT-compiled at
> all so far; it runs in the interpreter.

I am aware of this limitation and talked with a few SpiderMonkey devs
about this concern. I was told something along the lines of "if
generators get popular, the engine will adapt to optimize them." I
understand that may be many months or years away.
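
For reference, the Task.jsm pattern I mean looks roughly like this
(the path and processData() are illustrative):

  Components.utils.import("resource://gre/modules/Task.jsm");
  Components.utils.import("resource://gre/modules/osfile.jsm");

  Task.spawn(function* () {
    // Each yield suspends the generator until the promise settles,
    // so async code reads top to bottom -- but the generator body
    // itself runs in the interpreter today.
    let bytes = yield OS.File.read("/tmp/example.json");
    let stat = yield OS.File.stat("/tmp/example.json");
    processData(bytes, stat.size);
  }).then(null, Components.utils.reportError);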

Also, I believe chrome JS is never JIT-compiled, so not getting JIT
benefits with generators doesn't seem like a concern?

>> and b) knowledge that JS would grow to accept promises natively

> That will still happen, probably.  Of course they'll be C++ objects,
> just like the DOM promises (and just like array objects, etc.).

I have the (possibly incorrect) perception that JS <-> SpiderMonkey
C++ is more efficient than JS <-> other C++ (including DOM code and
especially XPCOM code). I'd love, love, love a brownbag or similar
training materials for JS developer education here.
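
My (again, possibly incorrect) mental model is that there are three
flavors of "JS calling C++", in increasing order of cost:

  var a = [1, 2, 3];
  a.push(4);                   // JS engine builtin

  var title = document.title;  // WebIDL DOM binding: generated glue

  var file = Components.classes["@mozilla.org/file/local;1"]
                       .createInstance(Components.interfaces.nsIFile);
  file.initWithPath("/tmp");   // XPConnect: the most general, slowest path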

>> (presumably meaning that they'd eventually get optimized by the
>> engine).

> That's possible, but which parts are you thinking about specifically?
> Are you worried about the performance of promise creation (which
> involves a call from C++ to JS, passing in the resolve/reject
> functions), or resolve()/reject() calls on a promise, or then() calls
> on a promise, or the cost of promises invoking their callbacks
> (another C++-to-JS call)?
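
> Annotated, with startWork() standing in for whatever kicks off the
> async work:
>
>   var p = new Promise(function (resolve, reject) {
>     // Executor: called from C++ back into JS during creation.
>     startWork(resolve, reject);
>   });
>
>   // then() itself, like resolve()/reject(), is a JS-to-C++ call:
>   // the cheap direction.
>   p.then(function (value) {
>     // Reaction callback: another C++-to-JS call once p settles.
>   });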

I'm worried about pretty much everything. Lots of Firefox features are
now using promises over callbacks for their APIs. I worry about the
explosion of promise usage contributing to a performance problem. I
don't think we'd want to make that worse via excessive C++ bridging.

>> If we switch Promise.jsm to DOM Promises, AFAICT we're moving from
>> promises being 100% JS to involving a bridge to C++/DOM.

> The bridge _to_ C++/DOM is pretty fast (think overhead on the order of
> 20 instructions per call).  The bridge back into JS is a different
> story; that's probably 20x slower.
>
> But again, it really depends on what the 100% JS code did.

>> Doesn't this add overhead and thus regress performance?

> Hard to say without measuring.
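
> E.g. something along these lines (a very rough sketch: it only
> exercises then() chaining, assumes the DOM Promise constructor is
> visible to chrome code, and the two cases should really be run
> separately so they don't interleave):
>
>   var jsm = {};
>   Components.utils.import("resource://gre/modules/Promise.jsm", jsm);
>
>   function race(label, P) {
>     var t0 = Date.now();
>     var p = P.resolve(0);
>     for (var i = 0; i < 100000; i++) {
>       p = p.then(function (v) { return v + 1; });
>     }
>     p.then(function () {
>       dump(label + ": " + (Date.now() - t0) + "ms\n");
>     });
>   }
>
>   race("Promise.jsm (pure JS)", jsm.Promise);
>   race("DOM Promise (C++-backed)", Promise);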