I added profiling markers for HTTP requests, JS execution, and synchronous DOM parsing and took a look at some sites with known page-load performance issues. The results have been... illuminating. Besides some clear inefficiencies that became immediately noticeable (reading at most 1kb from HTTP response bodies at a time??), I also just discovered that the first 16 seconds of loading cnn.com are dominated by Servo requesting 356 different web fonts.
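
(For anyone curious what such a marker looks like in the abstract, here is a minimal, hypothetical Rust sketch of the pattern: wrap a unit of work in a closure and record (category, start, duration) samples that a timeline can later be built from. The `Category`, `Recorder`, and `profile` names are made up for illustration and are not Servo's actual profiler API.)

    use std::time::{Duration, Instant};

    #[allow(dead_code)] // JsExecution/DomParsing mirror the other markers mentioned above
    #[derive(Debug, Clone, Copy)]
    enum Category { HttpRequest, JsExecution, DomParsing }

    #[derive(Debug)]
    struct Sample { category: Category, start: Duration, duration: Duration }

    struct Recorder { epoch: Instant, samples: Vec<Sample> }

    impl Recorder {
        fn new() -> Self {
            Recorder { epoch: Instant::now(), samples: Vec::new() }
        }

        // Time `work` and record a sample under `category`.
        fn profile<T>(&mut self, category: Category, work: impl FnOnce() -> T) -> T {
            let start = self.epoch.elapsed();
            let result = work();
            let duration = self.epoch.elapsed() - start;
            self.samples.push(Sample { category, start, duration });
            result
        }
    }

    fn main() {
        let mut recorder = Recorder::new();
        let body = recorder.profile(Category::HttpRequest, || {
            std::thread::sleep(Duration::from_millis(10));
            vec![0u8; 1024] // e.g. a single 1kb read from a response body
        });
        assert_eq!(body.len(), 1024);
        for s in &recorder.samples {
            println!("{:?}: start={:?} duration={:?}", s.category, s.start, s.duration);
        }
    }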

What I'm trying to say is that this tool is an absolute delight to work with and is helping me verify my hunch that speculative HTML parsing is not what is holding us back yet. Thank you very much for creating it!

Cheers,
Josh

On 2016-04-28 3:28 PM, Nick Fitzgerald wrote:
As of https://github.com/servo/servo/pull/10694, you can dump the profiling
data from the `components/profile/time.rs` module as a timeline in a self-contained
HTML file that is easy to share/upload/etc.
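
(As an aside, here is a minimal, hypothetical Rust sketch of the "self-contained HTML" idea: serialize the collected samples as JSON and embed them in a single HTML file so the trace can be shared as one artifact. This is not Servo's actual trace format or viewer; the names and layout are made up for illustration.)

    use std::fs::File;
    use std::io::Write;

    fn dump_trace(path: &str, samples: &[(String, f64, f64)]) -> std::io::Result<()> {
        // Hand-rolled JSON to keep the sketch dependency-free.
        let rows: Vec<String> = samples
            .iter()
            .map(|(cat, start, dur)| format!("{{\"category\":\"{}\",\"startMs\":{},\"durationMs\":{}}}", cat, start, dur))
            .collect();
        let json = format!("[{}]", rows.join(","));

        // Embed the data directly in the page so the file stands alone.
        let mut html = String::from("<!doctype html>\n<html><body><pre id=\"trace\"></pre>\n<script>\n");
        html.push_str(&format!("const trace = {};\n", json));
        html.push_str("// A real viewer would render `trace` as a timeline here.\n");
        html.push_str("document.getElementById('trace').textContent = JSON.stringify(trace, null, 2);\n");
        html.push_str("</script>\n</body></html>\n");

        File::create(path)?.write_all(html.as_bytes())
    }

    fn main() -> std::io::Result<()> {
        let samples = vec![
            ("HttpRequest".to_string(), 0.0, 12.5),
            ("DomParsing".to_string(), 12.5, 40.0),
            ("JsExecution".to_string(), 52.5, 8.0),
        ];
        dump_trace("output.html", &samples)
    }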

Usage:

    $ ./mach run -r -p 5 --profiler-trace-path ~/output.html https://reddit.com/

More info/docs here:
https://github.com/servo/servo/wiki/Profiling#generating-timelines

Screenshot:
https://camo.githubusercontent.com/44afcc7af5e12f5f14726d26b0ec975c19da6e47/687474703a2f2f692e696d6775722e636f6d2f4f6857736d386d2e706e67

Live example:
http://media.fitzgeraldnick.com/dumping-grounds/trace-reddit.html

My hope is that this will be a useful tool for visualizing, at a high level,
where time is spent over the course of a page load.

Cheers!

