On Fri, Jan 16, 2015 at 12:41 AM, Jonas Sicking <jo...@sicking.cc> wrote:

> FWIW, a difference in load time of 100ms is quite big. Websites like
> Amazon have measured significant changes in clickthrough rates when
> they have experimentally increased load time by 100ms.
>

I believe that significant resources have been dedicated to the general
problem, and for far smaller gains than that.  I once had access to some
numbers on this, and improvements in the low tens of milliseconds would
easily justify the cost of a data center.  I don't have specific figures,
but I believe that 3ms was considered enough cause for a large company to
spend a fairly shocking amount of money.

100ms is an absurdly large improvement.


> That said, privacy is definitely very important. But given that this
> has gone through privacy review by the Mozilla privacy team, I'll trust
> that this feature has been implemented with privacy in mind.


How much browsing history a passive network observer could recover from
this feature alone is hard to say.  The fact that we spray DNS requests for
every <a href=""> on a page is probably a worse leak...or, if you think
about it a little more, a source of excellent k-anonymity.  Good luck
recovering any signal when there is that much noise.  Critically, one-off
or infrequent navigation events won't trigger the heuristic.
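
To make the noise point concrete, here is a minimal sketch -- plain Python,
not anything Firefox actually runs -- of counting how many distinct
hostnames a single page would hand to a DNS prefetcher if every <a href="">
triggered a lookup.  The sample markup is a placeholder; feed it any saved
page instead.

from html.parser import HTMLParser
from urllib.parse import urlparse

class AnchorHostCollector(HTMLParser):
    """Collects the hostnames referenced by <a href=""> elements."""
    def __init__(self):
        super().__init__()
        self.hosts = set()

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href") or ""
        host = urlparse(href).hostname
        if host:
            self.hosts.add(host)

# Placeholder document standing in for a real content page.
sample_html = """
<a href="https://example.com/a">one</a>
<a href="https://news.example.org/b">two</a>
<a href="https://example.com/c">three</a>
"""

collector = AnchorHostCollector()
collector.feed(sample_html)
print(len(collector.hosts), "distinct hosts would get speculative DNS lookups")

On a typical link-heavy page this yields a long list of hosts for a single
load, which is the noise any signal would have to be picked out of.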
