On Thursday, September 4, 2025 at 2:47:24 PM UTC+2 Steinar H Gunderson wrote:

On Thu, Sep 04, 2025 at 02:34:08PM +0200, Yoav Weiss (@Shopify) wrote: 
> Thanks for the description of the issue! 
> Is there any way to estimate how many sites will be impacted by the 
> slowness here? Do we have a use counter of some sort? For the sites that 
> are impacted, how much slower do they become? Are they still usable? 

Yes and no. We want to roll this out via Finch, which will at least give us 
confidence that it won't have large-scale problems across the web. As for 
individual sites, it really depends on a lot of factors; in particular, 
how big are the DOMs, how many custom properties does each element have, 
and how patient is the user? For most cases, the difference is basically 
zero. If you have 2000+ custom properties on each element (something that 
we don't recommend, but some extreme sites do it nevertheless), you are 
talking about going from 15 seconds to several minutes. 


That sounds equivalent to "completely broken".
 

It is still usable, but the user will probably eventually tire of waiting 
if the wait is long enough. 

It is challenging to make an accurate use counter for this. We could have a 
counter like "you are accessing custom properties on gCS and you have more 
than 500 of them", but it would produce a large number of false positives. 
For instance, this isn't a problem if you also have a small DOM. It's not a 
problem if you see them and immediately skip over them instead of cloning 
them onto the new element (the recommended patches to html2canvas and 
html-to-image do this). Similarly, I suppose "you are using setProperty to 
set more than 500 properties on an element" is going to be very imprecise. 
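The "skip instead of clone" approach described above can be sketched roughly like this (a hypothetical illustration, not the actual html2canvas/html-to-image patch; `shouldCopy` and `copyComputedStyle` are invented names):

```javascript
// Hypothetical sketch: when mirroring a computed style onto a clone,
// skip custom properties rather than reading each one back through
// getComputedStyle, which is the slow path this thread discusses.

// Custom properties all start with "--".
function shouldCopy(propertyName) {
  return !propertyName.startsWith("--");
}

// `computed` is a CSSStyleDeclaration (e.g. from getComputedStyle(source));
// indexing it by position yields the resolved longhand property names.
function copyComputedStyle(computed, target) {
  for (let i = 0; i < computed.length; i++) {
    const name = computed[i];
    if (!shouldCopy(name)) continue; // skip custom properties entirely
    target.style.setProperty(name, computed.getPropertyValue(name));
  }
}
```

With this filter, the cost no longer scales with the number of custom properties per element, which is why the patched libraries avoid the worst case.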

This is why we've been using HTTP Archive to try to quantify the impact 
manually, however imprecisely. And that tells us that while it's common to 
include these libraries, actually hitting them in normal use is not that 
common, at least in our normal as-a-user experience.


I can definitely sympathize with the challenge. But as is, you're 
essentially telling the API owners that an unknown number of websites will 
stop working without warning, and without recourse other than their 
developers (if any) actively coding their site to use a different library. 
IMO, that's a hard sell without more data that would help us better 
quantify the damage.



> Would we have some way of pointing the developers of such sites to 
> snapDOM or other solutions? 

None beyond the bug reports; we've engaged both on the Chromium bug report 
we received as well as bug reports in the relevant upstream libraries 
(including pdf.js, which is popular and uses html2canvas), and our 
experience is that developers have few problems adjusting as long as they 
know what the fix is. 

Of course, they've already had to contend with this slowness for Firefox 
and Safari users; some of them may not care, though. But fixing it for 
M141 will also fix it for users of those browsers. 

/* Steinar */ 
-- 
Homepage: https://www.sesse.net/ 

-- 
You received this message because you are subscribed to the Google Groups 
"blink-dev" group.
To view this discussion visit 
https://groups.google.com/a/chromium.org/d/msgid/blink-dev/0868bb7d-87a9-426f-869a-9a9f4cacd30en%40chromium.org.