Love it. How do we make it happen?

- mhoye

On 2017-03-24 1:30 PM, Tom Ritter wrote:
It seems like Subresource Integrity could be extended to do this...
It's specifically for the use case where you kinda trust your CDN
but want to be completely sure.
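(For context, SRI today applies to scripts and stylesheets. A typical use looks something like the following; the URL and digest are illustrative placeholders, not a real resource:)

```html
<!-- The integrity attribute pins a CDN-served script to a specific
     digest; the browser refuses to run it if the hash doesn't match. -->
<script src="https://cdn.example.com/library.js"
        integrity="sha384-oqVuAfXRKap7fdgcCY5uykM6+R9GqQ8K/uxy9rx7HNQlGYl1kPzQho1wx4JwY8wC"
        crossorigin="anonymous"></script>
```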

-tom

On Fri, Mar 24, 2017 at 12:24 PM, Mike Hoye <mh...@mozilla.com> wrote:
My 2006 proposal didn't get any traction either.

https://lists.w3.org/Archives/Public/public-whatwg-archive/2006Jan/0270.html

FWIW I still think it'd be a good idea with the right UI.

- mhoye


On 2017-03-24 1:16 PM, Dave Townsend wrote:
I remember that Gerv was interested in a similar idea many years ago, you
might want to see if he went anywhere with it.

https://blog.gerv.net/2005/03/link_fingerprin_1/
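(If I remember right, Link Fingerprints put the expected digest in the URL fragment rather than in markup, roughly like this; the exact syntax should be checked against the post above:)

```
http://www.example.com/file.zip#hash(md5:deadbeef...)
```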


On Fri, Mar 24, 2017 at 10:12 AM, Gregory Szorc <g...@mozilla.com> wrote:

I recently reinstalled Windows 10 on one of my machines. This involved
visiting various web sites and downloading lots of software.

It is pretty common for software publishers to publish hashes or
cryptographic signatures of their software so that downloads can be
verified. (Often the download is served through a CDN, mirror
network, etc., and you may not fully trust the server operator.)

Unless you know how to practice safe security, you probably don't bother
verifying that downloaded files match the hashes or signatures authors
have provided. Furthermore, many sites redundantly write documentation
explaining how to verify the integrity of downloads. This feels
sub-optimal.

This got me thinking: why doesn't the user agent get involved to help
provide better download security? What my (not a web standard spec author)
brain came up with is standardized metadata in the HTML for the download
link (probably an <a>) that defines file integrity information. When the
user agent downloads that file, it automatically verifies file integrity
and fails the download, pops up a big warning box, etc. if things don't
check out. In other words, this mechanism would extend the trust anchor in
the source web site (likely via a trusted x509 cert) to file downloads.
This would provide additional security over (optional) x509 cert
validation of the download server alone. Having the integrity metadata
baked into the origin site is important: you can't trust the HTTP response
from the download server because it may come from an untrusted server.
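(A rough sketch of what the markup might look like, borrowing the
integrity attribute syntax from SRI. This is hypothetical, not an
existing standard, and the digest below is just a placeholder:)

```html
<!-- Hypothetical: the origin page declares the expected digest of the
     file, and the UA verifies the bytes it fetches from the (possibly
     untrusted) download server against it. -->
<a href="https://downloads.example.com/tool-1.0.tar.gz"
   download
   integrity="sha256-47DEQpj8HBSa+/TImW+5JCeuQeRkm5NMpJWZG3hSuFU=">
  Download tool 1.0
</a>
```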

Having such a feature would also improve the web experience. How many
times have you downloaded a corrupted file? Advanced user agents (like
browsers) could keep telemetry on how often downloads fail integrity
checks. This could be used to identify buggy proxies, malicious ISPs
rewriting content, etc.

I was curious if this enhancement to the web platform has ever been
considered and/or if it is something Mozilla would consider pushing.

gps
_______________________________________________
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform
