Harden Chrome against browser fingerprinting
Project Member Reported by firstname.lastname@example.org, Jul 14 2010
In the recent WebKit bug https://bugs.webkit.org/show_bug.cgi?id=41801, a discussion was started about reducing the attack surface for browser fingerprinting. That bug has a list of browser parameters suitable for fingerprinting. We have pondered this topic repeatedly with the MUC team and came to the conclusion that reducing variance in these parameters to an extent that prevents fingerprinting is not possible without severely degrading the user experience. This is because some measures, such as sorting font and plugin lists, don't actually reduce entropy significantly. Furthermore, there are more sophisticated fingerprinting techniques for which we do not yet know a good prevention approach. One example is testing for the presence of fonts by rendering strings and measuring the resulting DOM elements.
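To make the entropy argument concrete, here is a rough back-of-the-envelope sketch in Python. The numbers (500 probed fonts, ~100 installed) are assumptions for illustration, not measurements: even if Chrome sorted the reported font list and destroyed all ordering information, the *membership* of the list alone could still carry far more identifying bits than are needed to single out any one person.

```python
import math

# Assumed numbers, for illustration only: a fingerprinting script probes
# 500 candidate fonts and finds ~100 of them installed on this machine.
# Sorting the list removes order information, but the set of installed
# fonts still distinguishes up to C(500, 100) possible machines.
membership_bits = math.log2(math.comb(500, 100))

# ~33 bits suffice to uniquely identify one person among ~8 billion.
unique_person_bits = math.log2(8e9)

print(f"font membership alone: up to ~{membership_bits:.0f} bits")
print(f"needed to single out anyone on Earth: ~{unique_person_bits:.1f} bits")
```

The upper bound assumes every subset of fonts is equally likely, which real font distributions are not, but the gap is large enough that the qualitative conclusion survives: sorting the list does not meaningfully reduce the fingerprint.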
Jul 15 2010,
Adding security@ to cc for comment.
Aug 19 2010,
Oct 20 2010,
Releasing task while I'm on leave.
Jun 28 2011,
May 11 2012,
Aug 13 2012,
Issue 142214 has been merged into this issue.
Feb 7 2013,
What about aggressively hardening in an Incognito window? Users might be more willing to tolerate occasional degradation of the user experience if it's done in a context where they've clearly indicated that privacy is more valuable to them. Similarly, what about hardening different-origin iframes more than a window or tab that was explicitly navigated to? We should carefully examine use cases, but again, the tradeoffs might be reasonable for users. Finally, what if we turn the tables and track sites/domains that are tracking users? Clues might be:
* Checking fonts / plugins without using them
* Checking screen size without ever opening a pop-up window or resizing a window
We could do this securely by anonymously uploading hashes of domains that might be tracking, correlating that with other users' experience, then warning users of sites that appear to be tracking them.
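The hash-and-correlate idea above could be sketched as follows. This is purely illustrative of the comment's proposal, not an actual Chrome mechanism; the domain names are hypothetical, and note that a plain hash of a low-entropy input like a domain name can be reversed by dictionary lookup, so it is pseudonymous rather than truly anonymous.

```python
import hashlib
from collections import Counter

def report_token(domain: str) -> str:
    # Hash a suspected tracker domain before upload, so the raw report
    # only reveals the domain to someone who already has it in a list.
    return hashlib.sha256(domain.lower().encode("utf-8")).hexdigest()

# Client side: heuristics (e.g. "checked fonts without using them")
# flagged these hypothetical domains during a browsing session.
uploads = [report_token(d) for d in ["tracker.example", "ads.example"]]

# Server side: tally tokens across many users' uploads; a token reported
# independently by many users suggests a domain fingerprinting visitors.
tally = Counter(uploads)
print(tally.most_common(1))
```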
Feb 7 2013,
Would it be possible to autoclear a site's cookie(s) upon logout or leaving its domain? Specifically, I just realized eBay keeps tracking you after you log out, leading to:
1) Your recent search/view history being visible to all despite logging out, and
2) Subsequent searches/views being merged into your account despite *not* being logged in.
This tells me that *any* site can track your behavior after logout; most are just smarter/sneakier about it by not blatantly merging the info where you can see it. Note that this is in Incognito, so the eBay cookies don't even show up in Chrome's "all cookies and site data" manager for manual deletion.
Mar 10 2013,
Sep 19 2013,
Sep 25 2013,
Jan 23 2017,
Issue 682885 has been merged into this issue.
Jan 23 2017,
First, thanks for merging my original report in. I figured there'd been serious thought about this, and now I know where it is. A couple of thoughts on the above discussion:

1) Publicly announcing tracking resistance is almost certain to backfire unless we really, really nail it, which seems unlikely. Either we invite the rest of the world to pwn the scheme, or someone overly trusts the feature and gets burned, or both. Either would be a major embarrassment.

2) As I see it, we're trying to prevent an attacker from tying one browser visit to another without the user explicitly saying, or at least reasonably assuming, that those visits would be tied. If I pay my phone bill online, I don't mind if the phone provider knows that I'm the same person that paid that bill last time. I probably want them to know. If I visit nastyembarrassingstuff.com, I presumably don't want them to be able to tie that to my phone provider.

To that end, rather than reduce the amount of information that can be gleaned, it might be useful to reduce how correlatable that information is. I'm not sure this is feasible either, but at least it's a slightly different approach. I mentioned randomizing font/plugin order, but that's defeated by canonicalization. Maybe introduce aliases for fonts, plugins and such, but that smells of security through obscurity.

It would be good, at least, to understand what information is _essentially_ unique to a given session (at a minimum, who's using it), what's _accidentally_ unique (e.g., clock skew), and what's in between (not sure what a good designation would be). Fonts, plugins and such look to be in that area: they're not essential to who you are, but they're useful. Randomizing clock skew in some way would be harmless. Randomly adding and dropping fonts could be disruptive. Randomizing pixel-level spacing in a rendered string might be as well. The middle area is the sour spot: big attack surface, hard to disguise.
If it's provably big, as the discussion above suggests it is, then we can go home.
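The "randomized order is defeated by canonicalization" point from the comment above can be shown in a few lines. This is a minimal sketch with a hypothetical font list: a tracker simply sorts whatever order the browser reports before fingerprinting, cancelling any shuffle.

```python
import random

# Hypothetical reported font list; the browser shuffles it per session.
fonts = ["Arial", "Comic Sans MS", "Courier New", "Georgia", "Verdana"]

def fingerprint(reported):
    # The tracker canonicalizes (sorts) before hashing, so any
    # randomization of the reported order is discarded.
    return hash(tuple(sorted(reported)))

shuffled = fonts[:]
random.shuffle(shuffled)

print(fingerprint(shuffled) == fingerprint(fonts))  # True: same fingerprint
```

This is why the discussion moves on to reducing or aliasing the underlying values rather than permuting them.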
Jan 23 2017,
See https://dev.chromium.org/Home/chromium-security/security-faq#TOC-Why-isn-t-passive-browser-fingerprinting-including-passive-cookies-in-Chrome-s-threat-model- (which refers to this bug and links to an analysis at https://dev.chromium.org/Home/chromium-security/client-identification-mechanisms).