Issue 49075

Starred by 23 users

Issue metadata

Status: WontFix
Owner: ----
Closed: Sep 2013
Cc:
Components:
EstimatedDays: ----
NextAction: ----
OS: All
Pri: 2
Type: Feature

Harden chrome against browser fingerprinting

Project Member Reported by mnissler@chromium.org, Jul 14 2010

Issue description

In the recent WebKit bug https://bugs.webkit.org/show_bug.cgi?id=41801, a discussion has started about reducing the attack surface for browser fingerprinting. That bug has a list of browser parameters suitable for fingerprinting.

We have pondered this topic repeatedly with the MUC team and have concluded that reducing variance in these parameters to an extent that prevents fingerprinting is not possible without severely degrading the user experience. Some measures, such as sorting font and plugin lists, don't actually reduce entropy significantly. Furthermore, there are more sophisticated fingerprinting techniques for which we don't yet know a good prevention approach. One example is testing for the presence of fonts by rendering strings and measuring the resulting DOM elements.
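
For concreteness, here is a minimal sketch of that measurement technique (an illustration, not code from the report), written in TypeScript against the standard DOM API; the probe string and fallback font are arbitrary choices:

    // Probe whether a font is installed by comparing rendered widths.
    // If the candidate font resolves, the span's width differs from the
    // width rendered with only the generic fallback.
    function isFontAvailable(fontName: string): boolean {
      const probe = document.createElement("span");
      probe.textContent = "mmmmmmmmmmlli"; // glyph widths vary a lot across fonts
      probe.style.fontSize = "72px";
      probe.style.position = "absolute";
      probe.style.visibility = "hidden";

      probe.style.fontFamily = "monospace";
      document.body.appendChild(probe);
      const fallbackWidth = probe.getBoundingClientRect().width;

      probe.style.fontFamily = `"${fontName}", monospace`;
      const candidateWidth = probe.getBoundingClientRect().width;
      probe.remove();

      return candidateWidth !== fallbackWidth;
    }

No enumeration API is needed; a page can simply run this over a list of a few hundred known font names.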
 
Adding security@ to cc for comment.
Status: Assigned

Comment 3 by pam@chromium.org, Oct 20 2010

Status: Available
Releasing task while I'm on leave.

Comment 4 by mal@google.com, Jun 28 2011

Cc: security-bug-mail@chromium.org

Comment 5 by Deleted ...@, May 11 2012

I don't think we should just punt on this -- I (like a few hundred million other people) am increasingly worried about what is happening to my data -- and as the masses take more steps to reduce the easier ways to track, the trackers will turn increasingly to fingerprinting.

What if we followed an opt-in model, where by default information like plugin-lists and font lists was not sent in headers and not available to javascript code?  (We could make the presence or the absence of a few most-popular plugins and fonts available without making fingerprints too unique.)  Then, if a website told the user, "We can enhance your experience if you give us access to such and such," the user could make a decision to enable that data just for that website.
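
As a sketch of what that opt-in model could look like (entirely hypothetical; none of these names exist in Chrome, and the allowlist is invented):

    // Hypothetical: by default, expose only presence/absence within a small
    // standardized allowlist, so default-configured browsers look alike.
    const COMMON_FONTS = ["Arial", "Courier New", "Times New Roman"]; // illustrative

    interface EnumerableSurface {
      fonts: string[];
      plugins: string[];
    }

    function visibleSurface(
      userGrantedFullAccess: boolean, // set via a per-site permission prompt
      realFonts: string[],
      realPlugins: string[]
    ): EnumerableSurface {
      if (userGrantedFullAccess) {
        // The user opted in for this site only.
        return { fonts: realFonts, plugins: realPlugins };
      }
      return {
        fonts: realFonts.filter((f) => COMMON_FONTS.includes(f)),
        plugins: [], // nothing enumerable without a grant
      };
    }
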
Issue 142214 has been merged into this issue.
What about aggressively hardening in an Incognito window? Users might be more willing to tolerate occasional degradation of the user experience if it's done in a context where they've clearly indicated that privacy is more valuable to them.

Similarly, what about hardening different-origin iframes more than a window or tab that was explicitly navigated to? We should carefully examine use-cases, but again, the tradeoffs might be reasonable for users.

Finally, what if we turn the tables and track sites/domains that are tracking users? Clues might be:

* Checking fonts / plugins without using them
* Checking screen size without ever opening a pop-up window or resizing a window

We could do this securely by anonymously uploading hashes of domains that might be tracking, correlating that with other users' experience, then warning users of sites that appear to be tracking them.
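
A rough sketch of such a heuristic (illustration only; the thresholds, signal names, and reporting endpoint are all invented):

    // Hypothetical signals a browser could collect per site.
    interface SiteSignals {
      fontsProbed: number;        // fonts enumerated or width-probed
      fontsActuallyUsed: number;  // fonts the page went on to render with
      readScreenSize: boolean;
      openedOrResizedWindow: boolean;
    }

    function looksLikeFingerprinting(s: SiteSignals): boolean {
      const fontsWithoutUse = s.fontsProbed > 20 && s.fontsActuallyUsed === 0;
      const screenWithoutUse = s.readScreenSize && !s.openedOrResizedWindow;
      return fontsWithoutUse || screenWithoutUse;
    }

    // Upload only a hash of the suspect domain, so the server can correlate
    // reports across users without directly learning browsing history.
    // (A real design would need far more care, e.g. against dictionary attacks.)
    async function reportSuspect(domain: string): Promise<void> {
      const digest = await crypto.subtle.digest(
        "SHA-256",
        new TextEncoder().encode(domain)
      );
      const hex = Array.from(new Uint8Array(digest))
        .map((b) => b.toString(16).padStart(2, "0"))
        .join("");
      await fetch("https://example.invalid/report", { method: "POST", body: hex });
    }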

What, specifically, is the threat model? Who are the attackers, what do they gain, how do they do it, why do they fingerprint instead of using more straightforward means, and at what point can we say we have mitigated the risk enough?

I advise against overloading the meaning of Incognito.

I am pretty sure that it is not possible to get the number of bits of information that fingerprinters can glean from the browser significantly down below 33, without massively degrading the browser's functionality (and thus identifying the user as one of the very few who are willing to use such a feature-free browser).

Personally, I would mark this bug as WontFix. I would like for this bug to be fixable, but I simply don't think it is.

Consider that Zander and Murdoch were able to de-anonymize Tor hidden services by clock skew alone (http://www.cl.cam.ac.uk/~sjm217/papers/usenix08clockskew.pdf), and that JavaScript now has a directly-accessible high-resolution timer feature (http://www.w3.org/TR/hr-time/#sec-privacy-security). Consider also that Eckersley implemented Panopticlick in less than a week and is not a professional front-end engineer (i.e. there is probably far more low-hanging fruit available than he used, never mind higher-hanging fruit).
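
One concrete illustration of the timer concern (a sketch of timing-based probing in general, not Zander and Murdoch's clock-skew method):

    // performance.now() gives sub-millisecond timing, so a page can
    // benchmark a fixed workload; the result varies with the hardware and
    // thus contributes identifying bits. The workload here is arbitrary.
    function timeIt(op: () => void): number {
      const t0 = performance.now();
      op();
      return performance.now() - t0;
    }

    // Hardware-dependent timing of a fixed workload, in milliseconds.
    const sample = timeIt(() => {
      let x = 0;
      for (let i = 0; i < 1_000_000; i++) x += Math.sqrt(i);
    });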

Obviously, I am all for privacy and security, but only when the goals are precisely specified and achievable.

Comment 9 by mkte...@gmail.com, Feb 7 2013

Would it be possible to auto-clear a site's cookie(s) upon logout or leaving its domain? Specifically, I just realized eBay keeps tracking you after you log out, leading to:
1) Your recent search/view history being visible to all despite logging out, and
2) Subsequent searches/views being merged into your account despite *not* being logged in.

This tells me that *any* site can track your behavior after logout; most are just smarter/sneakier about it by not blatantly merging the info where you can see it.

Note that this is in Incognito, so the eBay cookies don't even show up in Chrome's "all cookies and site data" manager for manual deletion.

Comment 10 by bugdroid1@chromium.org (Project Member), Mar 10 2013

Labels: -Area-Internals -Feature-Privacy Cr-Internals Cr-Privacy

I agree that the threat model is really hard to quantify. That's because the worst offenders aren't publicly sharing their data, and that situation isn't likely to change. Moreover, the social importance of personal privacy is even harder to quantify, but that doesn't mean it isn't vitally important to the development of civilization as a whole. Academic, heavily data-driven sociology is just now catching up on this topic.

The worst case scenarios are future abuses on a grand scale. When Nixon wrote in his diary about using surreptitiously gathered information to secure the "reins of power", he was writing in the future tense. Demanding hard, proven, verifiable wrongdoing, and asking for quantified proof that some particular Chromium patch will definitively prevent that wrongdoing is setting the bar impossibly high. While it's understandable from within an institutional, systemic process, it's also a recipe for defeatism, paralysis, and maybe even social catastrophe. People (especially programmers) act on generalized, informed hunches all the time, and the world is mostly a better place for it.

I would like to hear more discussion of the "harden in incognito" idea. I don't necessarily agree that it's "overloading" the incognito concept to have different header responses. What are the core functionalities that would need to be axed to get below 33? Can you be specific? What about enabling just a core set of plugins and fonts in incognito, standardized across the board? What about suppressing plugin micro version strings (really only useful for debugging)? What about suppressing the high-resolution JavaScript timer in incognito? (We lived without it before.) Lastly, what about suppressing user agent version substrings across the board?

Also, anonymity would obviously increase as incognito usage increases. To put it utterly crassly, given the current environment, wouldn't it be great PR for Chrome to proudly announce that a revamped Incognito defeated the more obvious forms of fingerprinting? Yes, there will be tracking zealots who find ways to circumvent it--and those can be addressed in turn next time around…
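
As a sketch of the version-substring idea (illustrative only; the regex and example UA string are invented here, and this is not how Chrome actually builds its UA):

    // Keep the major version, zero out the build/patch components.
    function stripMicroVersions(ua: string): string {
      return ua.replace(/(Chrome\/\d+)(\.\d+){3}/, "$1.0.0.0");
    }

    stripMicroVersions(
      "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.17 (KHTML, like Gecko) " +
        "Chrome/24.0.1312.57 Safari/537.17"
    );
    // -> "... Chrome/24.0.0.0 ..."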

Any quantitative fingerprinting goal will be arbitrary on some level. That doesn't mean there shouldn't be a goalpost. It's not all-or-nothing. Please re-assess on this.
Status: WontFix
"Any quantitative fingerprinting goal will be arbitrary on some level."

No, the 33-bits figure comes directly from the population of the planet. All existing work shows that it is *very, very easy* to get at least 33 bits of information passively; I bet you could even do it without JavaScript, purely by passively monitoring network traffic. And if you did all the things necessary to reduce the power of fingerprinting, you'd have joined a very tiny population — putting you back at square one. (See e.g. http://freehaven.net/anonbib/cache/usability:weis2006.pdf)
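
To make the arithmetic explicit (an editorial addition): uniquely identifying one person among N people requires about log2(N) bits of identifying entropy.

    // World population, circa when this comment was written.
    const worldPopulation = 7.0e9;
    const bitsToIdentify = Math.log2(worldPopulation);
    console.log(bitsToIdentify.toFixed(1)); // ~32.7, i.e. "about 33 bits"

    // For scale: Eckersley's Panopticlick study measured a fingerprint
    // distribution carrying roughly 18 bits of entropy on its own, before
    // combining with IP address, cookies, or network-level signals.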

I wish passive browser fingerprinting were a solvable problem, but all the evidence suggests that it isn't. Again, Eckersley's Panopticlick was a bare-minimum effort by a non-JavaScript- and non-browser-specialist, just to demonstrate how easy it is.

Incognito is a "no local state stored permanently" mode, *not* a Do Not Track mode. (See http://www.chromium.org/Home/chromium-security/security-faq#TOC-What-are-the-security-and-privacy-guarantees-of-Incognito-mode-)

Incognito is not a Do Not Passively Track mode.

Making security mechanisms harder to understand and use (e.g. "Incognito has been upgraded! Now it is both a 'no local state stored permanently' mode, and it also makes a hard-to-quantify attempt to solve what is probably not a solvable problem! To achieve this, HTML5 audio and video are not available in this mode...") makes them weaker security mechanisms. (See Kerckhoffs' sixth principle: http://en.wikipedia.org/wiki/Kerckhoffs's_principle)

WontFix because CantFix. I'm sorry. :(
Issue 682885 has been merged into this issue.

Comment 14 by dmh@google.com, Jan 23 2017

First, thanks for merging my original report in.  I figured there'd been serious thought about this and now I know where it is.

A couple of thoughts on the above discussion:

1) Publicly announcing tracking resistance is almost certain to backfire unless we really, really nail it, which seems unlikely.  Either we invite the rest of the world to pwn the scheme, or someone overly trusts the feature and gets burned, or both.  Either would be a major embarrassment.
2) As I see it we're trying to prevent an attacker from tying one browser visit to another without the user explicitly saying, or at least reasonably assuming, that those visits would be tied.  If I pay my phone bill online, I don't mind if the phone provider knows that I'm the same person that paid that bill last time.  I probably want them to know.  If I visit nastyembarrasingstuff.com, I presumably don't want them to be able to tie that to my phone provider.

To that end, rather than reduce the amount of information that can be gleaned, it might be useful to reduce how correlatable that information is. I'm not sure this is feasible either, but at least it's a slightly different approach. I mentioned randomizing font/plugin order, but that's defeated by canonicalization. Maybe we could introduce aliases for fonts, plugins and such, but that smells of security through obscurity.
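
The canonicalization point in one line (an illustration): the tracker simply sorts before hashing, so list order carries no entropy.

    // Two shuffled font lists canonicalize to the same fingerprint input.
    function canonicalize(fonts: string[]): string {
      return [...fonts].sort().join(";");
    }

    canonicalize(["Verdana", "Arial", "Georgia"]); // "Arial;Georgia;Verdana"
    canonicalize(["Georgia", "Verdana", "Arial"]); // "Arial;Georgia;Verdana"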

It would be good, at least, to understand what information is _essentially_ unique to a given session (at a minimum, who's using it), what's _accidentally_ unique (e.g., clock skew), and what's in between (not sure what a good designation would be). Fonts, plugins and such look to be in that middle area: they're not essential to who you are, but they're useful. Randomizing clock skew in some way would be harmless. Randomly adding and dropping fonts could be disruptive. Randomizing pixel-level spacing in a rendered string might be as well.

The middle area is the sour spot: big attack surface, hard to disguise.  If it's provably big, as the discussion above suggests it is, then we can go home.
