Detecting the presence of extensions through timing attacks (including Incognito)
Reported by tomvango...@gmail.com, Apr 7 2017
Issue description
UserAgent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_9_5) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/56.0.2924.87 Safari/537.36
Steps to reproduce the problem:
The Promise returned by fetch('chrome-extension://[PACKAGE ID]/') takes longer to resolve when [PACKAGE ID] is installed vs when it's not installed.
What is the expected behavior?
What went wrong?
The web_accessible_resources property of the extension manifest determines which endpoints of the extension are allowed to be accessed from the web. I presume that when a request to a chrome-extension:// URI is initiated, Chrome checks the manifest and determines whether the request should be allowed. Obviously, this can only happen when the extension is installed. This creates a timing difference between requests to installed and requests to non-installed extension endpoints, and thus would allow an attacker to fingerprint a user's installed browser extensions (regardless of the manifest settings) -- which is exactly what manifest version 2 tried to prevent by introducing web_accessible_resources.
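The early-exit behaviour described above can be sketched as a toy model (illustrative Python, not Chrome's actual C++ code path; the function and manifest names are hypothetical):

```python
def is_resource_accessible(extension_id, path, installed_manifests):
    """Toy model of the vulnerable lookup: unknown extensions return
    immediately, while installed ones pay the extra cost of consulting
    the manifest -- that cost difference is the timing signal."""
    manifest = installed_manifests.get(extension_id)
    if manifest is None:
        return False  # early exit: measurably faster than the path below
    # Installed extension: scan web_accessible_resources for the path.
    return path in manifest.get("web_accessible_resources", [])
```

Both branches reject the load, but the attacker only needs the time-to-rejection, not the result.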
Probably the worst part about it is that this also works in Incognito mode, even when the installed extensions are not enabled in Incognito mode.
I've attached a PoC that tries to detect the top 100 extensions (note that this PoC is definitely not optimized, and thus may yield some false results). The PoC is also available here: https://poc.tom.vg/extension-fingerprint/
Did this work before? N/A
Chrome version: 56.0.2924.87 Channel: stable
OS Version: OS X 10.9.5
Flash Version:
Apr 7 2017
While unrelated from an implementation point of view, this issue is behaviorally very similar to Issue 707071, and the mitigations described in the document attached to that issue may be useful here.
Apr 7 2017
See also bug 139592 (Extension resources should only be loadable in contexts the extension has permission to access) which might be a potential solution.
Apr 7 2017
Bug 139592 proposes an interesting limitation, but it wouldn't resolve the timing problem on its own, because you still have to load the extension's manifest to determine which origins are permitted access. You'd still need a 707071-style delay to randomize the timing in the event that the extension isn't installed.
Apr 7 2017
I don't have access to Issue 707071, so it could be that this was suggested there already. A possible solution could be to avoid the exit-early scenario when the extension does not exist. There could for instance be a fallback manifest that is evaluated when the requested extension is not loaded (the outcome of this wouldn't actually be used, but the timing should be similar to requests to loaded extensions). Note that if the check of web_accessible_resources is not constant-time, there would still be a timing side-channel (the timing side-channel might be too small to exploit though).

When a random timing delay is only added when an extension is not installed, this will produce a certain pattern, which can again be detected because it only exists for non-installed extensions. A deterministic random delay, e.g. based on hash(extension_id || secret), could maybe address that issue.
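A minimal sketch of the hash(extension_id || secret) idea, assuming an HMAC as the keyed hash (the function name and millisecond scale are illustrative, not anything from Chrome):

```python
import hmac
import hashlib

def deterministic_delay_ms(extension_id: str, secret: bytes,
                           max_delay_ms: float = 1.0) -> float:
    """Derive a stable pseudo-random delay in [0, max_delay_ms) from the
    extension ID and a per-user secret. The same extension always maps to
    the same delay, so repeated sampling reveals no extra information."""
    digest = hmac.new(secret, extension_id.encode(), hashlib.sha256).digest()
    # Interpret the first 8 bytes of the digest as a fraction in [0, 1).
    fraction = int.from_bytes(digest[:8], "big") / 2**64
    return fraction * max_delay_ms
```

Because the delay is fixed per (extension, user) pair, averaging many measurements of the same request does not cancel it out the way it would cancel a fresh random draw.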
Apr 10 2017
+rdevlin for extensions team triage perspective.
Apr 11 2017
#5 I can't say too much about Issue 707071, but it relates to an API call that exposes information about the system by having a significantly shorter code path in one failure case than in another, in such a way that we can't just run the same code paths in all cases.

> A possible solution could be to avoid the exit-early scenario when the
> extension does not exist. There could for instance be a fallback manifest
> that is evaluated when the requested extension is not loaded (the outcome
> of this wouldn't actually be used, but the timing should be similar to
> requests to loaded extensions). Note that if the check of
> web_accessible_resources is not constant-time, there would still be a
> timing side-channel (the timing side-channel might be too small to exploit
> though).

This is very similar to the problem we encountered. As you say, there is still a timing side-channel, because the dummy manifest you are going to look at if the extension is not installed is different from the manifest you'll be looking at if the real extension is installed, and you have no way to simulate the actual size of the real one because you don't have it available. The attacker, on the other hand, knows the size of the real one.

> A deterministic random delay, e.g. based on hash(extension_id || secret)
> could maybe address that issue.

Yes, that is exactly the approach we went with. You need to hash both the extension ID (so that attackers can't aggregate across different extensions to figure out which are installed and which aren't) and a per-user secret (so that attackers can't predict the whole thing).

+palmer who has been working with me on Issue 707071.
Apr 11 2017
In case of a deterministic random delay, how would you choose the interval for the appropriate values? As users have a wide variety of hardware, the interval for the random delay may still produce significantly different timing results depending on the presence of an extension. Additionally, I could imagine that not every parsing execution takes exactly the same amount of time, so the introduced jitter is something that wouldn't occur (to the same extent) with non-installed extensions.

Additionally, if checking whether a resource is allowed per web_accessible_resources is not constant-time, an attacker could craft URIs that take longer or shorter to process. This behaviour is not exhibited by non-installed extensions, which always get the same delay, and hence the two can be distinguished. It could be that the timing differences will be too small to reliably exploit when this deterministic random delay is applied, but with the introduction of SharedArrayBuffers it becomes possible to create a timer with nanosecond resolution, so that might change things.

An alternative, timing-independent solution could be to use non-guessable chrome-extension identifiers (e.g. in Firefox these are unique per user).
Apr 11 2017
I repro'd this on M57/Linux. I'm calling this Severity-low since it's a moderate data leak. ->rdevlin for triage
Apr 12 2017
> In case of a deterministic random delay, how would you choose the interval for the appropriate
> values. As users have a wide variety of hardware, the interval for the random delay may still
> produce significantly different timing results depending on the presence of an extension.

Yes, the random offset needs to be great enough to mask out the noise on even slow hardware. In our case, it was a rarely used API. It might be much worse to do it on every request to an extension resource.

> Additionally, if checking whether a resource is allowed per web_accessible_resources is not
> constant-time, an attacker could craft URIs that should take longer/shorter to process.

That's a problem if you're hashing based on extension ID. I think the correct solution is to hash based on *exactly what the user supplied* -- in this case, the full URL. Then they can't run an experiment testing many different URLs to see whether there's variation or not; it would be bouncing around all over the place.

Why do we allow web pages to request extension resources? Is this ever valid? Can we just early-exit the pipeline whenever a chrome-extension:// resource is requested from a web page? Especially in incognito?
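The difference between the two keying choices can be illustrated with a hypothetical keyed-hash delay (a sketch only; `hash_delay` is an assumed helper, not a Chrome API, and the extension ID and URLs are made up):

```python
import hmac
import hashlib

def hash_delay(key: str, secret: bytes, max_ms: float = 1.0) -> float:
    """Keyed-hash delay in [0, max_ms), stable for a given key and secret."""
    digest = hmac.new(secret, key.encode(), hashlib.sha256).digest()
    return int.from_bytes(digest[:8], "big") / 2**64 * max_ms

secret = b"per-user-secret"
url_a = "chrome-extension://abcdefgh/img/a.png"
url_b = "chrome-extension://abcdefgh/img/b.png"

# Keyed on the extension ID: both URLs get the identical offset, so any
# URL-dependent variation in the underlying check still shows through.
id_offset_a = hash_delay("abcdefgh", secret)
id_offset_b = hash_delay("abcdefgh", secret)

# Keyed on the full URL: each URL gets its own offset, so probing
# different URLs makes the timing bounce around unpredictably.
url_offset_a = hash_delay(url_a, secret)
url_offset_b = hash_delay(url_b, secret)
```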
Apr 12 2017
> Then they can't run an experiment testing many different URLs to see whether there's variation
> or not; it would be bouncing around all over the place.

Do you mean that this bouncing around would only happen for non-existing extensions? In that case, the attacker could analyze the distribution of timings and determine that one is bouncing all over while the other is not. A solution to this could be to determine a deterministic random running time in advance (which should be > WCET), and then in all cases - regardless of the existence of the extension - wait until that running time is reached; that way, the execution time will bounce around in both instances.

> Why do we allow web pages to request extension resources? Is this ever valid? Can we just
> early-exit the pipeline whenever a chrome-extension:// resource is requested from a web page?
> Especially in incognito?

Some websites seem to rely on accessing certain chrome-extension:// resources (e.g. see https://bugs.chromium.org/p/chromium/issues/detail?id=139592#c32).
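The "wait until a predetermined running time is reached" idea can be sketched like this (illustrative Python; `floor_ms` stands in for a value chosen above the worst-case execution time):

```python
import time

def call_with_time_floor(handler, floor_ms):
    """Run handler() and then sleep until at least floor_ms of wall-clock
    time have elapsed in total, so fast and slow paths finish at
    (roughly) the same time."""
    start = time.perf_counter()
    result = handler()
    remaining = floor_ms / 1000 - (time.perf_counter() - start)
    if remaining > 0:
        time.sleep(remaining)
    return result
```

As the discussion below notes, any invocation whose real work exceeds floor_ms is trivially detectable, so the floor has to exceed the worst case even on slow hardware.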
Apr 12 2017
> Do you mean that this bouncing around would only happen for non-existing extensions?
> In that case, the attacker could analyze the distribution of timings and determine
> that one is bouncing all over while the other is not.

No, the random noise would be based on the URL, regardless of whether it's installed. So no matter what the installed state is, requesting different URLs causes the timing to bounce around unpredictably.

> regardless of the existence of the extension - wait until that running time is
> reached

If you are going to wait until the time is reached, you may as well just pick a (large) constant time instead of a random one. The reason I'm not in favour of this (and I'm not an expert, this is just the conclusion I came to after thinking about it for ~1 week) is that it means you have to pick a hard number and say "I bet that all requests will take less than X ms". Any requests that do take less than X ms will be masked; any requests that take even a bit more than X ms are trivially detectable. By ADDING a random offset instead, you still have to carefully pick your X (the maximum random interval), but the signal-to-noise ratio breaks down gradually (the longer the operation takes, the higher the signal-to-noise ratio), so there is still plausible deniability unless the operation takes significantly longer than X.
Apr 12 2017
A non-installed extension would have the following timing: a + [0,X]. An installed extension has the following timing: a + b + [0,X], where a = time to make the request (which is the same for both) and b = time to parse the manifest and match the requested resource.

I've attached a Python script that runs some rudimentary estimations of the attacker's accuracy in terms of the size of X. Example of output:

Attacker accuracy with X = b*1: 100.00%
Attacker accuracy with X = b*6: 100.00%
Attacker accuracy with X = b*11: 100.00%
Attacker accuracy with X = b*16: 99.60%
Attacker accuracy with X = b*21: 97.78%
Attacker accuracy with X = b*26: 94.96%
Attacker accuracy with X = b*31: 91.10%
Attacker accuracy with X = b*36: 88.56%
Attacker accuracy with X = b*41: 85.08%
Attacker accuracy with X = b*46: 82.28%
Attacker accuracy with X = b*51: 79.70%

With 500 observations by the attacker (the same amount as I used in my PoC), an accuracy of 90+% is achieved for X < 32*b. Of course with more measurements this accuracy grows (e.g. with 1000 measurements, 90+% accuracy is achieved for X < 46*b). Of course this estimation is under circumstances that are very beneficial to the attacker, as in real-world systems there would be "natural" noise introduced. But I think it does show that adding random padding requires a very high value of X (i.e. magnitudes of b) to be effective.

Since in this case b is quite small (on my machine, the whole fetch() operation takes 0.1ms on average), I think that a constant time of e.g. a + 10*b (~1ms -- or maybe even higher, to accommodate slower systems) provides a better guarantee than a random time between [0, 20*b] (which on average has the same overhead). Constant time removes the ability to gain any information; random time just reduces the accuracy, which can be overcome by getting more measurements.
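The attached script isn't shown in this thread, but the kind of estimate it performs can be reconstructed as a rough Monte-Carlo sketch (the classifier and parameters below are assumptions): the attacker takes n_obs samples and compares their mean against the midpoint between the two hypotheses.

```python
import random
import statistics

def attacker_accuracy(b=1.0, x_mult=21, n_obs=500, trials=400):
    """Monte-Carlo estimate of how often an attacker distinguishes
    a + [0,X] (not installed) from a + b + [0,X] (installed), with
    X = x_mult * b. The constant a cancels out of the comparison."""
    X = x_mult * b
    threshold = X / 2 + b / 2  # midpoint between the two hypothesis means
    correct = 0
    for _ in range(trials):
        installed = random.random() < 0.5
        shift = b if installed else 0.0
        mean = statistics.fmean(shift + random.uniform(0, X)
                                for _ in range(n_obs))
        correct += ((mean > threshold) == installed)
    return correct / trials
```

With small x_mult the two distributions are trivially separable; accuracy decays only slowly as X grows, matching the thread's point that the padding interval must be many multiples of b to help.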
Apr 12 2017
#14: Thanks for going into all of the maths. I think your script makes the assumption that it's random every time; therefore with 500 observations an attacker can average out the randomness to see what is really going on underneath. (That's *I think* essentially what your script simulates.) But you yourself suggested a deterministic hash of the extension; therefore the calls to random.random() in your script are replaced with an unknown but deterministic constant. The only difference the attacker will see between runs will be normal background noise; averaging those out won't help understand whether it's installed or not.

HOWEVER, I did realise a problem with the URL thing (and maybe that's what you're getting at). If you hash the extension ID alone (not the full URL requested), then as you said above there are subtle timing differences for different URLs within the extension. However, if you hash the full URL, then the attacker is able to request 500 different URLs from the same extension, all of which will have a different multiplier of X, but which reflect the same installed state for that extension. Therefore you certainly are able to average the runs out to see whether it's installed.

So I'm not sure what the correct answer is here; it's a question for the extensions team.
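The URL-averaging problem can be checked numerically (a sketch under assumed parameters; from the attacker's viewpoint each unknown per-URL offset behaves like an independent uniform draw from [0, X)):

```python
import random
import statistics

def mean_timing_across_urls(installed, b=0.1, X=2.0, n_urls=5000):
    """Average observed timings over many distinct URLs of one extension.
    The per-URL offsets average toward X/2, leaving the constant
    manifest-check cost b visible when the extension is installed."""
    shift = b if installed else 0.0
    return statistics.fmean(shift + random.uniform(0, X)
                            for _ in range(n_urls))
```

Even with X twenty times larger than b, the averaged mean for an installed extension sits measurably above the mean for an absent one, so per-URL padding alone leaks the installed state.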
Apr 12 2017
Yes, that's indeed what I was getting at. If the attacker can obtain multiple random padding values for a single extension, he can average out this padding (e.g. when the padding is based on the URL). I think the deterministic random padding based on the extension ID can work if checking whether an extension resource is accessible is done in constant time (or at least to an extent that the timing difference between different URLs for the same extension can not be distinguished from the background noise).
Apr 12 2017
FYI this research paper has a more thorough evaluation of the deterministic random padding (see section 4.3): http://cosade.cased.de/files/2011/cosade2011_talk12_paper.pdf
Jul 11 2017
The following revision refers to this bug:
https://chromium.googlesource.com/chromium/src.git/+/0e41fe8632d60a94200666e5eae0660bfdcd33e9

commit 0e41fe8632d60a94200666e5eae0660bfdcd33e9
Author: rdevlin.cronin <rdevlin.cronin@chromium.org>
Date: Tue Jul 11 02:13:15 2017

[Extensions] Change renderer-side web accessible resource determination

The web accessible resource determination in the renderer has an early-out for non-existent extensions. Unfortunately, this makes it possible to determine whether a user has a given extension installed through timing attacks - it takes longer to make a decision about whether to allow a resource to load when the extension is installed than when it isn't. This can be surprisingly accurate.

Rejigger the web accessible resource determination in a few ways. Most importantly, keep a set of loaded extensions that have any accessible resources, and check this (rather than the full extension set) for whether to potentially allow the load. This way, an extension with no accessible resources and an uninstalled extension should take the same amount of time to reach a decision, which is the desired outcome.

BUG=709464
BUG=611420
Review-Url: https://codereview.chromium.org/2958343002
Cr-Commit-Position: refs/heads/master@{#485494}

[modify] https://crrev.com/0e41fe8632d60a94200666e5eae0660bfdcd33e9/chrome/renderer/extensions/chrome_extensions_renderer_client.cc
[modify] https://crrev.com/0e41fe8632d60a94200666e5eae0660bfdcd33e9/chrome/renderer/extensions/chrome_extensions_renderer_client.h
[modify] https://crrev.com/0e41fe8632d60a94200666e5eae0660bfdcd33e9/chrome/renderer/extensions/resource_request_policy.cc
[modify] https://crrev.com/0e41fe8632d60a94200666e5eae0660bfdcd33e9/chrome/renderer/extensions/resource_request_policy.h
[modify] https://crrev.com/0e41fe8632d60a94200666e5eae0660bfdcd33e9/extensions/common/manifest_handlers/webview_info.cc
[modify] https://crrev.com/0e41fe8632d60a94200666e5eae0660bfdcd33e9/extensions/common/manifest_handlers/webview_info.h
[modify] https://crrev.com/0e41fe8632d60a94200666e5eae0660bfdcd33e9/extensions/renderer/dispatcher.cc
[modify] https://crrev.com/0e41fe8632d60a94200666e5eae0660bfdcd33e9/extensions/renderer/extensions_renderer_client.h
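The approach the commit message describes can be sketched in simplified form (illustrative Python, not the actual C++; the class and method names here are hypothetical):

```python
class ResourceRequestPolicy:
    """Track only the loaded extensions that expose any web-accessible
    resources, so an installed extension with none and an uninstalled
    extension take the same lookup path."""

    def __init__(self):
        self._with_accessible_resources = set()

    def on_extension_loaded(self, extension_id, web_accessible_resources):
        # Only extensions that actually expose resources enter the set.
        if web_accessible_resources:
            self._with_accessible_resources.add(extension_id)

    def on_extension_unloaded(self, extension_id):
        self._with_accessible_resources.discard(extension_id)

    def may_request(self, extension_id):
        # One set-membership test for every case: uninstalled extensions
        # and extensions with no accessible resources are
        # indistinguishable from this code path's point of view.
        return extension_id in self._with_accessible_resources
```

The key property is that the denying path does no extension-specific work, so its timing carries no information about whether the extension exists.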
Aug 4 2017
This particular bug should be fixed.
Aug 14 2017
Many thanks for the report, but I'm afraid the VRP panel declined to award for this bug.
Nov 11 2017
This bug has been closed for more than 14 weeks. Removing security view restrictions. For more details visit https://www.chromium.org/issue-tracking/autotriage - Your friendly Sheriffbot
Comment 1 by a...@chromium.org, Apr 7 2017. Labels: OS-Chrome OS-Linux OS-Windows