Starred by 130 users

Issue metadata

Status: WontFix
Owner: ----
Closed: Jan 2011
Cc:
Components:
EstimatedDays: ----
NextAction: ----
OS: ----
Pri: 2
Type: Bug

Restricted
  • Only users with Commit permission may comment.




xsl stylesheet wrongly blocked, all on local drive

Reported by akh...@gmail.com, Jan 19 2011

Issue description

Chrome Version       : 8.0.552.237
URLs (if applicable) : any, typically a local directory and file with xslt stylesheet in same directory
Other browsers tested:
Add OK or FAIL after other browsers where you have tested this issue:
Safari 5:
  Firefox 3.x: works fine
       IE 7/8: works fine

What steps will reproduce the problem?
1. Load an XML file with an XSL stylesheet, both from the same directory on a local drive.

2. Use a PI relative (or absolute) to the same directory, as in:
<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet href="stratml.xsl" type="text/xsl" ?>

3. You get a blank page. Then check the console to get the report below.
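For reference, any well-formed stylesheet in the same local directory triggers the block; a minimal hypothetical stand-in for stratml.xsl (the actual stylesheet is not attached here) would be:

```xml
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <!-- Render the document's text content inside a bare HTML page -->
  <xsl:template match="/">
    <html><body><xsl:value-of select="."/></body></html>
  </xsl:template>
</xsl:stylesheet>
```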

What is the expected result? Displayed HTML page.


What happens instead? Blank page; Chrome refuses to display it, as reported by the console below.


Please provide any additional information below. Attach a screenshot if possible.

The Chrome console reports as follows:

Unsafe attempt to load URL file:///C:/stratml.xsl from frame with URL file:///C:/UT.xml. Domains, protocols and ports must match.
 

Comment 1 by wtc@chromium.org, Jan 22 2011

Labels: -Area-Undefined Area-WebKit WebKit-Core

Comment 2 by abarth@chromium.org, Jan 22 2011

Status: WontFix
We've got security for file URLs locked down pretty tight.  There are some bugs on file about loosening up our restrictions if you'd like to star them.

Comment 3 by akh...@gmail.com, Jan 22 2011

I am not sure what you mean by "staring them". It seems to me that, as the console reports, if the page path and the resource path are identical, it is hard to justify a message saying that domains, protocols, and ports must match, as they obviously do. How much more matching could they be?

Why does the status say "WontFix"?

I will be happy to help if I can, but that status seems to lack courage and imagination ...

Regards,
ac

Comment 4 by abarth@chromium.org, Jan 22 2011

Staring a bug is a way of expressing that you're interested in having the bug fixed.  In this case, the message in the console is meant for normal (non-file) URLs.  We could change the message to be more accurate.

The status says WontFix because we've made a conscious decision to lock down the security of HTML files in the local file system, and this behavior is a natural consequence of that decision.  There's no lack of courage or imagination, just a prioritization of security over functionality in this case.

Comment 5 by akh...@gmail.com, Jan 22 2011

OK, fine, but could you tell me how a local page accessing a resource from the exact same path and directory poses a security issue, any differently than simply loading the file, running a program on my system, or executing the same file and script from the Internet?  If I put the XSL in the same directory as the page that references it, on my drive, why can I not view the file?  It certainly seems a lot safer than plenty of other things that I can do on the web.  I even wrote the XSL, but if I wrote it badly, I don't need the browser to protect me against myself.  And I can still run it on a server if I really want to hurt myself ...  It seems like misplaced security.  I am glad that you are security conscious, but overdoing it won't help.  For one, I use another browser.  Please clarify and explain your logic.

Thank you

Regards,
ac

Comment 6 by abarth@chromium.org, Jan 22 2011

It's a matter of perspective.  For example, what if the directory in question is your Downloads folder, into which you've downloaded your 2009 tax form in XML as well as an HTML email attachment?

In any case, please feel encouraged to use a browser that makes you happy.  One of the great things about there being a bunch of awesome browsers to choose from is that you can choose the one that makes trade-offs you like.  In this case, we're making a trade-off in favor of security to the detriment of some functionality.  As I said above, we're considering loosening these restrictions, but that's a topic for those other bugs.

Comment 7 by igi...@gmail.com, Apr 19 2011

Staring != Starring

Comment 8 by slavi...@gmail.com, Apr 19 2011

I do not think there are going to be any security issues if you let the browser access files on the local system. In your example above (with the tax form in XML in the same folder) there is no security issue, because for JavaScript to access that file it needs to know the filename ahead of time. How likely is that? I think this is completely overdone...

Comment 9 by gav.ai...@gmail.com, Apr 23 2011

I understand you're making a security trade-off, but it would be appreciated if we could add exceptions to the rules.

Specifically so that I can test GWT applications in Production/Web mode. If you're not going to provide this, then please ask the GWT documentation writers to add a big warning that Firefox MUST be used, not just that it's the default.

http://code.google.com/webtoolkit/doc/1.6/tutorial/compile.html#run

 
You can disable this security feature using the --allow-file-access-from-files option command line flag.
How do I start a new chromium session with --allow-file-access-from-files enabled?  Running "chromium-browser --allow-file-access-from-files" is not sufficient, as this merely opens a new window in my current session and ignores the flag.

I see this, and related issues, has been going for over a year now, and there's little or no effort to fix it.  The security policy is reasonable, but the methods to bypass it suck.  Having to open a terminal to open Chromium is a bother.  That I have to close my current Chromium session is completely unacceptable.  In other instances Chromium is a fantastic browser for development, so why is provision for this so terrible?

In short, why is there not a Developer mode, analogous to Incognito mode, with its own policies?

Comment 12 by krtul...@gmail.com, May 18 2011

@11, you can start a brand new Chrome/Chromium session with:
        "C:\path\to\chrome.exe" --user-data-dir=C:\new\directory --allow-file-access-from-files
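For the Linux and macOS cases asked about above, the same approach works; a sketch, assuming default install locations:

```sh
# Linux: a fresh --user-data-dir forces a brand-new browser session,
# so the flag is not ignored by an already-running instance
chromium-browser --user-data-dir="$HOME/chromium-dev" --allow-file-access-from-files

# macOS: -n launches a new instance, --args forwards the flags to Chrome
open -na "Google Chrome" --args \
    --user-data-dir="$HOME/chrome-dev" --allow-file-access-from-files
```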

Comment 13 by k...@google.com, May 19 2011

 Issue 39616  has been merged into this issue.

Comment 14 by Deleted ...@, Jun 10 2011

Regarding Comment 6:

Your team's decision has caused me major pain. It's forcing me to trash my frames-based navigation system because now, under Chrome, users who extract HTML collections on their local system can no longer auto-expand the topic location in the nav pane. Maybe my topic viewer app deserves to be trashed, but I'd rather make that decision and not have it forced on me. From my perspective your team has broken something that was working.

Telling people to use a different browser is not helpful since my users are making the browser decision. 

Comment 15 by Deleted ...@, Aug 18 2011

Is there a way to give permissions for a specific domain to run JS code in the parent? Or for an XML file? I once encountered an XML file (crossdomain.xml, or something like it) that grants permissions for cross-domain AJAX requests. Is there something similar for iframes?
I also appreciate the regard for security, but at the very least users should be able to white-list certain paths or be prompted with a little drop-down (like the one to save passwords) to allow/deny file access from files.

Comment 17 by zare...@gmail.com, Sep 15 2011

Why is Chrome the only major browser on the market that cannot use XSLT for XML files located in the same directory? I don't see a security issue here - even when the tax documents or emails are found and opened, the attacker cannot send any data anywhere, because it would have to leave the local file:/// protocol.

Tested on Win 7:
          IE 9: works fine
   Opera 11.51: works fine
 Firefox 5.0.1: works fine
 Firefox 6.0.2: works fine
    Safari 5.1: works fine
     Chrome 13: FAILS!
> the attacker cannot send any data anywhere, because it would have to leave the local file:/// protocol.

It's very easy to send the data outside the file protocol.  For example, you can embed an image element that makes a network request.
> For example, you can embed an image element that makes a network request.

Then make the default behavior to deny cross-protocol access (i.e. file:// -> http://), and *give the user the option* to allow it on a case-by-case basis--without having to start a new session from command line.  Note the emphasis on the "give the user the option" part.

Comment 20 by azar...@gmail.com, Sep 15 2011

> It's very easy to send the data outside the file protocol.  For example, you can embed an image element that makes a network request.

That would require leaving the file:// protocol.  If one were to completely disable all access to http:// from file://, both read and write, everything would be fine for most of the cases where this bug is an issue.  If the page sees the file system as read-only (which should usually be the case), then I don't see what harm it could possibly do.
Because I use Chrome as my primary browser for surfing and development, I managed to waste quite a bit of time with this deficiency. At a minimum, the browser should provide a clearer and more immediately obvious result when triggering this "feature."

However, I second comment 20. Please provide a rationale here for why that is not an acceptable change to the current implementation. In the rationale, please provide a use case that would yield an actual security problem.

Failing that, please humor me and explain why HTML can fetch style sheets, images, and external JavaScript and then evaluate it, all using the file protocol while an XML file can not read an XSL style sheet for security reasons. It may be obvious, but it sure isn't coming to me right now.
XML has different security properties than CSS or JavaScript.  In the web security model, CSS and JavaScript are publicly executable, but XML is private.  For example, you can run a JavaScript library from another web site, but you can't read their XML.  Unfortunately, XSLT has a security flaw whereby applying an XSLT to your document allows you to read the contents of the XML that makes up the XSLT.  That means a malicious document can apply an arbitrary XML file (e.g., a file saved by your word processor to your local file system) as an XSLT and steal information from it.

Given this situation, we're faced with a trade-off.  We can allow local XSLT, which is a feature that some fraction of the audience will enjoy, but doing so opens a security hole.  In this case, we've weighed the alternatives and decided that the security of the many outweighs supporting this feature for the few.

Reasonable people might disagree about whether that's the right trade-off to make.  That's why there's a diverse set of browsers you can choose from, each of which makes different decisions and trade-offs as to what they think is best for users.  We generally come down on the side of security because security is one of our core values, along with speed, simplicity, and stability:

http://dev.chromium.org/developers/core-principles

Supporting every feature in every environment isn't one of our core principles.  I'm sorry if that means Chrome isn't the best browser for you.

Comment 23 by azar...@gmail.com, Sep 25 2011

There should be no trade-off here - there was just an odd decision made at some point.

Allowing pages executing from the local hard drive to access the internet is a security hole.  Instead of closing down this security hole, it seems a decision was made to mitigate its severity by only allowing access to certain file types which were assumed to never contain private information.  The assumptions under which this makes any sense would only be reasonable to someone primarily developing web apps which loaded their content from web services (like most Google engineers, I suppose).

I'm also very confused by comment 22's assertion that web server file access permissions are usually different for XML files than they are for other kinds of files.  How are pages supposed to access xml data if they're not accessible?  Or is there a hidden [false] assumption that the xml is always being processed by server-side cgi?
> Allowing pages executing from the local hard drive to access the internet is a security hole.

I agree.  Unfortunately, forbidding local documents from accessing the Internet would be a more painful restriction than blocking XSLT.  To make that secure, for example, we'd need to prevent hyperlinks from local documents to the network because an attacker can encode data in those hyperlinks and that data is sent over the Internet.

> How are pages supposed to access xml data if they're not accessible?

On the web, a document can access XML that's in the same origin.  Unfortunately, the concept of "same-origin" breaks down for the local file system, which is the root of the problem here.  It's difficult to determine when two files in your local file system ought to be able to read each other.  Many heuristics, for example that the two files are in the same directory, break down in common scenarios (e.g., a Downloads directory that contains both untrusted and sensitive files).

I appreciate the feedback, but believe me that we've been around this block a number of times.  :)
Adam,

> Unfortunately, XSLT has a security flaw where applying an XSLT to your document allows you to read the contents of the XML that makes up the XSLT. 

Through document("")...

> That means a malicious document can apply an XSLT (e.g., an file saved by your word processor to your local file system) and steal information from it.

You lost me here. Can you elaborate?
> You lost me here. Can you elaborate?

A document can learn information about the XSLT that was applied to the document.  Assume there's a secret XML document in your local file system.  If a malicious document on your local filesystem can load the secret document as an XSLT, then it can learn about the secret document, potentially causing a security problem.
> A document can learn information about the XSLT that was applied to the document.

An XSLT can *produce* a document that combines information from both the source document and the XSLT.

> Assume there's a secret XML document in your local file system.  If a malicious document on your local filesystem can load the secret document as an XSLT, then it can learn about the secret document, potentially causing a security problem.

I don't see how it can do that using the XML stylesheet PI. If what you're referring to is script-driven XSLT, how is this different from XHR being able to read local files?


> How is this different from XHR being able to read local files?

We block that too.
So where's the risk with XSLTs invoked through the xml-stylesheet PI? Still trying to grasp that.
The risk is very simple:

1) XML documents are confidential.
2) Including an XSLT allows a document to extract information from the XSLT document.
3) The XSLT document consists of XML.
4) Therefore, including an XSLT allows a document to extract confidential information.
5) We don't wish to disclose confidential information from one local file to another.
6) Therefore, we must prevent a local document from including an XSLT from the local file system.
> The risk is very simple:

If it was simple it should be easy to explain :-)

> 1) XML documents are confidential.

OK

> 2) Including an XSLT allows a document to extract information from the XSLT document.

If the XML document is confidential then why does it contain an XML stylesheet PI referencing a potentially dangerous XSLT?

> 3) The XSLT document consists of XML.
> 4) Therefore, including an XSLT allows a document to extract confidential information.

You lost me again. Can you provide an example?


> If the XML document is confidential then why does it contain an XML stylesheet PI referencing a potentially dangerous XSLT?

The XSLT is the confidential entity, not the document with the PI.  The document with the PI is the malicious entity.
> The XSLT is the confidential entity, not the document with the PI.  The document with the PI is the malicious entity.

OK, so how do you get the malicious entity into the filesystem?

How likely is it that somebody stores a confidential XML document with extension ".xslt"?

And what malicious effect can you reach by "applying" that XSLT to the XML, besides having it display in the browser window?
> OK, so how do you get the malicious entity into the filesystem?

One common way is to send the user the malicious entity as an email attachment.  Anyway, I've answered your questions.
> One common way is to send the user the malicious entity as an email attachment.  Anyway, I've answered your questions.

No, you did not:

1) How likely is it that somebody stores a confidential XML document with extension ".xslt"?

2) And what malicious effect can you reach by "applying" that XSLT to the XML, besides having it display in the browser window?

If you can't explain what the problem is, this question will come up again and again.


> 1) How likely is it that somebody stores a confidential XML document with extension ".xslt"?

The file extension isn't really relevant.  It might be the case that we can trade off some security here for more features, but it's easier to reason about the world if we maintain the invariant that XML is treated as confidential.

> 2) And what malicious effect can you reach by "applying" that XSLT to the XML, besides having it display in the browser window?

Information is leaked from the XSLT to the document.  We can't prove a bound on what is leaked, so we are conservative and forbid it.

If you'd like to create a browser that makes different trade-offs that you think folks will like better, please go ahead.  The source is all available for you.  As I wrote above, we've studied this question carefully and made a trade off here according to our core beliefs.
O.K. I get the issue. I also realize that I can avoid the problem by using a development server rather than simply reading from the file system. However, I have a few suggestions before abandoning all hope and passing through the gate:
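The development-server workaround mentioned here is a one-liner with Python's standard library; serving the directory over http:// gives both files a real origin, so the same-origin check passes. A minimal sketch (modern Python 3; the 2012-era equivalent was `python -m SimpleHTTPServer`):

```python
# Serve the current directory over HTTP so the XML and XSL share an
# http:// origin instead of file:// (Chrome's restriction applies only
# to file:// URLs). Run from the directory containing the XML files.
import http.server
import socketserver
import threading
import urllib.request

# Port 0 asks the OS for a free port; use a fixed one (e.g. 8000) day to day.
httpd = socketserver.TCPServer(("", 0), http.server.SimpleHTTPRequestHandler)
port = httpd.server_address[1]
threading.Thread(target=httpd.serve_forever, daemon=True).start()
print("Serving on http://localhost:%d/" % port)

# Sanity check: the directory listing is reachable over HTTP.
status = urllib.request.urlopen("http://localhost:%d/" % port).getcode()
print(status)  # 200
httpd.shutdown()
```

In practice you would leave `serve_forever` running in the foreground and browse to `http://localhost:8000/test.xml`; the transform then applies normally.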

1. I would like to recommend a development mode that lifts this restriction. Perhaps it could be attached to having the "Developer Tools" installed and open, but with an explicit button or other interactive element to enable it.

2. Even if you make no other change, provide an explicit pop-up or some other unmistakable indication that any file system based XSL file referenced through an XML processing instruction is explicitly not read or processed for security reasons. It would also be helpful if the warning states that there is no means to disable the security block if you still will not provide one. Currently, the behavior is not obvious and a fair amount of time can be wasted by the user while trying to figure out what is wrong and whether or not there is a work-around.

3. Stop harping on how there are other alternatives. We all know that and use them. The problem is not that the other browsers do not have this restriction. The problem is that we need to verify that XML and XSL processing for a given set of files works correctly in the Chrome browser. The complaints on this issue are a result of discovering a somewhat surprising reason that Chrome was failing to work as expected and having done so after going through a frustrating experience to first find the error report and then using a search engine to learn what the actual problem is. There is a defect here even if it is nothing more than poor reporting of the reasons XSL processing fails when using the "file" protocol.
There isn't a message in the developer console about the problem?  You can also enable the feature by passing the --allow-file-access-from-files command line argument.
Yes, a message appears in the developer console:

Unsafe attempt to load URL file://.../test.xsl from frame with URL file://.../test.xml. Domains, protocols and ports must match.

However, it takes some web searching for a number of us to figure out what this means. First, the "unsafe attempt" is unclear though you have covered it fairly well here. Then there is the bit about the frame. Not really all that important or confusing, but still. I am loading an XML document into a blank web page. "Window" or "page" are a little clearer in this case than "frame." But the really unclear part is the second sentence because "domains, protocols, and ports" do match. With the second sentence completely off-topic and the initial "unsafe attempt" referring to a non-obvious security default unique to Chrome, this is not a useful error message. And, of course, it is the only indication of what went wrong. The main window of the browser is completely silent about ignoring the XSL file and renders a completely blank page for a test.xml file containing:

<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="test.xsl"?>
<html>
  <head>
    <title>Test</title>
  </head>
  <body>
    <div>Original.</div>
  </body>
</html>
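(The matching test.xsl is not shown in the comment; a plausible minimal one, an identity copy that rewrites the div so a successful transform is visible, might look like this:)

```xml
<?xml version="1.0" encoding="UTF-8"?>
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <!-- Make a successful transform obvious -->
  <xsl:template match="div">
    <div>Transformed.</div>
  </xsl:template>
  <!-- Identity: copy everything else through unchanged -->
  <xsl:template match="@*|node()">
    <xsl:copy><xsl:apply-templates select="@*|node()"/></xsl:copy>
  </xsl:template>
</xsl:stylesheet>
```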

Now, no problem doing "View Page Source" to see that the file was read and to then open the "Developer Tools" to see the message, but both of those lead to the web search and the episodic conversations on this issue page once it is found.

Look, I now know what the problem is and how to deal with it, but the current implementation for reporting the problem is not going to help the next developer who comes along. So, at a minimum, try to provide a better warning in the Developer Tools console. I think it would also be handy if there was an indicator for HTML, JavaScript, and XML errors on the main browser window frame that would indicate that a problem with a page was detected whether the Developer Tools are open or not. Of course, this should be a developer setting that is off by default. Also, I am not asking for any additional error checking, just an indicator of some fault that would be reported in the console if it were open or visible.

The command line flag is a useful thing to know and I may end up using it some day. However, I do most of my development on a Mac. To pass in the command line argument either requires passing it in with the "open" command to launch Chrome or I need to invoke Chrome by using a path to some executable in the "Google Chrome.app" directory structure or there may be a property list file or some other file I can edit to pass the flag. I am sure it isn't that difficult and that I can figure it out when I need it--I have been programming computers running a wide variety of operating systems and other software for over 35 years now. However, it is a bit of a pain to set and firing up a local web server to serve my files for development will probably be easier than shutting down the web browser that I have open almost constantly and relaunching it with whatever I find on the Internet about how to pass this flag to Chrome. So, a long winded way to say that the function enabled by the command line flag is useful and appreciated, but that adding a checkbox for it somewhere in Developer Tools or Preferences would be more useful. Not strictly necessary, but it would be appreciated nonetheless.

By the way, is there some way to select and copy entries from the console? I was not able to do so and ended up manually entering the error report above.
> The file extension isn't really relevant.  It might be the case that we can trade-off some security here for more features, but it's easier to reason about the world if we maintain the invariant that XML is treated as confidential.

OK, so you don't know :-).

> Information is leaked from the XSLT to the document.  We can't prove a bound on what is leaked, so we are conservative and forbid it.

What does "leaking to the document" mean here? Example please.

> If you'd like to create a browser that makes different trade-offs that you think folks will like better, please go ahead.  The source is all available for you.  As I wrote above, we've studied this question carefully and made a trade off here according to our core beliefs.

We all know that people who need this feature can simply use a different browser.

The interesting question is whether there's really a security risk.

The answer here is so simple: GIVE USERS THE OPTION.  Blanket policy decisions like this are rarely the best way to approach a solution.  I understand how your "core beliefs" guided you to make the trade-offs you did, but you can present no philosophical argument to justify taking away a user's ability to choose--at least on a case-by-case basis--whether or not they want to make the same trade-offs. (If your reasons are monetary, technical, or otherwise time/cost/effort prohibitive, then just say so.)  If a user wants to disable the setting for a specific set of content that they *know* to be safe (e.g. online help stored locally for offline access), then that should be their choice.  Even if it's not safe, it's a decision they've made.  Show a big scary warning/disclaimer if you want, but make it an option.  Am I missing something obvious here?

Comment 42 by ad...@zrusit.eu, Dec 7 2011

People, this bug affects thousands and thousands of people using phpBB mods. Kindly see http://www.phpbb.com/bugs/modteamtools/62887 - a blank page and a cryptic message hidden in the developer console (invisible by default) really does not cut it.

Nor do I get the "simple" security risk that the developers totally failed to explain here. As already said, "If it was simple it should be easy to explain :-)"

Comment 43 by nexta...@gmail.com, Dec 14 2011

Why is this a problem only in Chrome? IE9 and FF do not have any issue.

Comment 44 Deleted

Comment 45 by yincr...@gmail.com, Feb 10 2012

The developer accurately explained it. 

Two files: your confidential XML file and the malicious XML file.

The malicious XML file includes the confidential XML file as an XML stylesheet (XML stylesheets are just XML files). The malicious file then has access to the data in the confidential XML file. It can access the data via XSL embedded in the malicious document and then get the browser to make a network request, such as requesting an image, to send the confidential data out.
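In concrete terms (hypothetical filenames; this is only the shape of the scenario, and comment 46's request for a working demonstration still stands):

```xml
<!-- malicious.xml: arrives e.g. as an e-mail attachment and is opened
     locally. It names the victim's confidential XML file as its
     "stylesheet". -->
<?xml-stylesheet type="text/xsl" href="tax-return-2009.xml"?>
<attack/>
```

If the engine applied it, whatever the transform produced (or the errors it raised) would reflect the confidential file's contents, and the rendered result could leak them outward via something as simple as an img request.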


Please could a proponent of the current security policy please provide a concrete test case demonstrating the security risk here?  I.e., provide a set of example XML/XSL files that I could test on, say, Firefox, and see for myself that an untrusted file can manipulate things it's not meant to.

Comment 47 by Deleted ...@, Feb 24 2012

Really, this is just developer(s) with a stubborn unwillingness to serve users and who respond with an arrogant, almost aggressive disdain (e.g., suggesting that if you don't like it, write your own browser).  This issue is a significant problem for many users and developers that could be addressed with a user option to override any security risks. (Is there even one documented security problem related to this issue in browsers that allow what Chrome doesn't?)  If aba worked for my company, he'd be fired or removed from customer contact if he suggested that the solutions are to use or create a competitive product.  What a twit.

Comment 48 by tga...@gmail.com, Feb 24 2012

With all due respect, I agree with Comment # 47.
Take a look at Issues 47416 and 93865. They seem to cover the root problem.
This "security" feature reminds me of IE Enhanced Security Configuration [1] - total bullcrap unfit even for browsing local documentation, something that every single person out there turns off immediately after install on those affected server configurations. At least you can turn it off there - unlike in Chrome. :(

[1] http://technet.microsoft.com/en-us/library/dd883248(v=ws.10).aspx

Comment 51 by Deleted ...@, Feb 28 2012

It is quite clear that Google will do nothing to correct this issue. They are apparently keen on making a (moot) point about the fact that they are more security conscious than the people at Safari and Mozilla...

Our products are thus unusable with Chrome (local documentation in XML with browser-based XSLT transformations: It works with every other browser...). We'll make sure our users are aware of this and follow the hint suggested by a few maintainers here: use another browser. 

For my part I've had it with Chrome.

I can't believe the status is 'WontFix'. If it is a security risk, the user can be prompted to allow/deny the process. This is a common thing, risky perhaps, but already done for a more important matter: SSL certificates. Self-signed certificates make a warning page appear, explicitly asking users whether to trust them. And that is a far, far more important matter.
Users should definitely be left free to choose in this case as well.
A long, long time ago I also found this issue, and I am still following this nice conversation. The strangest thing is: if I load stylesheeted XML from a web server, it works fine. If the stylesheet is located on a local computer, I get an empty page. :) And these guys are talking to us about security risks! Somebody, please, explain the difference to me. Why do they allow one thing and disable the other?

Comment 54 by r...@cornell.edu, Jun 21 2012

Same issue. When something as innocuous as 
document.getElementById("menu").contentDocument.body.offsetHeight
throws a security error when running from disk, it's evidence of a poor programming decision.
As far as I can see, the noted security issue comes from the assumption that an arbitrary XML file can be treated as an XSL stylesheet with a literal result element as root (http://www.w3.org/TR/xslt#result-element-stylesheet). Well, this assumption, if that's the case, is not valid. The literal result root element must possess an xsl:version attribute to be considered a valid XSL stylesheet. So the browser must fail to load the page if it tries to use arbitrary XML as XSL.

Chrome works in that way. FF and IE too.

I, for one, don't see any other reason to not subject XSL stylesheets to the same policy as CSS and JS files.

Comment 56 by Deleted ...@, Aug 9 2012

I can't believe this issue got closed with WontFix
It's ironic that this security feature prevents a local, client-side-only, GWT-based HTML report from displaying data from XML on the local filesystem in Google's own browser. We need to tell our clients to use other browsers, as the only known workaround - starting Chrome with parameters like --user-data-dir=D:\location\to\report --allow-file-access-from-files for every report folder - is impractical.

The Google engineers might have strong reasons for their decisions, but maybe they can at least allow a workaround: a special meta-tag that can be included in the HTML to tell Chrome to allow local file access, white-listed relative paths to directories or files, etc.
Dear Google Engineers,
IMHO it is a strange solution to use meta-tags for this purpose.
It means that some, let's say, "dangerous" document could itself refer to any destination folder it wants.
The command-line parameter (--user-data-dir=) looks better.
Is it already implemented in the current version?
It is probably also a good idea to add a white list to the configuration parameters.
And the most convenient way would be if Chrome asked me whether I allow this locally stored XSLT, and automatically added it (or its folder) to the white list.

Comment 59 by fhp.m...@gmail.com, Aug 15 2012

This has been going on for this long? Damn, so my previous job making me use Firefox was a good thing, because I never noticed this BIG gap.
I understand the security concern, but I think the implementation is way too rigid the way it is.

Why not allow the user to override the default behavior, e.g. with a simple dialog, if the XSLT is in the same directory tree as the XML file?
@rinzedelaat: Unfortunately, users do not make good security decisions, especially in cases like this that are subtle.  Typically, users are most concerned with accomplishing a specific task and will click through whatever security dialogs get in their way.  For this reason, we try to limit the number of security dialogs we present to the user to the absolute minimum.

Comment 62 by fhp.m...@gmail.com, Sep 11 2012

If the user is _so_ adamant on "accomplishing a specific task" that will "click through whatever security dialogs get in their way" - they're gonna do it somehow, right?
I've never even heard of sniffing through XSL, so you could at least allow it for computers not even permanently connected to the internet (if even).

Comment 63 by azar...@gmail.com, Sep 11 2012

The dirty secret here is that this doesn't actually address the real security concern of pages accessing data on the hard drive and sending it to a remote server, except for specific types of data (eg xml).  To address the true security vulnerability, rather than blocking local files from accessing specific types of other local files, they'd be blocked from accessing the internet.  That, however, would be inconvenient to a different set of developers.
Project Member

Comment 64 by bugdroid1@chromium.org, Oct 14 2012

Labels: Restrict-AddIssueComment-Commit
This issue has been closed for some time. No one will pay attention to new comments.
If you are seeing this bug or have new data, please click New Issue to start a new bug.
Project Member

Comment 65 by bugdroid1@chromium.org, Mar 11 2013

Labels: -Area-WebKit -WebKit-Core Cr-Content Cr-Content-Core
Project Member

Comment 66 by bugdroid1@chromium.org, Apr 6 2013

Labels: -Cr-Content Cr-Blink

Comment 67 by kojii@chromium.org, Aug 17 2015

 Issue 111905  has been merged into this issue.

Comment 68 by kojii@chromium.org, Nov 25 2015

 Issue 560334  has been merged into this issue.
