Starred by 3 users

Issue metadata

Status: WontFix
Last visit > 30 days ago
Closed: Feb 2012
EstimatedDays: ----
NextAction: ----
OS: Windows
Pri: 2
Type: Bug


Issue 110649: Browser not caching files if HTTPS is used even if it's allowed by webserver via response headers

Reported by, Jan 18 2012

Issue description

Chrome Version        : 16.0.912.75
OS Version            : 6.1 (Windows 7, Windows Server 2008 R2)
Other browsers tested :
     Safari 5: Not tested
     Firefox 4.x: OK
     IE 7/8/9: OK

What steps will reproduce the problem?
1. Create a simple HTML page - in my case with JavaScript and a Silverlight application (.xap file)
2. Expose it on the webserver with configured HTTPS
   - use all possible (and a few impossible) caching headers:
       Cache-Control: max-age=31536000, public, must-revalidate, proxy-revalidate
       Pragma: public
       Expires: <current datetime + 1 year>
       ETag: 123456789
       Last-Modified: <current datetime>
3. Access the page from Chrome via HTTPS
4. Refresh browser
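The response headers listed in step 2 can be built by any server-side stack; a minimal sketch in Python (the helper name and the use of `email.utils.format_datetime` for HTTP-date formatting are illustrative, not from the report):

```python
from datetime import datetime, timedelta, timezone
from email.utils import format_datetime

def build_cache_headers(now=None):
    """Build the caching response headers described in step 2 of the repro."""
    now = now or datetime.now(timezone.utc)
    return {
        "Cache-Control": "max-age=31536000, public, must-revalidate, proxy-revalidate",
        "Pragma": "public",
        # <current datetime + 1 year>, formatted as an HTTP date
        "Expires": format_datetime(now + timedelta(days=365)),
        "ETag": '"123456789"',
        # <current datetime>, formatted as an HTTP date
        "Last-Modified": format_datetime(now),
    }
```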

What is the expected result?
  After a refresh of the webpage, "If-None-Match" or "If-Modified-Since" headers should be present in the GET requests. Alternatively, max-age/Expires could cause hard caching, but that is not necessary and it is not the point of my report.
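The expected revalidation flow can be modeled with a toy server-side check (a sketch of the standard conditional-request logic, not anything from Chrome's code):

```python
def revalidate(request_headers, etag, last_modified):
    """Return 304 if the client's cached validators still match, else 200.

    On a refresh, the browser is expected to send If-None-Match (the stored
    ETag) and/or If-Modified-Since (the stored Last-Modified date); the
    server can then answer 304 Not Modified instead of resending the body.
    """
    if request_headers.get("If-None-Match") == etag:
        return 304
    if request_headers.get("If-Modified-Since") == last_modified:
        return 304
    return 200
```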

What happens instead?
  No request 'caching' headers are sent in the GET requests, and all files*), especially the big .xap file, are downloaded from the server again with each refresh.

*) except .js, which is sometimes(!) cached

Comment 1 by, Jan 21 2012

Labels: -Area-Undefined Area-Internals Internals-Network
Status: Untriaged

Comment 2 by, Jan 21 2012

There is no info on how to enable caching in Chrome over HTTPS. I used all the mentioned headers without success. It's great that some heuristic is used for the caching decision, but it's obviously not working.

A Silverlight application (.xap) or other static resources (e.g. images) can be quite large, and re-downloading all resources after each page refresh makes the page unusable.

Comment 3 by, Jan 23 2012

Labels: -Internals-Network Internals-Network-Cache
Status: Assigned

Comment 4 by, Jan 24 2012

Do you mind following and posting the result here?

In particular, there should be a first request that caches a resource that was not previously cached, followed by the simple refresh.

Comment 5 by, Jan 25 2012

There is an initial load of the page and two refreshes - only for the .js file are If-Modified-Since & If-None-Match headers sent in the second and third request; the other files are downloaded each time.
(Attachment, 235 KB)

Comment 6 by, Jan 27 2012

In fact nothing is being cached (in the persistent cache)... are you using a self-signed certificate and going through the interstitial? Any kind of SSL error will prevent us from caching anything from that site.

Comment 7 by, Feb 7 2012

Great, that's it! So the problem is much less serious now.
But it is still strange behaviour - e.g. for small businesses, internal use of self-signed certificates is common practice, not to mention the trouble it causes in development.

+ It's different from other browsers
+ It's unpredictable - I had no idea that the (in)validity of a certificate could affect cache behaviour

Comment 8 by, Feb 7 2012

Status: WontFix
The rule is actually quite simple: any error with the certificate means the page will not be cached.
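The stated rule can be summarized as a toy predicate (a model of the behaviour described in this comment, not Chrome's actual network-stack code; the function and parameter names are made up for illustration):

```python
def is_cacheable(response_headers, had_cert_error):
    """Model of the rule from comment 8: any certificate error disables
    caching entirely, regardless of what the response headers say."""
    if had_cert_error:
        return False
    # With a valid certificate, normal header-based caching rules apply.
    cache_control = response_headers.get("Cache-Control", "")
    return "no-store" not in cache_control
```

This matches the reporter's observation: the same headers that work over plain HTTP (or over HTTPS with a trusted certificate) have no effect once the browser has clicked through a certificate interstitial.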

See comment 8 on bug 103875 for more context.

Comment 9 by, Oct 13 2012

Project Member
Labels: Restrict-AddIssueComment-Commit
This issue has been closed for some time. No one will pay attention to new comments.
If you are seeing this bug or have new data, please click New Issue to start a new bug.

Comment 10 by, Mar 10 2013

Project Member
Labels: -Area-Internals -Internals-Network-Cache Cr-Internals-Network-Cache Cr-Internals
