CacheStorage.keys() always fails with "operation too large"
Reported by con...@superhuman.com, Feb 3 2017
Issue description

UserAgent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_2) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/56.0.2924.87 Safari/537.36

Steps to reproduce the problem:
1. Fill your cache storage with a large number of entries (I noticed this in production with ~28k)
2. Run cache.keys()
3. It will throw an "operation too large" error.

You can test this here (click the button and wait 30 seconds while it adds 30,000 entries to the cache and then calls cache.keys()): https://rawgit.com/ConradIrwin/d1f30f90cb5e9f77856e8600bac07583/raw/d1221654fa8af9f33faea6eb6aaca5e9a9d62412/storage.html

What is the expected behavior? It should return an array of the keys in CacheStorage.

What went wrong? There's no way to iterate over the keys in large caches.

Did this work before? N/A
Does this work in other browsers? N/A

Chrome version: 56.0.2924.87  Channel: stable
OS Version: OS X 10.12.2
Flash Version: Shockwave Flash 24.0 r0

I'm not sure whether this is a bug in Chrome or a bug in the spec, as there seems to be no provision for pagination in the .keys() API, and I can imagine the need to put some limit on the number of keys returned, as they're fairly heavyweight JavaScript objects. Filed as https://github.com/w3c/ServiceWorker/issues/1066 for discussion.
Feb 3 2017
This isolates the read time from the write time:

function fill(cache) {
  let list = [];
  for (let i = 0; i < 30000; ++i) {
    list.push(cache.put('https://example.com/probably-crash-' + i, new Response('ok')));
  }
  return Promise.all(list);
}

function read(cache) {
  let start = performance.now();
  cache.keys().then(keys => {
    let end = performance.now();
    console.log(end - start);
  });
}

caches.open('foo').then(cache => fill(cache).then(_ => read(cache)));
Feb 3 2017
Thanks for pointing that out! In this case the primary problem is the crash (since this is a very rare event and an async operation, the timing is less relevant). I've published an updated test here: https://rawgit.com/ConradIrwin/a634359060cbbde0e90a30e5744aa1a1/raw/94c524dcc451f7104b5fdf8999a56cc70db86ff2/storage.html It still reliably crashes with "Operation too large".
Feb 4 2017
"crashes" - actual browser or renderer crash, or just an exception?
Feb 4 2017
Just an exception, sorry for the confusion. It correctly returns

query_cache_context->callback.Run(CACHE_STORAGE_ERROR_QUERY_TOO_LARGE,
                                  std::unique_ptr<QueryCacheResults>());

which is surfaced as a JavaScript exception, "Operation too large".
Feb 4 2017
That error is an unfortunate hack to protect the browser from using too much memory. Until we rewrite CacheStorage on the browser side to stream responses back to the renderer, we simply limit the number of responses we process on the browser side so that it doesn't get too big. :[
Feb 4 2017
Note that I'm referring to the size of the response headers being an issue, we *do* stream the response body bytes.
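Until a browser-side streaming rewrite lands, one possible client-side mitigation (a rough sketch, not something proposed in this thread; the shard count, the `shard-N` naming scheme, and the hash function below are all illustrative assumptions) is to shard entries across several smaller caches so that no single cache.keys() call has to materialize tens of thousands of Request objects at once:

```javascript
// Workaround sketch (hypothetical): shard entries across several Cache
// objects so no single cache.keys() call exceeds the internal limit.
// SHARD_COUNT and the 'shard-N' naming are illustrative assumptions.
const SHARD_COUNT = 16;

// Deterministic string hash -> shard name, so the same URL always lands
// in the same cache.
function shardFor(url) {
  let h = 0;
  for (let i = 0; i < url.length; i++) {
    h = (h * 31 + url.charCodeAt(i)) >>> 0; // keep as unsigned 32-bit
  }
  return 'shard-' + (h % SHARD_COUNT);
}

// Browser-only usage (page or service worker):
// async function putSharded(url, response) {
//   const cache = await caches.open(shardFor(url));
//   await cache.put(url, response);
// }
//
// async function allKeys() {
//   const names = Array.from({length: SHARD_COUNT}, (_, i) => 'shard-' + i);
//   const shards = await Promise.all(names.map(n => caches.open(n)));
//   const lists = await Promise.all(shards.map(c => c.keys()));
//   return [].concat(...lists); // flatten into one array of Requests
// }
```

With ~30k entries and 16 shards, each cache holds roughly 2k entries, which stays well below the point where the error fires, at the cost of opening several caches to enumerate everything.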
Feb 4 2017
Thanks! What's the target issue? It's unfortunately hidden :(
Feb 6 2017
Same thing as I described in comment 6, basically.
Comment 1 by bke...@mozilla.com, Feb 3 2017