
Issue 757191

Starred by 4 users

Issue metadata

Status: Duplicate
Merged: issue 655585
Owner: ----
Closed: Aug 2017
Cc:
Components:
EstimatedDays: ----
NextAction: ----
OS: Windows
Pri: 2
Type: Bug




Weirdness in parallel downloads of JS files in HTTP/2

Reported by ba...@tunetheweb.com, Aug 19 2017

Issue description

UserAgent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/60.0.3112.101 Safari/537.36

Example URL:
https://www.tunetheweb.com/experiments/test_async.html

Steps to reproduce the problem:
1. Open Developer Tools
2. Visit https://www.tunetheweb.com/experiments/test_async.html, which downloads 25 copies of an empty JavaScript file with a query param to fool the browser into thinking they are 25 different resources (a sketch of the page's markup follows these steps).
3. Note, from the Waterfall, that the JS is downloaded in groups of 6 despite being over HTTP/2 (see also screenshot from webpagetest here: https://i.stack.imgur.com/MOLkK.png)
4. Visit https://www.tunetheweb.com/experiments/test_js.html which is the same file but without the async attribute.
5. Note that the JS is downloaded together (https://i.stack.imgur.com/Qr6J9.png).
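For reference, the markup of test_async.html boils down to something like this (a sketch; empty.js stands in for the real file name). test_js.html is identical minus the async attribute:

    <!-- The same empty script requested 25 times; the query param makes
         each URL look like a distinct resource -->
    <script async src="empty.js?v=1"></script>
    <script async src="empty.js?v=2"></script>
    <!-- ...and so on up to... -->
    <script async src="empty.js?v=25"></script>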

What is the expected behavior?
Multiplexing in HTTP/2 should NOT be limited to 6 parallel downloads, regardless of whether the async attribute is used. In fact I would have thought async should actually download more in parallel!

What went wrong?
Only 6 JS files are downloaded at once when the async or defer attribute is used (incorrect behaviour).

It doesn't appear to be limited for images, nor for plain JavaScript scripts (without async or defer), so why is it for async and/or defer?

Neither Firefox nor Edge appears to have this restriction, but Opera (which is also Chromium-based) does.

Did this work before? N/A 

Chrome version: 60.0.3112.101  Channel: stable
OS Version: 10.0
Flash Version: 

Cannot reproduce in Canary (62.0.3190.0); however, when using Canary on WebPageTest (62.0.3190.1) it DOES reproduce (https://www.webpagetest.org/result/170819_9S_5ed1f532487f6622cd3b63a184dece15/1/details/#waterfall_view_step1).
 

Comment 1 by da...@wix.com, Aug 21 2017

This behavior can cause significant performance degradation on Chrome, as it undermines the benefit of using HTTP/2 with JavaScript modules. Please address this ASAP.

Comment 2 by mmenke@chromium.org, Aug 21 2017

Cc: jkarlin@chromium.org
I believe this behavior was added in https://chromiumcodereview.appspot.com/2435743002 and https://codereview.chromium.org/2739433002 - the throttling should only affect resources that don't block the page load, I think.
Comment 3 by jkarlin@chromium.org, Aug 21 2017

What you're seeing is working as intended for the time being. It turns out that simply starting a request is resource intensive (getting cookies, checking the cache, DNS lookups, etc.). If you allow hundreds or thousands of requests to start in parallel, you wind up delaying high-priority requests (script and CSS) behind low-priority ones. It leads to bad loading performance. By throttling H2+QUIC resources we've seen considerable improvements in our page loading metrics.

I agree that it's not ideal. What we really want is to address the priority inversion issue. But until that's done this is what we've got.
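As a rough sketch of what the throttle does (simplified and illustrative; this is not the actual network-stack code, and the names are made up):

    // Illustrative only: "delayable" (non-blocking, low-priority) requests
    // are capped at 6 in flight; blocking resources start immediately.
    const MAX_DELAYABLE_IN_FLIGHT = 6;

    let inFlight = 0;
    const queue = [];  // delayable requests waiting for a free slot

    function startRequest(fetchFn, delayable) {
      if (!delayable) return fetchFn();  // blocking resources skip the cap
      if (inFlight < MAX_DELAYABLE_IN_FLIGHT) return run(fetchFn);
      // Over the cap: park the request until a slot frees up.
      return new Promise(resolve => queue.push(() => resolve(run(fetchFn))));
    }

    function run(fetchFn) {
      inFlight++;
      return fetchFn().finally(() => {
        inFlight--;
        const next = queue.shift();
        if (next) next();  // start the next queued delayable request
      });
    }

With the async test page above, all 25 fetches would go through startRequest(..., true), which is why the waterfall shows them completing in groups of 6.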
Comment 4 by ba...@tunetheweb.com, Aug 21 2017

So by NOT setting async or defer, the script is treated as blocking and hence doesn't have this limit. However, by setting async or defer you are intentionally marking it as non-blocking/low priority, and hence Chrome applies this limit.
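In markup terms (illustrative; app.js is just a placeholder):

    <script src="app.js"></script>        <!-- parser-blocking: not throttled -->
    <script defer src="app.js"></script>  <!-- non-blocking: subject to the cap of 6 -->
    <script async src="app.js"></script>  <!-- non-blocking: subject to the cap of 6 -->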

Is my understanding correct?

While I (somewhat) understand the reasoning based on your comment, it does seem counter-intuitive: websites that do the right thing and make sure their scripts don't block the page load ultimately get full functionality later than "poorly written" websites that do block.

Until Chrome addresses the underlying need for this, is 6 the right limit? While I agree hundreds or thousands of requests would cause issues, limiting to the old HTTP/1.1 limit of 6, when HTTP/2 is (in theory at least) much less resource intensive, seems wrong. I think if this limit was increased somewhat (10? 20?) it would likely affect fewer sites and be less of an issue. I have no idea how to define what the limit should be, and maybe you've already tested this and decided 6 is the best limit for now?

Or perhaps it could be based on the current items in the queue. For example, in my simple test there should be nothing in the queue, so Chrome could afford to be more aggressive. On more complex sites, which already have a huge number of items in the queue, maybe you throttle back more. Not sure whether that would be too complicated to implement; I'm just throwing ideas out here (a sketch of what I mean follows).
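Something like this, purely to illustrate the idea (the thresholds are invented, not tested):

    // Scale the cap by how congested the queue already is.
    function delayableLimit(queueDepth) {
      if (queueDepth === 0) return 20;  // nothing waiting: be more aggressive
      if (queueDepth < 50) return 10;
      return 6;                         // busy page: keep the current limit
    }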

I would imagine HTTP/2 sites would like to stop having to bundle JS files, but this bug severely limits their ability to do that - especially when it's not obvious that Chrome has this artificial limit.

Comment 5 by da...@wix.com, Aug 21 2017

While I understand the reasoning, I find the decision very problematic. It's one thing to try to best support bad design; it's quite another to reward it and penalize good design. We've worked hard to avoid all render-blocking resource downloads, and now we are penalized for it on Chrome.

Based on my understanding, the best performance on Chrome is now achieved by avoiding HTTP/2 in favor of HTTP/1.1: in both cases downloads are limited to 6 at a time, but at least with HTTP/1.1 we also get 6 sockets.
Comment 6 by jkarlin@chromium.org, Aug 21 2017

Mergedinto: 655585
Status: Duplicate (was: Unconfirmed)

I hear your frustrations. Please see issue 655585 for the context on the decision. Again, it's not where we intend to stay; it's just where we are now. The network stack is gradually being updated with a task scheduler that allows for prioritization. Once that transition completes, we can experiment with prioritized tasks and removing the throttle.

Closing as a duplicate; any updates will be posted to issue 655585.
