[zlib] Investigate telemetry performance page set tests for content-encoding: gzip
Issue description:

Investigate what page sets for top sites are already in Telemetry [1]. Also, from https://bugs.chromium.org/p/chromium/issues/detail?id=823118#c8:

"Maybe contact Telemetry team to add a metric to be monitored by our loading benchmarks, which runs Chrome against real-world websites. Telemetry benchmarks are more noisy (compared to net perf tests, issue 825056), but will allow you to exercise top sites that use zlib."

Inspecting www.google.com in the network tab of the devtools, for example, I see a br (brotli) compressed main page, about 6-8 PNG images, some PNGs as data URLs, and 3-8 gzipped JS files (depending on the country-specific landing site) [2].

[1] https://chromium.googlesource.com/catapult.git/+/master/telemetry/README.md
[2] http://googlecode.blogspot.com/2009/11/use-compression-to-make-web-faster.html
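For context on why gzip responses exercise zlib: a gzip-encoded body is a zlib deflate stream inside a gzip wrapper, which zlib itself handles when asked for a window-bits value of 16 + 15. A self-contained sketch (Python's `zlib` module standing in for the C library; the payload is made up):

```python
import zlib

def gzip_roundtrip(payload: bytes) -> bytes:
    """Compress with a gzip container (wbits=16+15) and decompress it,
    mimicking what a browser's net stack does for Content-Encoding: gzip."""
    comp = zlib.compressobj(level=6, wbits=16 + 15)  # 16+15 selects the gzip wrapper
    body = comp.compress(payload) + comp.flush()
    assert body[:2] == b"\x1f\x8b"  # gzip magic bytes
    return zlib.decompress(body, wbits=16 + 15)

data = b"example response body " * 100
assert gzip_roundtrip(data) == data
```

The same inflate code path is what a Telemetry run against gzip-serving top sites would be stressing.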
Mar 23 2018
+nednguyen@: Could you advise on how we can go about doing this?

Telemetry benchmarks already exercise "content-encoding: gzip" for sites that use gzip: WebPageReplay replays web pages back to Chrome with the same content encoding, and gzip is a supported encoding type. We have yet to add Brotli support (https://github.com/catapult-project/catapult/issues/3742). I think the only thing needed here is to add a tracing probe that records the zlib metric you want and tell Telemetry to monitor it.
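To illustrate the kind of measurement such a probe could surface (the real probe would be a Chromium trace event feeding a Telemetry metric, not this Python sketch; the names here are hypothetical): time spent in zlib inflate per gzip-encoded response.

```python
import time
import zlib

def timed_inflate(gzip_body: bytes) -> tuple[bytes, float]:
    """Decompress a gzip-wrapped body (wbits=16+15) and report the wall
    time spent, a stand-in for a per-response zlib-time metric."""
    start = time.perf_counter()
    out = zlib.decompress(gzip_body, wbits=16 + 15)
    return out, time.perf_counter() - start

# Build a fake gzip response body to inflate.
comp = zlib.compressobj(wbits=16 + 15)
body = comp.compress(b"fake page body " * 200) + comp.flush()
page, elapsed = timed_inflate(body)
```

Aggregated across the resources of a page-set run, a metric like this would show how much zlib work the top sites actually generate.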
Comment 1 by noel@chromium.org, Mar 23 2018