Issue metadata
4.1%-5.1% regression in blink_perf.image_decoder at 535668:535738
Issue description: See the link to graphs below.
Feb 12 2018
📍 Pinpoint job started. https://pinpoint-dot-chromeperf.appspot.com/job/12f5a85d840000
Feb 13 2018
📍 Found significant differences after each of 3 commits. https://pinpoint-dot-chromeperf.appspot.com/job/12f5a85d840000

- Added base::TimeToISO8601 and deleted all other instances, by bratell@opera.com
  https://chromium.googlesource.com/chromium/src/+/9a3a753efa64db4beecf5007e7253e0d909b9202
- Implement SignedExchangeCertFetcher, by horo@chromium.org
  https://chromium.googlesource.com/chromium/src/+/96b8dc5e8455fdb62300184d27ac649a12f335b4
- Roll clang 321529:324578, by hans@chromium.org
  https://chromium.googlesource.com/chromium/src/+/ca5066b992b8a05e6a3df042f2c0e1766e13a565

Understanding performance regressions: http://g.co/ChromePerformanceRegressions
Feb 13 2018
+cblume: Looking at the bisect in #3: https://pinpoint-dot-chromeperf.appspot.com/job/12f5a85d840000 Is it possible that this benchmark is too sensitive to changes in Chrome? It does look like there's a real regression at the clang roll, but there do seem to be changes in performance at the other CLs too. Also see bug 811413 where the benchmark has an up-and-down pattern over time.
Feb 13 2018
The benchmark is both precise and imprecise. To know which is which, you unfortunately have to know a bit about our testing rig. The tests are webpages with JavaScript running in them, which means they are affected by many things; a change in V8 could show up as a bump in the graph. HOWEVER, the tests can also flag other logged attributes to keep track of, and THOSE are the precise attributes.

So essentially, you don't want to look at decode-lossless-webp / decode-lossless-webp.html. You want to look at ImageFrameGenerator::decode / decode-lossless-webp.html. In order to have the precise measurements, we also have to include an imprecise graph and just tell people to ignore it.

When you look at those two Nexus 5X graphs next to each other, you'll see that ImageFrameGenerator::decode (the precise one) has a much narrower shaded area between the min and max: https://chromeperf.appspot.com/report?sid=95b01448318f9a22f106432149bd0948564ecf246f45e55caec754c508391576 It seems to be ±5 ms on 380 ms measurements, or about 2% wiggle room. A regression of 4% on the good graph would be a legitimate concern, but on the bad graph it is meaningless. The report I linked also includes an x64 bot which typically sticks around 122-124 ms; it seems stable.

As for the Nexus 5X bouncing around and that other bug... I'm not sure what is causing it to bounce. But since the shaded area looks correct on the good graph, I don't think it is a problem with the test itself. Maybe the device was hot on those runs or something.
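The reasoning above, that a regression only matters when it clearly exceeds the metric's normal run-to-run spread, can be sketched as a small helper. This is a hypothetical illustration (`is_actionable` is not part of the Chromeperf codebase), using the numbers quoted in the comment.

```python
# Hypothetical helper illustrating the noise-vs-regression reasoning above.
# A regression is only actionable when it exceeds the metric's normal
# min/max spread (its "wiggle room") around the baseline.

def is_actionable(baseline_ms: float, noise_ms: float, regression_pct: float) -> bool:
    """Return True if a regression of `regression_pct` percent exceeds the
    metric's relative noise band (spread / baseline, as a percentage)."""
    noise_pct = noise_ms / baseline_ms * 100.0
    return regression_pct > noise_pct

# Numbers from the comment: a ±5 ms spread on ~380 ms measurements is
# roughly 1-2% noise, so a 4% regression on the precise graph stands out.
print(is_actionable(380.0, 5.0, 4.0))   # → True
# On a noisier graph where the spread alone were, say, ±20 ms (~5%),
# the same 4% change would be indistinguishable from noise.
print(is_actionable(380.0, 20.0, 4.0))  # → False
```

This is why the comment recommends reading the ImageFrameGenerator::decode graph rather than the page-level one: the same 4% delta is meaningful on the low-noise metric and meaningless on the high-noise one.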
Feb 14 2018
Jun 26 2018
The graphs seem to have recovered.
Comment 1 by 42576172...@developer.gserviceaccount.com, Feb 12 2018