Bisect/Perf Dashboard should be better at handling noisy metrics
Issue description:

We have some noisy metrics that are very hard to bisect on. This can lead to confusion and wasted developer time. Example: crbug.com/647635

Some possible solutions:

A) Have a noisiness grade for metrics. For example, if a metric has a regression that is smaller than or similar to the variance between normal runs, we are much less likely to get a useful bisect from it. We could notice that and
   1) not bisect on it, or
   2) show a warning message on the performance dashboard when starting a bisect, and/or
   3) have the bisect output explicitly mention that the metric being bisected on is considered noisy, so its results may be inconsistent or incorrect.
   (A sketch of this heuristic is below.)

B) Be better at determining what to bisect on when alerts are triaged.
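For illustration only, here is a minimal Python sketch of the noisiness-grade idea from option A. The helper names (noise_grade, should_warn_before_bisect) and the 2.0 threshold are made-up assumptions for this sketch, not existing dashboard code: it compares the size of the suspected regression against the standard deviation of recent reference runs and suggests a warning when the change does not clearly stand out from normal run-to-run variance.

    import statistics

    def noise_grade(reference_values, regression_delta):
        # Ratio of the regression size to the run-to-run noise of the metric.
        # reference_values: metric values from recent runs at a known-good revision.
        # regression_delta: change in the metric reported by the alert.
        noise = statistics.stdev(reference_values)
        if noise == 0:
            return float('inf')  # perfectly stable metric; any change stands out
        return abs(regression_delta) / noise

    def should_warn_before_bisect(reference_values, regression_delta, threshold=2.0):
        # Option A, item 2: warn on the dashboard if the regression is not
        # clearly larger than the normal variance between runs.
        return noise_grade(reference_values, regression_delta) < threshold

    # Example: a 5-unit regression on a metric whose recent runs scatter by a
    # few units gets flagged, because the change is comparable to the noise.
    recent_runs = [101.0, 97.5, 104.2, 99.8, 95.1]
    print(should_warn_before_bisect(recent_runs, regression_delta=5.0))  # True

A real implementation would need to pick the reference window and the threshold empirically; the values here are placeholders.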
Comment 1 by dtu@chromium.org, Sep 27 2016
Agreed. Did we use Dave's script on the bisects in bug 647635? Eyeballing the averages the bisect spit out, two of them seem to have pretty sharp divides between "good" and "bad". Not sure what could be causing that.
Comment 2, Jan 10
Archiving issues older than 2 years with no owner or component.