
Issue 888673


Issue metadata

Status: Assigned
Owner:
Cc:
Components:
EstimatedDays: ----
NextAction: ----
OS: ----
Pri: 2
Type: Bug




Some Fuchsia perf test results don't show up in perf dashboard

Project Member Reported by mseaborn@chromium.org, Sep 24

Issue description

Steps to reproduce:

 * Go to https://chromeperf.appspot.com/report
 * Select fuchsia.zircon, garnet-x64-perf-swift_canyon, minfs, PathWalk, 1000-Components, Mkdir
    * Result: Graph shows up, as expected -- this case works OK.
 * Select fuchsia.zircon, garnet-x64-perf-swift_canyon, minfs, Bigfile, 16Kbytes, 1024-Ops, 1-Cycle
    * Result: The dashboard just shows a spinner in place of the graph.  The expected result is that a graph appears.
 * Try to select fuchsia.zircon, garnet-x64-perf-swift_canyon, blobfs
    * Result: "blobfs" does not appear in the drop-down.  The expected result is that it does appear.

Is there perhaps something about these test case names that causes the dashboard not to work in these cases?  Are the test names too long (i.e. do they have too many "/" components)?

For context, these are the Catapult Histogram JSON files that were uploaded for those tests, for a recent build:

https://logs.chromium.org/logs/fuchsia/buildbucket/cr-buildbucket.appspot.com/8934748580300366864/+/steps/fs_bench.catapult_json/0/logs/stdio/0
https://logs.chromium.org/logs/fuchsia/buildbucket/cr-buildbucket.appspot.com/8934748580300366864/+/steps/blobfs_bench.catapult_json/0/logs/stdio/0

They are from this build:
https://ci.chromium.org/p/fuchsia/builders/luci.fuchsia.ci/garnet-x64-perf-swift_canyon/b8934748580300366864

(Here is the corresponding issue in the Fuchsia issue tracker that tracks this problem: https://fuchsia.atlassian.net/browse/IN-638)

 
Cc: simonhatch@chromium.org
Owner: mseaborn@chromium.org
Summary: Some Fuchsia perf test results don't show up in perf dashboard (was: [chromeperf] Catapult: Some Fuchsia perf test results don't show up in perf dashboard)
Thanks for filing!

It looks like you're setting the names of the histograms to contain slashes like this:
minfs/Bigfile/16Kbytes/1024-Ops/1-Cycle/Write
blobfs/128bytes/10Blobs/Api.generate_blob

The chromeperf dashboard expects histogram names to not contain slashes.
It supports test paths with up to 5 components like master/bot/suite/histogram/story,
where the final 'story' component is given by the 'stories' diagnostic.
If you want to group Histogram names or stories, please use colons like this:
{"name": "minfs:Write", "diagnostics": {"stories":{"type":"GenericSet","values":["Bigfile:16Kbytes:1024-Ops:1-Cycle"]}}}
The new UI splits test path components at colons and groups them.
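To make the suggested renaming concrete, here is a minimal Python sketch of how an uploader could rewrite a slash-separated test name into a dashboard-safe histogram name plus a "stories" diagnostic (the helper names are hypothetical, not part of Catapult; the splitting convention assumed is: first component plus last component form the histogram name, everything in between becomes the story):

```python
def rewrite_histogram_name(name):
    """Split a slash-separated test name into a histogram name
    without slashes and a colon-joined story value.

    e.g. 'minfs/Bigfile/16Kbytes/1024-Ops/1-Cycle/Write'
      -> ('minfs:Write', 'Bigfile:16Kbytes:1024-Ops:1-Cycle')
    """
    parts = name.split('/')
    # First component is the grouping prefix, last is the metric name;
    # the middle components become the story.
    histogram_name = parts[0] + ':' + parts[-1]
    story = ':'.join(parts[1:-1])
    return histogram_name, story


def to_histogram_json(name):
    """Build the histogram dict fragment in the form shown above."""
    hist_name, story = rewrite_histogram_name(name)
    return {
        'name': hist_name,
        'diagnostics': {
            'stories': {'type': 'GenericSet', 'values': [story]},
        },
    }
```

For example, to_histogram_json('minfs/Bigfile/16Kbytes/1024-Ops/1-Cycle/Write') produces the {"name": "minfs:Write", ...} fragment shown above.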

+Simon, should add_histograms fail if any histogram names contain slashes?

Definitely, we fixed this in /add_point a while ago and thought we had fixed it here too. I'll make /add_histograms error out on these.
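For illustration, the check being proposed could look roughly like this (a sketch only; the function and exception names are assumptions, not Catapult's actual code):

```python
class BadRequestError(Exception):
    """Raised when an uploaded histogram would be rejected."""


def validate_histogram_name(name):
    # Reject names containing '/', since the dashboard reserves
    # slashes for separating test path components
    # (master/bot/suite/histogram/story).
    if '/' in name:
        raise BadRequestError(
            'Histogram name "%s" must not contain "/"' % name)
```

Under this sketch, validate_histogram_name('minfs:Write') passes silently, while a name like 'minfs/Bigfile/Write' raises BadRequestError.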
Before you change Catapult to reject slashes in these names, can you give us some time to change Fuchsia not to use slashes, please?  I don't want our Catapult perf results uploads to suddenly start failing.

Are the naming restrictions documented somewhere?  It would be nice if the disallowing of slashes were documented in https://github.com/catapult-project/catapult/blob/master/docs/histogram-set-json-format.md.

Are there any other characters that are disallowed?

The key "stories" is mentioned in the example JSON in histogram-set-json-format.md, but not in the text.  Is there any more context on what it is intended to mean?  Does it mean the same thing as "Subtest" in the current UI in https://chromeperf.appspot.com?

I am not really sure how we should decide what to put in the "histogram" name field and what to put in the "story" field.
Cc: eakuefner@chromium.org
+eakuefner

Sure, I'll hold off on a change until you make the Fuchsia-side changes.

No, this was overlooked when those docs were written since it's a dashboard restriction, not an issue with the HistogramSet format itself. We'll update those docs to add these restrictions.

Ben, do you know of any other character restrictions?


re: stories

Thanks for pointing that out, we'll update the docs. Subtest in the current UI has many meanings; it might be better to check out https://v2spa-dot-chromeperf.appspot.com, which is the redesign Ben is working on.

Stories are a telemetry concept, they're meant to give context on the scenario in which a metric was computed.

An example might be that you have a benchmark, foo_benchmark, that computes some metrics.

So you have multiple histograms generated by foo_benchmark for various metrics: foos_per_second, foo_bar_peak_usage, etc.

Then each of those metrics can have a different "story". So you could have two histograms both named foos_per_second, one with the story "loading_google_com" and another with "loading_youtube_com".
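In HistogramSet JSON, that example might look like the following sketch (the unit and story values are illustrative; only the name/diagnostics shape follows the format described above):

```python
# Two histograms with the same name, distinguished only by their
# 'stories' diagnostic.
histograms = [
    {
        'name': 'foos_per_second',
        'unit': 'count',
        'diagnostics': {
            'stories': {'type': 'GenericSet',
                        'values': ['loading_google_com']},
        },
    },
    {
        'name': 'foos_per_second',
        'unit': 'count',
        'diagnostics': {
            'stories': {'type': 'GenericSet',
                        'values': ['loading_youtube_com']},
        },
    },
]
```

The dashboard can then show both under the foos_per_second metric, split by story.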

mseaborn: Any update or further questions here?
Cc: -eakuefner@chromium.org
Status: Assigned (was: Untriaged)
This issue has an owner, a component and a priority, but is still listed as untriaged or unconfirmed. By definition, this bug is triaged. Changing status to "assigned". Please reach out to me if you disagree with how I've done this.
