Add a way to disable all driver bug related test failure expectations in GPU test runner
Reported by oetu...@nvidia.com, Mar 10 2017
Issue description

Chrome Version: TOT
OS: All

What steps will reproduce the problem?
(1) content/test/gpu/run_gpu_integration_test.py

There should be some way to disable the driver-bug-related expectations in the above test runner. That way the runner could be used together with test filters maintained outside of Chromium to test future driver releases that contain more fixes than Chromium is aware of.

The most interesting test suite in this regard is WebGL conformance. Its test expectations live in src\content\test\gpu\gpu_tests\webgl_conformance_expectations.py. Currently there is no simple distinction between expectations that exist because of browser bugs or fundamental platform limitations and expectations that exist because of driver bugs. The way the expectations are loaded is also fairly complex and hidden beneath several layers below the top-level runner script, so it is not easy to customize the runner to disregard some of the expectations.
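For reference, entries in webgl_conformance_expectations.py look roughly like the sketch below; the test paths and bug numbers are made up for illustration.

# Illustrative only: these paths and bug numbers are not real entries.
class WebGLConformanceExpectations(GpuTestExpectations):
  def SetExpectations(self):
    # Fails on every configuration, suggesting a browser bug or a
    # fundamental platform limitation.
    self.Fail('conformance/example/browser-bug.html', bug=100001)
    # Fails only on one vendor's hardware, suggesting a driver bug.
    self.Fail('conformance/example/driver-bug.html', ['win', 'nvidia'],
              bug=100002)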
Comment 1 by kbr@chromium.org, Mar 14 2017
Components: Internals>GPU>Testing

Mar 14 2017
I was aware of the possibility of disabling driver bug workarounds in Chromium, and this is something we will put into use. What this bug is about is disabling driver-specific failure expectations. Roughly speaking, the expectations can be split into two categories:

A) Related to browser bugs or fundamental platform limitations.
B) Related to bugs in specific drivers or hardware.

We'd like to keep expectations from category A but disable all expectations from category B. It wouldn't be a good solution for us to maintain a separate expectations file containing just category A, because every time we updated the browser and runner from upstream we'd have to spend time re-evaluating which expectations to keep and which to remove. It would be much better for us if this split existed in upstream Chromium.

This wouldn't necessarily need any complex infrastructure behind it. Even if the expectations were just split into two functions in the expectations files, it would be trivial for us to disable category B whenever we pick up an update from upstream. And I'd imagine that when people working on Chromium add new expectations, it would be quite easy to put them into the right category. Do you think something like this would be a good solution?
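To make the request concrete, here is a minimal sketch of such a split. The SetBrowserExpectations and SetDriverExpectations method names are hypothetical; nothing like this exists in the tree today.

# Hypothetical split of SetExpectations into the two categories above.
class WebGLConformanceExpectations(GpuTestExpectations):
  def SetExpectations(self):
    self.SetBrowserExpectations()
    self.SetDriverExpectations()

  def SetBrowserExpectations(self):
    # Category A: browser bugs and fundamental platform limitations.
    self.Fail('conformance/example/browser-bug.html', bug=100001)

  def SetDriverExpectations(self):
    # Category B: bugs in specific drivers or hardware. A driver vendor
    # could locally skip this one call when testing new driver builds.
    self.Fail('conformance/example/driver-bug.html', ['win', 'nvidia'],
              bug=100002)

With this shape, disabling category B after pulling a new upstream revision would be a one-line local change.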
Mar 14 2017
I see the reasoning for this feature request and I agree it's quite useful for driver vendors. Usually I would think that if an entry is specific to one vendor, it's most likely a driver issue rather than a Chrome issue; of course, there are exceptions to that.
Mar 15 2017
Olli, for your use case, would Mo's idea work: basically, disabling all test expectations that are specific to a given GPU vendor? This would save a lot of the effort of re-evaluating and annotating all of the test expectations.
Mar 15 2017
That might be good enough. I can imagine some counterexamples, such as extension availability tests, which could legitimately be vendor-specific, but mostly it should work. Should there be a new flag for the test runner that disables GPU-vendor-specific expectations?
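For illustration, an invocation with such a flag might look like the following; the flag name is hypothetical.

content/test/gpu/run_gpu_integration_test.py webgl_conformance \
    --browser=release --disable-gpu-vendor-expectations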
Mar 16 2017
Yes, that would be the best way to implement it. It should be added as

  @classmethod
  def AddCommandlineArgs(cls, parser):

to gpu_integration_test.py. Note that a few of the subclasses (CloudStorageIntegrationTestBase, InfoCollectionTest, WebGLConformanceIntegrationTest) will have to be changed to call "super" in that method.

It's going to be tricky to plumb this through to the TestExpectations class at the right time. It might be necessary to hoist SetParsedCommandLineOptions and GetParsedCommandLineOptions from CloudStorageIntegrationTestBase up to GpuIntegrationTest, call SetParsedCommandLineOptions from GenerateTestCases__RunGpuTest, and read the options in GpuIntegrationTest.GetExpectations. From there, pass an argument to _CreateExpectations indicating whether to filter out GPU-specific expectations, pass it up TestExpectations' inheritance hierarchy, and handle it in GpuTestExpectations._ExpectationAppliesToTest. This is going to make already-complex logic even more complex, so please be careful about writing unit tests for it.

I wonder whether, before doing all of this, we should first verify that it will be feasible for you to maintain your own separate set of test expectations. It might be easier for you to just maintain your own copies of webgl_conformance_expectations.py and webgl2_conformance_expectations.py and periodically merge them when you update Chromium.
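A rough sketch of that plumbing, as it might look in gpu_integration_test.py. The flag name, the filter_gpu_vendor keyword argument, and the optparse-style parser are assumptions for illustration, not existing Chromium API.

# Sketch only; nothing below exists in the tree in this form.
from telemetry.testing import serially_executed_browser_test_case


class GpuIntegrationTest(
    serially_executed_browser_test_case.SeriallyExecutedBrowserTestCase):

  _parsed_command_line_options = None

  @classmethod
  def AddCommandlineArgs(cls, parser):
    # Hypothetical flag; assumes the harness hands in an optparse parser.
    parser.add_option('--disable-gpu-vendor-expectations',
                      action='store_true', default=False,
                      help='Ignore failure expectations that only apply to '
                           'a specific GPU vendor.')

  @classmethod
  def SetParsedCommandLineOptions(cls, options):
    # Hoisted here from CloudStorageIntegrationTestBase, per the plan above;
    # would be called from GenerateTestCases__RunGpuTest.
    cls._parsed_command_line_options = options

  @classmethod
  def GetParsedCommandLineOptions(cls):
    return cls._parsed_command_line_options

  @classmethod
  def GetExpectations(cls):
    options = cls.GetParsedCommandLineOptions()
    # filter_gpu_vendor would be threaded through _CreateExpectations, up
    # TestExpectations' inheritance hierarchy, and finally honored in
    # GpuTestExpectations._ExpectationAppliesToTest.
    return cls._CreateExpectations(
        filter_gpu_vendor=options.disable_gpu_vendor_expectations)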
Apr 2 2018
This issue has been Available for over a year. If it's no longer important or seems unlikely to be fixed, please consider closing it out. If it is important, please re-triage the issue. Sorry for the inconvenience if the bug really should have been left as Available. For more details visit https://www.chromium.org/issue-tracking/autotriage

- Your friendly Sheriffbot