
Issue 721987

Starred by 1 user

Issue metadata

Status: WontFix
Owner: ----
Closed: May 2017
Cc:
Components:
EstimatedDays: ----
NextAction: ----
OS: ----
Pri: 3
Type: Bug




.expected files are not generated for recipe_modules config.py tests

Project Member Reported by shenghua...@chromium.org, May 13 2017

Issue description

[What is the issue]
The configs.py tests for upstream recipe_modules lack a corresponding 'configs.expected' directory with result JSON files.

E.g. for this configs.py:
https://cs.corp.google.com/chromium_build/scripts/slave/recipe_modules/chromium_android/tests/configs.py
there is no corresponding .expected folder at
//scripts/slave/recipe_modules/chromium_android/tests/configs.expected/

[Where it happens]
This happens for all the configs.py test files in upstream //scripts/slave/recipe_modules, or at least, as I observed, in:

//scripts/slave/recipe_modules/chromium/tests/
//scripts/slave/recipe_modules/chromium_android/tests/
//scripts/slave/recipe_modules/chromium_tests/tests/

[More found]
I tried training locally with 'recipes.py', and it failed to generate those files.
Running the command locally
    `scripts/slave/recipes.py test train --filter chromium_android:tests/configs.*`
only reports 'Ran 17 tests' and does not generate the expected JSON files.
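
For reference, I would have expected a successful training run to leave a layout roughly like this (the JSON file names are only illustrative; they are derived from the test names defined in GenTests):

    //scripts/slave/recipe_modules/chromium_android/tests/
        configs.py
        configs.expected/
            <test_name>.json
            ...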

 
Components: -Infra Infra>Platform>Recipes
Good catch, I bet this is because 'config.py' is treated specially at the module level. 
Cc: phajdan@google.com
Status: WontFix (was: Untriaged)
We're deliberately using DropExpectation for these tests.

Instead of a bug, consider posting to one of the infra MLs for advice on what you're trying to do (make sure to explain what the high-level goal is; the expectation files were probably supposed to be a means towards some end).
I recommended filing a bug because I expected the configs.py test to have expectations, though I wasn't familiar with DropExpectation previously.

What's the point of tests w/ dropped expectations? Just ensuring that a test doesn't hit an exception?
Cc: dpranke@chromium.org
This was added for strict recipe module coverage, see https://groups.google.com/a/google.com/d/msg/chrome-infrastructure-team/rbbqZbsZybU/JW9p1maoBQAJ

As described in https://chromium.googlesource.com/chromium/tools/build/+/cbdd57e4312268322ab200b9fa7b7f91f65d7374 , "This CL is focused on adding coverage and preventing further regressions.
If we want these tests to "test" more, further work may be needed."

I'd like to highlight that these tests were not present at all before the above CL.

To help distribute this work, you're free to further improve the tests.

Let me know if/how I can help. I'm still somewhat guessing there's a higher-level goal. In general, we're trying to move away from expectation files. If you have concerns about that, please consider talking to Dirk (feel free to keep me in the loop).
Preventing further regressions in ... test coverage?

And yeah, I know that most of the recipe_modules/foo/tests/* tests are new, and I appreciate the work you're putting into them. My bug recommendation was primarily because I was surprised & didn't realize this was intentional. I don't mean to suggest that DropExpectation is wrong; I'm just looking to understand the reasoning behind it as opposed to something like post_process.Filter.
DropExpectation can be used when all you're really trying to ensure is that the python code was executed (and perhaps ran without raising an exception), but you don't care what the output was. This, perhaps obviously, isn't a very good kind of test but we have lots of such tests today. Ideally each test would instead use Filters to assert on *something*. Failing that, there should at least be comments by each test describing what coverage is needed.
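
For illustration, here is roughly what the two styles look like in a module's tests/ file. This is only a minimal sketch assuming the recipe engine's post_process module; the module, config, and step names below are made up:

    from recipe_engine import post_process

    DEPS = [
        'example_module',  # hypothetical module under test
    ]

    def RunSteps(api):
      api.example_module.set_config('example_config')

    def GenTests(api):
      # Coverage-only: RunSteps is executed (so the config code gets covered),
      # then every expectation is dropped, so no .expected JSON is written
      # or checked.
      yield (
          api.test('coverage_only') +
          api.post_process(post_process.DropExpectation)
      )

      # Asserting on something: keep only the named step in the expectation,
      # so the test still checks a small, meaningful slice of the output.
      yield (
          api.test('checks_one_step') +
          api.post_process(post_process.Filter('example step'))
      )

With Filter, the expectation file still gets written, but it only contains the filtered steps, which keeps churn low while still asserting on something.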
I'm not sure a bug is the best format for further discussion about this. I'd like to make sure I respond somehow to your feedback/concerns, but I'm not sure what the possible actionable items here are.

To clarify the point about preventing regressions: once we drop DISABLE_STRICT_COVERAGE from a module, all further changes have strict coverage enforced. See https://groups.google.com/a/chromium.org/d/msg/infra-dev/IjC2D257nT4/b6v8pAeQAwAJ for an example where a CL had to be updated to work with strict coverage.
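
For context, here is roughly where that flag lives; to the best of my knowledge it is a module-level constant in the recipe module's __init__.py (the module name below is a placeholder):

    # scripts/slave/recipe_modules/<module>/__init__.py
    DEPS = [
        # ... module dependencies ...
    ]

    # While this is present, the module is exempt from strict test coverage.
    # Once it is removed, every further change must keep the module's code
    # (including config.py) covered by the module's own tests.
    DISABLE_STRICT_COVERAGE = True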

Converting modules to provide strict coverage is a large effort, and as far as I know I'm the only person working on it right now. Focusing on covering modules helps distribute further work, such as providing strict coverage for new code.

If you have specific suggestions about what to do, feel free to share.

If you want, we could also schedule a VC or just discuss on IM.
