Pylint likely has confusing behavior under vpython
Issue description

But I don't think we've seen this failure mode yet. Currently the way pylint works is that we run, essentially:

  vpython //depot_tools/pylint.py //repo/file1.py //repo/file2.py //repo/dir/file3.py

If you look closely, you'll see that this would operate under depot_tools' .vpython spec, NOT under the repo's vpython spec.

I think for this to work "correctly", we'd actually want to break all the files in the repo into "domains", each of which corresponds to the same .vpython spec file in the repo. We'd then want to issue:

  vpython -vpython-spec //repo/file1.py.vpython //depot_tools/pylint.py //repo/file1.py
  vpython -vpython-spec //repo/.vpython //depot_tools/pylint.py //repo/file2.py //repo/dir/file3.py

This gets especially gnarly with script-embedded vpython specs (though not totally insurmountable). My major qualm here is that doing this correctly would require a substantial amount of smarts in depot_tools :(.

Another monkeywrench here would be if we wanted to start bundling pylint as a .vpython wheel in depot_tools. If we did that, then we'd need to do something like... synthesize new vpython specs which are whatever-is-in-the-repo-plus-pylint, and then deal with potential conflicts if the repo itself includes pylint...

Thoughts?
Mar 24 2018
To be clear... I don't think this ever had NON-confusing behavior. pylint has always been a bit mystical (especially with all the sys.path shenanigans that happen in repos)
Mar 25 2018
Quick thoughts:

Firstly, we could always allow vpython to load multiple specs and merge them, e.g. "-vpython-spec /my/repos/thing", "-vpython-spec /path/to/pylint.vpython". The merging logic is already supported (normalization removes duplicates, and merging is just appending), and is used in the (currently unused, AFAIK) BaseWheels feature: https://chromium.googlesource.com/infra/luci/luci-go/+/e17d0d98846816f121b0a051dfc30d7894fa67e3/vpython/options.go#102

With respect to domains, I don't really see a way around this. The easy answer would be to not have nested vpython specs in the same pylint invocation (i.e., repo), but that's not a good solution; it just avoids the problem. This gets even trickier when you consider that vpython has its own logic for finding specs, and that includes filesystem stuff (which depot_tools could probably handle) as well as embedding and partner files ... that's a mess.

One option would be to implement an analytic tool in vpython, something like:

  vpython -vpython-tool manifest /path/to/file.. /path/to/dir...

...which would dump a JSON description of all of the identified Python files, grouped by domains, with spec info. This is a bit of work, but probably the only good way to get honest pylint behavior out of things.
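The append-then-normalize merge described above could look something like this sketch (plain dicts standing in for the real vpython protobuf spec messages; field names are illustrative only):

```python
def merge_specs(*specs):
    """Merge several specs' wheel lists: append in order, then normalize
    by dropping exact duplicates while keeping first-seen order (a sketch
    of the merging behavior described for BaseWheels, not the real code)."""
    seen = set()
    merged = []
    for spec in specs:
        for wheel in spec.get('wheels', []):
            key = (wheel['name'], wheel['version'])
            if key not in seen:
                seen.add(key)
                merged.append(wheel)
    return {'wheels': merged}
```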
Mar 25 2018
RE #3, we'd also need to enhance vpython's "-vpython-spec" flag to be able to load inline specs. This should be trivial, though AFAIK ATM it only accepts actual spec files as its source. The logic would go:

1) Try loading as a text protobuf.
2) Fall back to loading as an inline (script-embedded) spec.
3) Otherwise, there's actually an error.

Or we could switch between (1) and (2) based on extension, but I like the fallback method better - more resilient.
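The try-then-fall-back loading could be sketched like so. The `parse_spec` here is a toy stand-in for the real text-protobuf parser, and the `# [VPYTHON:BEGIN]` / `# [VPYTHON:END]` comment markers are, to the best of my knowledge, what vpython uses to delimit script-embedded specs:

```python
def parse_spec(text):
    # Toy parser: accept only blank lines, comments, and "key: value"
    # lines; anything else means this isn't a standalone spec.
    entries = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith('#'):
            continue
        if ':' not in line:
            raise ValueError('not a spec line: %r' % line)
        key, _, value = line.partition(':')
        entries.append((key.strip(), value.strip()))
    if not entries:
        raise ValueError('empty spec')
    return entries

def extract_inline_spec(text):
    # Collect the commented-out spec between the inline markers, if any.
    out, inside = [], False
    for line in text.splitlines():
        s = line.strip()
        if s == '# [VPYTHON:BEGIN]':
            inside = True
        elif s == '# [VPYTHON:END]':
            return '\n'.join(out)
        elif inside and s.startswith('#'):
            out.append(s[1:].strip())
    return None

def load_spec(text):
    # (1) Try the whole file as a spec; (2) fall back to an embedded
    # inline spec; (3) otherwise it's genuinely an error.
    try:
        return parse_spec(text)
    except ValueError:
        inline = extract_inline_spec(text)
        if inline is None:
            raise
        return parse_spec(inline)
```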
Mar 28 2018
Yeah ok, so I think the solution will be:
* Add a vpython analysis tool. This tool would take a list of python files and return a JSON description of:
  * the group of files included
  * the path to the vpython spec for that group of files
* Add a directive to the vpython spec, "merge if no conflict", which would ignore the spec if merging it would result in a conflict.
* Have depot_tools' pylint support:
  * have a pylint.vpython spec (remove vendored pylint), which has the 'merge if no conflict' bit
  * call vpython to do domain analysis, falling back to a single domain if vpython is being bypassed
  * call a checked-in pylint wrapper (i.e. import pylint && run pylint), per domain, with two vpython specs: the repo's spec and pylint.vpython. If the user is bypassing vpython, they'll have to have pylint and all relevant packages installed on the system, but this wrapper approach should still work.
This is a fair amount of work, but it would also mean that pylint would work better than it ever has before.
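The proposed "merge if no conflict" directive could behave something like this sketch (again with dicts standing in for the real spec protos — this is a hypothetical semantics, not an implemented vpython feature):

```python
def merge_if_no_conflict(repo_spec, extra_spec):
    """Append extra_spec's wheels onto repo_spec's, unless the repo
    already pins a *different* version of any wheel the extra spec
    provides - in that case ignore the extra spec wholesale."""
    pinned = {w['name']: w['version'] for w in repo_spec['wheels']}
    for w in extra_spec['wheels']:
        if pinned.get(w['name'], w['version']) != w['version']:
            return repo_spec  # conflict: drop the extra spec entirely
    merged = list(repo_spec['wheels'])
    merged += [w for w in extra_spec['wheels'] if w['name'] not in pinned]
    return {'wheels': merged}
```

This is what lets a pylint.vpython be layered onto an arbitrary repo spec safely: a repo that already vendors a conflicting pylint just keeps its own spec untouched.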
Mar 28 2018
Possible shortcut: Use BaseWheels to embed pylint into the infra-shipped `vpython`. It'd be really nice to control this from depot_tools though, and not in the compiled vpython binary. We'd still have to implement the domain analysis tool, though.
Mar 29 2018
+ehmaldonado as you seem to be in the general vicinity of presubmit support these days :)
Mar 29 2018
RE #5, presubmit could still use PYTHONPATH to include the vendored Pylint. This is probably less great than using a wheel version, but it does mean that the solution only requires the manifest portion. The work would be:

1) Make it so "-vpython-spec" can load inline-style specs (#4).

2) Write the manifest generation subcommand.

3) Have depot_tools presubmit code generate a manifest for a set of candidate files (2), something like:

  vpython -vpython-tool generate-spec-manifest files.....

(or maybe files should be STDIN JSON input so we aren't limited by command-line length limits), and then, for each spec group, run Pylint on that group.

3a) If we want to forego the Pylint wheel in favor of using the vendored version, (3) would use PYTHONPATH, something like:

  PYTHONPATH=/path/to/depot_tools/pylint vpython -vpython-spec /path/to/spec/for/manifest-group.vpython -m pylint files...

3b) If we want to enable spec merging, (3) would use the Pylint wheel, something like:

  vpython -vpython-spec /path/to/spec/for/manifest-group.vpython -vpython-spec-add /path/to/depot_tools/pylint.vpython -m pylint files...

NOTE: (3a) could be turned into (3b) with only a change to depot_tools. WDYT?
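Putting (3a)/(3b) together, the depot_tools side might build its per-group invocations roughly like this. The manifest shape and the -vpython-spec-add flag are taken from the sketches in this thread — neither is a real, shipped CLI:

```python
import os

def pylint_commands(manifest, depot_tools, use_wheel=False):
    # 'manifest' mimics hypothetical generate-spec-manifest output:
    #   [{"spec": "/path/to/group.vpython", "files": [...]}, ...]
    # Returns (argv, extra_env) pairs, one per spec group.
    cmds = []
    for group in manifest:
        env = {}
        cmd = ['vpython', '-vpython-spec', group['spec']]
        if use_wheel:
            # (3b): merge in the depot_tools-shipped pylint wheel spec.
            cmd += ['-vpython-spec-add',
                    os.path.join(depot_tools, 'pylint.vpython')]
        else:
            # (3a): point PYTHONPATH at the vendored pylint instead.
            env['PYTHONPATH'] = os.path.join(depot_tools, 'pylint')
        cmd += ['-m', 'pylint'] + list(group['files'])
        cmds.append((cmd, env))
    return cmds
```

Note that switching `use_wheel` from False to True is exactly the (3a)-to-(3b) migration: only this depot_tools-side code changes.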
May 4 2018
Comment 1 by iannucci@chromium.org, Mar 24 2018