Set up a bot for continuous building and generating code coverage information
Issue description: Start with some unit tests.
Nov 17 2017
Some info from running the tests and generating the report:

"unit_tests --test-launcher-jobs=1" took ~30 minutes to run with coverage
"llvm-profdata merge ...." took ~1.5 minutes
"llvm-cov show ...." took ~8 hours

Executable file sizes:
out/test/unit_tests: 220,571,576 B
out/coverage/unit_tests: 5,632,441,944 B

Coverage file sizes:
.profraw: 2,191,792,448 B
.profdata: 7,448,176 B

Output dir size: 2.8 GB

I used revision adb61db19020ed8ecee5e91b1a0ea4c924ae2988 for that, which is r508578 -- the branch base commit for beta M63.
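For a rough sense of the overheads implied by the numbers above, here is a quick back-of-the-envelope calculation (figures copied from this comment; the ratios are derived, not measured):

```python
# Figures from the measurements above, in bytes.
plain_binary = 220_571_576           # out/test/unit_tests
instrumented_binary = 5_632_441_944  # out/coverage/unit_tests
profraw = 2_191_792_448              # raw profile dump
profdata = 7_448_176                 # after llvm-profdata merge

binary_overhead = instrumented_binary / plain_binary  # ~25.5x larger
profile_shrink = profraw / profdata                   # merge shrinks ~294x

print(f"instrumented binary is ~{binary_overhead:.1f}x the plain one")
print(f".profdata is ~{profile_shrink:.0f}x smaller than .profraw")
```

So the instrumented binary is roughly 25x the size of the plain one, while merging compresses the raw profile by almost 300x, which is why merged .profdata files stay cheap to store.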
Nov 17 2017
I didn't expect "llvm-cov show" to take this long! Is it because it needs to process too many files? If "unit_tests" is only meaningful for Chrome, then maybe we shouldn't care about folders other than src/chrome/? I think passing a source filter to "llvm-cov show" may speed it up.
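For reference, `llvm-cov show` accepts trailing positional source paths that restrict the report to matching files. A small illustrative helper for building such a command (the helper name and argument layout are mine, not from the bot's actual scripts):

```python
def llvm_cov_show_cmd(binary, profdata, output_dir, source_filters=()):
    """Build an `llvm-cov show` invocation. Trailing positional paths
    act as source filters, limiting report generation to matching
    files -- which is what should cut down the multi-hour runtime."""
    cmd = [
        "third_party/llvm-build/Release+Asserts/bin/llvm-cov", "show",
        "-format=html",
        f"-output-dir={output_dir}",
        f"-instr-profile={profdata}",
        binary,
    ]
    cmd += list(source_filters)  # e.g. ["chrome/"] to keep only src/chrome/
    return cmd

cmd = llvm_cov_show_cmd("out/coverage/unit_tests", "default.profdata",
                        "unit_tests_coverage", source_filters=["chrome/"])
print(" ".join(cmd))
```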
Nov 17 2017
Yes, I think it's because of the number of files. Yeah, using a filter should work.

Another unittests executable:
out/coverage/content_unittests: 4,271,718,488 B
out/test/content_unittests: 143,764,872 B

Execution time:
no coverage: ~6.5 min
with coverage: ~17.5 min
llvm-profdata merge: ~1 minute

Size of coverage files:
default.profdata: 7,829,656 B
default.profraw: 1,736,165,056 B

Kicked off report generation.
Nov 17 2017
I'm glad that llvm-profdata merge works relatively fast when processing two coverage dumps. It took less than ~2.5 min to merge the two .profraw files generated above. I've kicked off report generation. If it still takes ~8 hours (I hope so), that would mean we can merge lots of different data from .profraw files in a reasonable amount of time.
Nov 20 2017
~7.2h on report generation for content_unittests:

$ time third_party/llvm-build/Release+Asserts/bin/llvm-cov show -format=html -output-dir=content_coverage -instr-profile=default.profdata out/coverage/unit_tests
real 432m48.190s
user 386m47.012s
sys 46m0.292s

~8h on report generation for merged data of unit_tests and content_unittests, pretty much the same as unit_tests standalone:

$ time third_party/llvm-build/Release+Asserts/bin/llvm-cov show -format=html -output-dir=unit_and_content_coverage -instr-profile=unit_and_content.profdata out/coverage/unit_tests -object=out/coverage/content_unittests
real 481m9.253s
user 432m0.120s
sys 49m3.472s
Nov 21 2017
Trying some other unittests, all of them are very small though:

Running without coverage:

$ time out/test/cc_blink_unittests --test-launcher-jobs=1
SUCCESS: all tests passed. Tests took 0 seconds.
real 0m0.180s user 0m0.008s sys 0m0.144s

$ time out/test/crypto_unittests --test-launcher-jobs=1
SUCCESS: all tests passed. Tests took 1 seconds.
real 0m2.172s user 0m0.708s sys 0m0.416s

$ time out/test/mojo_common_unittests --test-launcher-jobs=1
SUCCESS: all tests passed. Tests took 0 seconds.
real 0m0.176s user 0m0.024s sys 0m0.112s

$ time out/test/pdf_unittests --test-launcher-jobs=1
SUCCESS: all tests passed. Tests took 0 seconds.
real 0m0.586s user 0m0.080s sys 0m0.252s

$ time out/test/sql_unittests --test-launcher-jobs=1
SUCCESS: all tests passed. Tests took 5 seconds.
real 0m5.349s user 0m0.236s sys 0m0.468s

$ time out/test/breakpad_unittests --test-launcher-jobs=1
[ PASSED ] 135 tests.
real 0m2.378s user 0m5.672s sys 0m0.476s

$ time out/test/swiftshader_unittests --test-launcher-jobs=1
SUCCESS: all tests passed. Tests took 0 seconds.
real 0m0.166s user 0m0.020s sys 0m0.120s

With coverage:

$ LLVM_PROFILE_FILE=cc_blink_unittests.profraw time out/coverage/cc_blink_unittests --test-launcher-jobs=1
SUCCESS: all tests passed. Tests took 0 seconds.
0.03user 12.12system 0:12.20

$ LLVM_PROFILE_FILE=crypto_unittests.profraw time out/coverage/crypto_unittests --test-launcher-jobs=1
SUCCESS: all tests passed. Tests took 2 seconds.
0.74user 0.59system 0:02.50elapsed

$ LLVM_PROFILE_FILE=mojo_common_unittests.profraw time out/coverage/mojo_common_unittests --test-launcher-jobs=1
SUCCESS: all tests passed. Tests took 0 seconds.
0.02user 0.41system 0:00.45elapsed

$ LLVM_PROFILE_FILE=pdf_unittests.profraw time out/coverage/pdf_unittests --test-launcher-jobs=1
SUCCESS: all tests passed. Tests took 1 seconds.
0.06user 3.01system 0:03.41elapsed

$ LLVM_PROFILE_FILE=sql_unittests.profraw time out/coverage/sql_unittests --test-launcher-jobs=1
SUCCESS: all tests passed. Tests took 5 seconds.
0.26user 0.80system 0:05.55elapsed

$ LLVM_PROFILE_FILE=breakpad_unittests.profraw time out/coverage/breakpad_unittests --test-launcher-jobs=1
[ PASSED ] 135 tests.
6.28user 0.60system 0:02.67elapsed

$ LLVM_PROFILE_FILE=swiftshader_unittests.profraw time out/coverage/swiftshader_unittests --test-launcher-jobs=1
SUCCESS: all tests passed. Tests took 0 seconds.
0.02user 0.54system 0:00.57elapsed
Nov 21 2017
.profraw file sizes:
990M cc_blink_unittests.profraw
9.6M crypto_unittests.profraw
13M mojo_common_unittests.profraw
150M pdf_unittests.profraw
10M sql_unittests.profraw
1.1M breakpad_unittests.profraw
17M swiftshader_unittests.profraw

llvm-profdata merge is fast, and .profdata files are super small:

$ for f in cc_blink_unittests crypto_unittests mojo_common_unittests pdf_unittests sql_unittests breakpad_unittests swiftshader_unittests; do
>   echo $f;
>   time third_party/llvm-build/Release+Asserts/bin/llvm-profdata merge -sparse $f.profraw -o $f.profdata;
>   ls -sH $f.profdata;
> done
cc_blink_unittests
real 0m30.707s user 0m30.164s sys 0m0.540s
2224 cc_blink_unittests.profdata
crypto_unittests
real 0m0.464s user 0m0.456s sys 0m0.004s
1316 crypto_unittests.profdata
mojo_common_unittests
real 0m0.632s user 0m0.608s sys 0m0.020s
2424 mojo_common_unittests.profdata
pdf_unittests
real 0m5.669s user 0m5.472s sys 0m0.192s
1288 pdf_unittests.profdata
sql_unittests
real 0m0.468s user 0m0.436s sys 0m0.028s
1552 sql_unittests.profdata
breakpad_unittests
real 0m0.099s user 0m0.092s sys 0m0.008s
760 breakpad_unittests.profdata
swiftshader_unittests
real 0m0.766s user 0m0.716s sys 0m0.048s
1648 swiftshader_unittests.profdata
Nov 21 2017
Interestingly, swiftshader_unittests failed to generate coverage:

$ time third_party/llvm-build/Release+Asserts/bin/llvm-cov show -format=html -output-dir=swiftshader_unittests_coverage -instr-profile=swiftshader_unittests.profdata out/coverage/swiftshader_unittests
error: out/coverage/swiftshader_unittests: Failed to load coverage: Invalid instrumentation profile data (bad magic)

I inspected the log of the test run:

IMPORTANT DEBUGGING NOTE: batches of tests are run inside their own process. For debugging a test inside a debugger, use the --gtest_filter=<your_test_name> flag along with --single-process-tests.
Using sharding settings from environment. This is shard 0/1
Using 1 parallel jobs.
Note: Google Test filter = SwiftShaderTest.Initalization
[==========] Running 1 test from 1 test case.
[----------] Global test environment set-up.
[----------] 1 test from SwiftShaderTest
[ RUN ] SwiftShaderTest.Initalization
[ OK ] SwiftShaderTest.Initalization (7 ms)
[----------] 1 test from SwiftShaderTest (7 ms total)
[----------] Global test environment tear-down
[==========] 1 test from 1 test case ran. (7 ms total)
[ PASSED ] 1 test.
[1/1] SwiftShaderTest.Initalization (7 ms)
SUCCESS: all tests passed.

The test seems to be run in a separate process. When I re-ran it with --single-process-tests, llvm-cov worked successfully:

$ LLVM_PROFILE_FILE=swiftshader_unittests.profraw time out/coverage/swiftshader_unittests --test-launcher-jobs=1 --single-process-tests
[==========] Running 1 test from 1 test case.
[----------] Global test environment set-up.
[----------] 1 test from SwiftShaderTest
[ RUN ] SwiftShaderTest.Initalization
[ OK ] SwiftShaderTest.Initalization (6 ms)
[----------] 1 test from SwiftShaderTest (6 ms total)
[----------] Global test environment tear-down
[==========] 1 test from 1 test case ran. (6 ms total)
[ PASSED ] 1 test.
0.00user 0.05system 0:00.05elapsed 98%CPU (0avgtext+0avgdata 24660maxresident)k 0inputs+17208outputs (0major+3497minor)pagefaults 0swaps

$ time third_party/llvm-build/Release+Asserts/bin/llvm-profdata merge -sparse swiftshader_unittests.profraw -o swiftshader_unittests.profdata
real 0m0.434s user 0m0.420s sys 0m0.016s
980 swiftshader_unittests.profdata

$ time third_party/llvm-build/Release+Asserts/bin/llvm-cov show -format=html -output-dir=swiftshader_unittests_coverage -instr-profile=swiftshader_unittests.profdata out/coverage/swiftshader_unittests
real 0m8.967s user 0m7.084s sys 0m1.868s
Nov 21 2017
Another way of handling that is to use the %p specifier in the LLVM_PROFILE_FILE name and then merge all the .profraw files together:

$ LLVM_PROFILE_FILE=swiftshader_unittests.%p.profraw time out/coverage/swiftshader_unittests --test-launcher-jobs=1
IMPORTANT DEBUGGING NOTE: batches of tests are run inside their own process. For debugging a test inside a debugger, use the --gtest_filter=<your_test_name> flag along with --single-process-tests.
Using sharding settings from environment. This is shard 0/1
Using 1 parallel jobs.
Note: Google Test filter = SwiftShaderTest.Initalization
[==========] Running 1 test from 1 test case.
[----------] Global test environment set-up.
[----------] 1 test from SwiftShaderTest
[ RUN ] SwiftShaderTest.Initalization
[ OK ] SwiftShaderTest.Initalization (7 ms)
[----------] 1 test from SwiftShaderTest (7 ms total)
[----------] Global test environment tear-down
[==========] 1 test from 1 test case ran. (7 ms total)
[ PASSED ] 1 test.
[1/1] SwiftShaderTest.Initalization (7 ms)
SUCCESS: all tests passed. Tests took 0 seconds.
0.02user 0.05system 0:00.07elapsed

$ ls -l swiftshader_unittests.*
-rw-r----- 1 mmoroz eng 8809096 Nov 21 13:39 swiftshader_unittests.91834.profraw
-rw-r----- 1 mmoroz eng 8809096 Nov 21 13:39 swiftshader_unittests.91836.profraw

$ time third_party/llvm-build/Release+Asserts/bin/llvm-profdata merge -sparse swiftshader_unittests.*.profraw -o swiftshader_unittests.merged.profdata
real 0m0.799s user 0m0.752s sys 0m0.036s

$ ls -l swiftshader_unittests.merged.profdata
-rw-r----- 1 mmoroz eng 1687616 Nov 21 13:39 swiftshader_unittests.merged.profdata

$ time third_party/llvm-build/Release+Asserts/bin/llvm-cov show -format=html -output-dir=swiftshader_unittests_coverage_merged -instr-profile=swiftshader_unittests.merged.profdata out/coverage/swiftshader_unittests
real 0m8.662s user 0m7.004s sys 0m1.644s
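The %p substitution can be modeled in a few lines. This is only an illustrative approximation (the real expansion happens inside compiler-rt's profile runtime), but it shows why each child process ends up with its own file:

```python
import os

def expand_profile_pattern(pattern, pid=None):
    """Approximate compiler-rt's LLVM_PROFILE_FILE expansion: every
    '%p' is replaced by the writing process's pid, so each test
    launcher child writes a distinct .profraw instead of clobbering a
    shared file (the clobbering is what caused the 'bad magic' error)."""
    pid = os.getpid() if pid is None else pid
    return pattern.replace("%p", str(pid))

# Two child processes, using the pids from the ls output above:
print(expand_profile_pattern("swiftshader_unittests.%p.profraw", pid=91834))
print(expand_profile_pattern("swiftshader_unittests.%p.profraw", pid=91836))
```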
Nov 21 2017
It also seems to produce slightly better (?) data. The results of the run with %p show higher coverage. Let's use that in the script as well (I'll add a comment to the code review).
Nov 21 2017
llvm-cov execution time for other unittests:

$ time third_party/llvm-build/Release+Asserts/bin/llvm-cov show -format=html -output-dir=cc_blink_unittests_coverage -instr-profile=cc_blink_unittests.profdata out/coverage/cc_blink_unittests
real 88m1.129s user 80m34.296s sys 7m23.848s

$ time third_party/llvm-build/Release+Asserts/bin/llvm-cov show -format=html -output-dir=crypto_unittests_coverage -instr-profile=crypto_unittests.profdata out/coverage/crypto_unittests
real 0m13.222s user 0m10.648s sys 0m2.560s

$ time third_party/llvm-build/Release+Asserts/bin/llvm-cov show -format=html -output-dir=mojo_common_unittests_coverage -instr-profile=mojo_common_unittests.profdata out/coverage/mojo_common_unittests
real 0m15.571s user 0m12.428s sys 0m3.128s

$ time third_party/llvm-build/Release+Asserts/bin/llvm-cov show -format=html -output-dir=pdf_unittests_coverage -instr-profile=pdf_unittests.profdata out/coverage/pdf_unittests
real 14m47.345s user 13m19.092s sys 1m28.148s

$ time third_party/llvm-build/Release+Asserts/bin/llvm-cov show -format=html -output-dir=sql_unittests_coverage -instr-profile=sql_unittests.profdata out/coverage/sql_unittests
real 0m14.556s user 0m12.360s sys 0m2.172s

$ time third_party/llvm-build/Release+Asserts/bin/llvm-cov show -format=html -output-dir=breakpad_unittests_coverage -instr-profile=breakpad_unittests.profdata out/coverage/breakpad_unittests
real 0m0.869s user 0m0.732s sys 0m0.120s
Nov 21 2017
Tried merging .profraw files from the small unit tests together:

$ time third_party/llvm-build/Release+Asserts/bin/llvm-profdata merge -sparse swiftshader_unittests.*.profraw cc_blink_unittests.profraw crypto_unittests.profraw mojo_common_unittests.profraw pdf_unittests.profraw sql_unittests.profraw breakpad_unittests.profraw -o small_tests_merged.profdata
Warning: request a ThreadPool with 4 threads, but LLVM_ENABLE_THREADS has been turned off
real 0m38.251s user 0m37.592s sys 0m0.636s

And also merged them with the large unittests (unit_tests and content_unittests):

$ time third_party/llvm-build/Release+Asserts/bin/llvm-profdata merge -sparse swiftshader_unittests.*.profraw cc_blink_unittests.profraw crypto_unittests.profraw mojo_common_unittests.profraw pdf_unittests.profraw sql_unittests.profraw breakpad_unittests.profraw run1_unit.profraw run2_content.profraw -o all_tests_merged.profdata
Warning: request a ThreadPool with 5 threads, but LLVM_ENABLE_THREADS has been turned off
real 2m52.352s user 2m48.732s sys 0m3.392s

It still takes a very reasonable amount of time. Kicking off report generation for all_tests_merged.profdata; let's see how long it takes.
Nov 22 2017
I will test it out later, but can we expand the binary_name.%p.profraw idea to run tests in multiple processes (i.e. without --test-launcher-jobs=1)? Use LLVM_PROFILE_FILE=swiftshader_unittests.%p.profraw time out/coverage/swiftshader_unittests to run the tests in parallel, and then merge the .profraw files from all the processes.
Nov 22 2017
Very interesting: using LLVM_PROFILE_FILE=url_unittests.%p.profraw time out/Coverage/url_unittests --test-launcher-jobs=1 to run url_unittests, it generates a large number of .profraw files. Is this expected?

url_unittests.9682.profraw
url_unittests.9685.profraw
url_unittests.9686.profraw
url_unittests.9687.profraw
url_unittests.9688.profraw
url_unittests.9689.profraw
url_unittests.9690.profraw
url_unittests.9691.profraw
url_unittests.9692.profraw
url_unittests.9693.profraw
url_unittests.9694.profraw
url_unittests.9695.profraw
Nov 22 2017
I think so; the tests seem to be run in ~12 processes by default. If I do the following:

$ LLVM_PROFILE_FILE=single_process_url_unittests.%p.profraw time out/coverage/url_unittests --test-launcher-jobs=1 --single-process-tests

I get only one .profraw file.
Nov 22 2017
FTR, regarding my c#13: generation of the HTML report for a bunch of tests merged together took ~8 hours:

$ time third_party/llvm-build/Release+Asserts/bin/llvm-cov show -format=html -output-dir=all_tests_merged_coverage -instr-profile=all_tests_merged.profdata out/coverage/unit_tests -object out/coverage/breakpad_unittests out/coverage/content_unittests out/coverage/mojo_common_unittests out/coverage/sql_unittests out/coverage/cc_blink_unittests out/coverage/crypto_unittests out/coverage/pdf_unittests out/coverage/swiftshader_unittests
warning: 9 functions have mismatched data
real 479m42.069s user 434m59.592s sys 44m37.144s

Which is expected as per c#6.
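A single merged .profdata can cover many executables: `llvm-cov show` takes the first binary positionally and additional instrumented binaries via -object. A sketch of building such a command programmatically (the helper name is mine; this passes -object before each extra binary, which is the documented per-binary form):

```python
def llvm_cov_multi_object_cmd(binaries, profdata, output_dir):
    """Build an `llvm-cov show` command covering several executables
    against one merged profile: first binary is positional, each
    additional one is preceded by -object."""
    first, *rest = binaries
    cmd = [
        "third_party/llvm-build/Release+Asserts/bin/llvm-cov", "show",
        "-format=html",
        f"-output-dir={output_dir}",
        f"-instr-profile={profdata}",
        first,
    ]
    for binary in rest:
        cmd += ["-object", binary]
    return cmd

cmd = llvm_cov_multi_object_cmd(
    ["out/coverage/unit_tests", "out/coverage/content_unittests"],
    "all_tests_merged.profdata", "all_tests_merged_coverage")
print(" ".join(cmd))
```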
Nov 22 2017
Again, the report generated after running tests in multi-process mode shows bigger coverage numbers compared to the single-process mode. However, the difference seems to come from other places such as //base, while the coverage in //url is exactly the same for both runs.

$ time third_party/llvm-build/Release+Asserts/bin/llvm-profdata merge -sparse single_process_url_unittests.164330.profraw -o single_process_url_unittests.164330.profdata
real 0m0.377s user 0m0.316s sys 0m0.044s

$ time third_party/llvm-build/Release+Asserts/bin/llvm-profdata merge -sparse url_unittests.*.profraw -o url_unittests_merged.profdata
Warning: request a ThreadPool with 6 threads, but LLVM_ENABLE_THREADS has been turned off
real 0m2.709s user 0m2.600s sys 0m0.104s

$ ls -l *url_uni*.profdata
-rw-r----- 1 mmoroz eng 1904664 Nov 22 12:48 single_process_url_unittests.164330.profdata
-rw-r----- 1 mmoroz eng 2553616 Nov 22 12:48 url_unittests_merged.profdata

$ time third_party/llvm-build/Release+Asserts/bin/llvm-cov show -format=html -output-dir=single_process_url_unittests_coverage -instr-profile=single_process_url_unittests.164330.profdata out/coverage/url_unittests
real 0m15.409s user 0m12.208s sys 0m2.928s

$ time third_party/llvm-build/Release+Asserts/bin/llvm-cov show -format=html -output-dir=url_unittests_coverage -instr-profile=url_unittests_merged.profdata out/coverage/url_unittests
real 0m14.222s user 0m11.436s sys 0m2.788s
Nov 22 2017
Also posting here the sizes of the HTML reports generated, as I'm going to delete them from my machine:

$ du -hs *_coverage
2.8G all_tests_merged_coverage
22M breakpad_unittests_coverage
1.2G cc_blink_unittests_coverage
2.8G content_coverage
139M crypto_unittests_coverage
137M mojo_common_unittests_coverage
641M pdf_unittests_coverage
138M single_process_url_unittests_coverage
172M sql_unittests_coverage
123M swiftshader_unittests_coverage
2.8G unit_and_content_coverage
138M url_unittests_coverage
Dec 1 2017
Still running out of space with other tests as well. Will think a bit more. In the worst case, I'll end up using a GCE instance with a 2 TB HDD or something like that:

mmoroz@mmoroz2:~/Projects/new/chromium/src/covdata$ du -hs *
16G cc_unittests
487M audio_unittests
45M courgette_unittests
27G gpu_unittests
5.4G headless_unittests
45G media_blink_unittests
492M media_mojo_unittests
60M media_service_unittests
16G media_unittests
67G net_unittests
Dec 11 2017
Posting another comment for myself regarding the build time:

$ time ./go.sh courgette_unittests gpu_unittests headless_unittests audio_unittests media_unittests media_blink_unittests media_mojo_unittests media_service_unittests net_unittests services_unittests service_manager_unittests skia_unittests storage_unittests blink_heap_unittests wtf_unittests blink_common_unittests angle_unittests pdfium_unittests accessibility_unittests gfx_unittests gl_unittests keyboard_unittests snapshot_unittests views_unittests wm_unittests url_unittests breakpad_unittests content_unittests mojo_common_unittests sql_unittests unit_tests cc_blink_unittests crypto_unittests pdf_unittests swiftshader_unittests
real 0m0.004s user 0m0.000s sys 0m0.000s

mmoroz@code-coverage:~/chromium/src$ time ./go.sh
ninja: Entering directory `out/Default'
[1061/1061] LINK ./courgette_unittests
ninja: Entering directory `out/Default'
[5154/5154] LINK ./gpu_unittests
ninja: Entering directory `out/Default'
[16827/16827] LINK ./headless_unittests
ninja: Entering directory `out/Default'
[67/67] LINK ./audio_unittests
ninja: Entering directory `out/Default'
[258/258] LINK ./media_unittests
ninja: Entering directory `out/Default'
[513/513] LINK ./media_blink_unittests
ninja: Entering directory `out/Default'
[14/14] LINK ./media_mojo_unittests
ninja: Entering directory `out/Default'
[25/25] LINK ./media_service_unittests
ninja: Entering directory `out/Default'
[746/746] LINK ./net_unittests
ninja: Entering directory `out/Default'
[1446/1446] LINK ./services_unittests
ninja: Entering directory `out/Default'
[310/310] LINK ./service_manager_unittests
ninja: Entering directory `out/Default'
[25/25] LINK ./skia_unittests
ninja: Entering directory `out/Default'
[84/84] LINK ./storage_unittests
ninja: Entering directory `out/Default'
[212/212] LINK ./blink_heap_unittests
ninja: Entering directory `out/Default'
[40/40] LINK ./wtf_unittests
ninja: Entering directory `out/Default'
[10/10] LINK ./blink_common_unittests
ninja: Entering directory `out/Default'
[93/93] LINK ./angle_unittests
ninja: Entering directory `out/Default'
[563/563] LINK ./pdfium_unittests
ninja: Entering directory `out/Default'
[15/15] LINK ./accessibility_unittests
ninja: Entering directory `out/Default'
[152/152] LINK ./gfx_unittests
ninja: Entering directory `out/Default'
[17/17] LINK ./gl_unittests
ninja: Entering directory `out/Default'
[38/38] LINK ./keyboard_unittests
ninja: Entering directory `out/Default'
[4/4] LINK ./snapshot_unittests
ninja: Entering directory `out/Default'
[172/172] LINK ./views_unittests
ninja: Entering directory `out/Default'
[19/19] LINK ./wm_unittests
ninja: Entering directory `out/Default'
[27/27] LINK ./url_unittests
ninja: Entering directory `out/Default'
[33/33] LINK ./breakpad_unittests
ninja: Entering directory `out/Default'
[531/531] LINK ./content_unittests
ninja: Entering directory `out/Default'
[28/28] LINK ./mojo_common_unittests
ninja: Entering directory `out/Default'
[14/14] LINK ./sql_unittests
ninja: Entering directory `out/Default'
[9279/9280] LINK ./unit_tests
^C
ninja: build stopped: interrupted by user.
ninja: Entering directory `out/Default'
^C
real 491m3.915s user 4354m7.076s sys 104m16.772s
Dec 12 2017
(another comment for myself) The second part of that build log:

$ time ./go2.sh
ninja: Entering directory `out/Default'
[1/1] LINK ./unit_tests
ninja: Entering directory `out/Default'
[6/6] LINK ./cc_blink_unittests
ninja: Entering directory `out/Default'
[19/19] LINK ./crypto_unittests
ninja: Entering directory `out/Default'
[8/8] LINK ./pdf_unittests
ninja: Entering directory `out/Default'
[4/4] LINK ./swiftshader_unittests
real 171m11.913s user 244m54.008s sys 1m19.872s
Feb 22 2018
Once the report mentioned in https://bugs.chromium.org/p/chromium/issues/detail?id=789981#c12 is ready and verified, I'll launch all those steps to run 24/7 in a loop. That should give us at least one total code coverage report (tests + fuzzers) per day.
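The 24/7 loop amounts to running the build → test → merge → report → upload steps back to back. A minimal driver sketch, where the step names and the `run` callback are placeholders rather than the actual bot code:

```python
def coverage_cycle(run, iterations=1):
    """Hypothetical driver: each full pass produces one combined
    coverage report; looping passes 24/7 yields at least one report
    (tests + fuzzers) per day. `run` executes a single named step."""
    steps = ["sync_and_build", "run_tests", "merge_profdata",
             "generate_report", "upload_report"]
    done = []
    for _ in range(iterations):
        for step in steps:
            run(step)
            done.append(step)
    return done

# Dry run: record the step order instead of executing real commands.
executed = coverage_cycle(run=lambda step: None, iterations=2)
print(executed)
```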
Mar 9 2018
The bot is up and running; reports are available at https://chrome-coverage.googleplex.com/. Further progress on adding more targets will be tracked in issue 789981. Plus, we'll have a discussion in issue 818467 about using Chrome Infra recipes and VMs in the future, which sounds like the way to go for better scalability.
Apr 13 2018
The following revision refers to this bug: https://chrome-internal.googlesource.com/chrome/tools/code-coverage/+/8ae237efed9ed367fc0577a103b6dfd91e8a4cc3

commit 8ae237efed9ed367fc0577a103b6dfd91e8a4cc3
Author: Max Moroz <mmoroz@google.com>
Date: Fri Apr 13 01:55:57 2018
Comment 1 by mmoroz@chromium.org, Nov 13 2017. Cc: baxley@chromium.org, liaoyuke@chromium.org, infe...@chromium.org