webkit_layout_test failures in linux_chromium_rel_ng
Issue description

Appears to have been flaky for a while now (before PST sheriff checked in). Haven't been able to track down the offending patch.

https://build.chromium.org/p/tryserver.chromium.linux/builders/linux_chromium_rel_ng?numbuilds=200

Example: https://build.chromium.org/p/tryserver.chromium.linux/builders/linux_chromium_rel_ng/builds/450264

THE FOLLOWING ARE EXAMPLES OF FLAKY FAILURES:

Unexpected Failures:
* external/wpt/css/vendor-imports/mozilla/mozilla-central-reftests/variables/variable-declaration-18.html
* http/tests/misc/non-utf8-header-name.php
* http/tests/security/contentSecurityPolicy/directive-parsing-03.html
* http/tests/security/contentSecurityPolicy/source-list-parsing-04.html
* inspector/sources/debugger/rethrow-error-from-bindings-crash.html
* virtual/mojo-loading/http/tests/inspector/network/network-filters.html
* virtual/mojo-loading/http/tests/security/contentSecurityPolicy/directive-parsing-03.html
* virtual/mojo-loading/http/tests/security/contentSecurityPolicy/source-list-parsing-04.html
* virtual/mojo-loading/http/tests/security/document-domain-canonicalizes.html
,
May 10 2017
Issue 720513 has been merged into this issue.
,
May 10 2017
,
May 10 2017
bug 719694 covers rethrow-error-from-bindings-crash.html
,
May 10 2017
After duping, issue 719837 covers rethrow-error-from-bindings-crash.html. Apparently that was caused by the V8 roll https://chromium.googlesource.com/chromium/src/+/3b2a2acc04df20808235a2ef415f04290c80a383 , maybe that caused the other issues too?
,
May 10 2017
,
May 10 2017
Hmm, possibly! Should we revert the V8 roll? On a side note, are V8 rolls weekly? Any side effects of reverting I should know about?
,
May 10 2017
It's possible that someone added a dependency on something in the roll, but the CQ for the roll should inform us of that. I think trying the revert would be good.
,
May 10 2017
Talked offline. Will revert v8 to the version before 6.0.181 per https://codereview.chromium.org/2869723005. Hopefully that will help.
,
May 10 2017
A fix is on the way, and it's probably easier to roll the fix forward: https://chromium-review.googlesource.com/c/501351/
,
May 10 2017
Note, the roll that includes the V8-side revert is https://codereview.chromium.org/2868403002/ - I'm fine with first reverting V8 and then letting that roll go through. Somebody might need to redo that roll manually and rebase it, though, as the auto-roller won't do that. If you want the revert to go through first, you should temporarily close rolling at https://v8-roll.appspot.com/ - uncheck the CQ on the current roll and wait for the revert to land.
,
May 10 2017
The current roll can also just be closed after the revert roll has landed. The auto-roller will create a new roll again.
,
May 10 2017
rethrow-error-from-bindings-crash.html is not related. I couldn't reproduce these test failures locally, and based on test results, the tests are flaky:
* external/wpt/css/vendor-imports/mozilla/mozilla-central-reftests/variables/variable-declaration-18.html - flaky since Apr 28.
* http/tests/security/contentSecurityPolicy/directive-parsing-03.html - flaky since 5/10/2017 7:16:49 AM PDT.
* http/tests/security/contentSecurityPolicy/source-list-parsing-04.html - the same.
Culprit range for both: https://chromium.googlesource.com/chromium/src/+log/4f4a5a7103aa313f038ce2990cbdad50ae2ef9c1%5E..53f2bbb667f8807541a82e65688c4e7612ab6c59?pretty=fuller
There is another V8 roll in that culprit range, but I couldn't see how it could be related. So I'll roll V8 to the position where rethrow-error-from-bindings-crash.html is fixed, and probably something else should be reverted to fix the other tests. (I'll wait before rolling to check my theory.)
,
May 10 2017
Issue 720548 has been merged into this issue.
,
May 10 2017
,
May 10 2017
Hm, apparently we don't run webkit_layout_tests for v8 rolls. Filed issue 720623 for that.
,
May 10 2017
On linux_chromium_rel_ng, the flaky tests fail with *any* patch (running all tests), but often succeed without patch (running just failed tests). This is currently blocking any blink patch from landing through CQ. Can we unblock CQ first?
,
May 11 2017
Is this patch getting committed? It looks like it is ready to go and just needs to be submitted.
,
May 11 2017
The following revision refers to this bug: https://chromium.googlesource.com/chromium/src.git/+/4184757eab3cd7fd4ff3e4b565f325e096bbf97b

commit 4184757eab3cd7fd4ff3e4b565f325e096bbf97b
Author: apacible <apacible@chromium.org>
Date: Thu May 11 04:13:14 2017

Revert V8 to the version rolled before 6.0.181.

This version is suspected to cause some webkit_layout_test failures. See crbug for more details.

BUG=720511
TBR=thakis@chromium.org

Review-Url: https://codereview.chromium.org/2868373002
Cr-Commit-Position: refs/heads/master@{#470803}

[modify] https://crrev.com/4184757eab3cd7fd4ff3e4b565f325e096bbf97b/DEPS
,
May 11 2017
I went ahead and clicked the submit button. Hopefully this fixes the webkit_layout_test failures in linux_chromium_rel_ng that have been stopping CLs from landing for the last 2 days.
,
May 11 2017
Is there a good overview page where we can see the impact of this and judge whether it got better?
,
May 11 2017
WebKit Linux Trusty is still red after #20. https://build.chromium.org/p/chromium.webkit/builders/WebKit%20Linux%20Trusty/builds/26372
,
May 11 2017
That's a build with V8 reverted. Then it wasn't V8, maybe? Since there are only a few tests failing, how about skipping them for now?
,
May 11 2017
,
May 11 2017
Looks like the trybot improved. The continuous release bot still tells a slightly different story: https://build.chromium.org/p/chromium.webkit/builders/WebKit%20Linux%20Trusty?numbuilds=200

Failures started here: https://build.chromium.org/p/chromium.webkit/builders/WebKit%20Linux%20Trusty/builds/26349 - there V8 rolled to 6.0.189, i.e. already much later than the suspected culprit here. The revert back to 6.0.180 doesn't seem to change anything: https://build.chromium.org/p/chromium.webkit/builders/WebKit%20Linux%20Trusty/builds/26372

The first failing build has a few more things in the blame list. Maybe there's another culprit?
,
May 11 2017
Can the webrtc roll impact this? https://codereview.chromium.org/2872293002 It didn't run webkit tests in CQ, while the V8 roll did.
,
May 11 2017
I'm talking specifically about: https://build.chromium.org/p/chromium.webkit/builders/WebKit%20Linux%20Trusty/builds/26349
,
May 11 2017
I don't think the webkit_layout_tests involve WebRTC at all.
,
May 11 2017
Re comment 22: https://build.chromium.org/p/tryserver.chromium.linux/builders/linux_chromium_rel_ng?numbuilds=200 shows that the CQ is functional again. I think this bug here is fixed. It's up to the sheriffs whether they want to open a new bug for the continuous bot or handle it without one.
,
May 11 2017
Hm, I reverted v8 back to where apacible reverted it back to, but cq is still busted even after that: https://build.chromium.org/p/tryserver.chromium.linux/builders/linux_chromium_rel_ng/builds/451831
,
May 11 2017
From a recent run (but I've seen more failures on other runs):

Unexpected Failures:
* http/tests/security/contentSecurityPolicy/directive-parsing-03.html
* http/tests/security/contentSecurityPolicy/source-list-parsing-04.html
* virtual/enable_asmjs/http/tests/asmjs/asm-warnings.html

The last one is failing because the -expected file is missing; it was deleted in https://codereview.chromium.org/2865113006 . I wonder if the other -expected files saw similar changes.
,
May 11 2017
Hm, apparently not: https://chromium.googlesource.com/chromium/src/+log/master/third_party/WebKit/LayoutTests/http/tests/security/contentSecurityPolicy/directive-parsing-03-expected.txt https://chromium.googlesource.com/chromium/src/+log/master/third_party/WebKit/LayoutTests/http/tests/security/contentSecurityPolicy/source-list-parsing-04-expected.txt
,
May 11 2017
https://build.chromium.org/p/tryserver.chromium.linux/builders/linux_chromium_rel_ng/builds/451835 also has these failing:
* virtual/mojo-loading/http/tests/misc/non-utf8-header-name.php
* virtual/mojo-loading/http/tests/security/contentSecurityPolicy/directive-parsing-03.html
* virtual/mojo-loading/http/tests/security/document-domain-canonicalizes.html
,
May 11 2017
Does anyone know how to view layout test failure diffs on the bots? stdout only prints:

09:10:35.639 2041 worker/0 http/tests/security/contentSecurityPolicy/directive-parsing-03.html output stderr lines:
09:10:35.639 2041   Xlib: extension "RANDR" missing on display ":100".
09:10:35.640 2041 [1/1] http/tests/security/contentSecurityPolicy/directive-parsing-03.html failed unexpectedly (text diff)
09:10:35.641 2041 worker/0 http/tests/security/contentSecurityPolicy/directive-parsing-03.html failed:
09:10:35.641 2041 worker/0   text diff
,
May 11 2017
I tried reverting https://codereview.chromium.org/2865113006 but the diff doesn't apply. We should probably mark all the failing tests as failing in TestExpectations by now to unwedge the cq, and then someone (who?) should debug / bisect locally asynchronously.
,
May 11 2017
Re #36: there is an 'archive_webkit_tests_results' step after the 'webkit_layout_tests' step. Click the 'layout_test_results' link in that step and you can browse the results.
,
May 11 2017
Thanks! For directive-parsing-03, the diff is:

  Internal Server Error
  The server encountered an internal error or misconfiguration and was unable to complete your request.
  Please contact the server administrator at [no address given] to inform them of the time this error occurred, and the actions you performed just before this error.
  More information about this error may be available in the server error log.

Same for virtual/mojo-loading/http/tests/misc/non-utf8-header-name.php. So something broke our test apache server?
,
May 11 2017
Is apacible currently working on this? V8 being reverted 4 days back should not stay like that; please strongly consider skipping the 2-3 failing tests. This has often been the appropriate action for webkit tests. The current reverts roll back 50+ commits from dozens of developers. Added some MTV folks, maybe somebody can have an eye on this.

I still very much wonder what the right analysis for comment 27 is. Before reasoning about flakes I'd try to fix the continuous failures. There are some, they seem reliable, and I think the reverts prove that it wasn't the V8 roll's fault.

Otherwise, it should be easy to run all this locally, and I'll allocate some time for this tomorrow if it isn't resolved by then. Is there some kind of stress mode for layout tests with which a single test can be run a few hundred times?
,
May 11 2017
,
May 11 2017
> Is there some kind of stress mode for layout tests with which a single test can be run a few 100 times? There's --repeat-each and --iterations, so --repeat-each=100 would do that for example.
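For intuition on how many repeats are worth running: if a test fails independently with probability p per run, the chance that n repeats surface at least one failure is 1 - (1 - p)^n. A quick sketch (the flake rates below are made up for illustration, not measured on these bots):

```python
# Chance that n repeated runs catch at least one failure of a test
# that flakes independently with probability p per run.
def p_catch(p: float, n: int) -> float:
    return 1.0 - (1.0 - p) ** n

# Hypothetical flake rates -- not measured on these bots.
for p in (0.01, 0.05, 0.20):
    print(f"p={p:.2f}: 10 runs -> {p_catch(p, 10):.3f}, "
          f"100 runs -> {p_catch(p, 100):.3f}")
```

Even at a 5% per-run flake rate, --repeat-each=100 catches a failure with better than 99% probability, while 10 local runs miss it about 60% of the time.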
,
May 11 2017
The commonality between http/tests/security/contentSecurityPolicy/directive-parsing-03.html, http/tests/security/contentSecurityPolicy/source-list-parsing-04.html and http/tests/misc/non-utf8-header-name.php (and their virtual equivalents) appears to be utf-8 in the HTTP header...
,
May 11 2017
I'm updating the TestExpectations file for the layout tests and going through some of the recent build failures (a bit slow to load). Should I be adding both virtual and non-virtual versions of the tests to the expectations? There are some tests that seem to be flaky on one but not the other (recently).
,
May 11 2017
At least the CSP tests aren't utf-8, but deliberately malformed headers, which Apache on some shards (sometimes) appears to be rejecting. I filed issue 721388 with an example earlier, before I realized this issue was related. The error from the apache server is:

  AH02430: Response header 'Content-Security-Policy' value of 'script-src 'none'; a\x07aa ; ' contains invalid characters, aborting request, referer: http://127.0.0.1:8000/security/contentSecurityPolicy/directive-parsing-03.html

The only reference I could find with a search for AH02430 was https://bz.apache.org/bugzilla/show_bug.cgi?id=60863
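For illustration, the stricter check the patched Apache applies is roughly the RFC 7230 field-value grammar: visible ASCII, space, tab, and obs-text bytes are allowed in a header value, while control bytes like BEL (\x07) are not. A simplified sketch of that rule (not Apache's actual code):

```python
def valid_field_value(value: bytes) -> bool:
    """Rough RFC 7230 check: field-content is VCHAR (0x21-0x7E) or
    obs-text (0x80-0xFF), with SP and HTAB allowed as separators.
    Other control bytes such as BEL (0x07) are invalid."""
    return all(b in (0x09, 0x20) or 0x21 <= b <= 0x7E or b >= 0x80
               for b in value)

# The deliberately malformed header from directive-parsing-03.html:
csp = b"script-src 'none'; a\x07aa ; "
print(valid_field_value(csp))                   # False: BEL byte is rejected
print(valid_field_value(b"script-src 'none'"))  # True: plain printable ASCII
```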
,
May 11 2017
,
May 11 2017
#44: yes, you should add both. I mentioned in #14 that the CSP tests are very unlikely to be related to the V8 roll, and I wasn't able to reproduce them locally.
,
May 11 2017
Based on the results of the "webkit_layout_tests (with patch)" step (which almost always fails) and the "webkit_layout_tests (without patch)" step (which almost always succeeds), I guess some test run before the flaky tests put the server into a special state where it returns "internal error" for the failed tests. Perhaps restarting the http server before each retry could keep such flakiness from breaking the bots. Will try.
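The idea in sketch form; the helper names here are hypothetical, not the actual webkitpy code:

```python
# Sketch: retry a flaky test, restarting the HTTP server between
# attempts to clear any bad state it may have accumulated.
def run_with_retries(test, server, max_retries=3):
    for attempt in range(max_retries):
        if attempt > 0:
            server.stop()   # restart before each retry
            server.start()
        if test():
            return True     # passed on this attempt
    return False            # failed every attempt
```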
,
May 11 2017
#47 Thanks, updating.
,
May 11 2017
,
May 11 2017
Thanks! I'll reenable v8 auto-roller as soon as your CL is landed.
,
May 11 2017
https://codereview.chromium.org/2878593003/ restarts http server before retries.
,
May 11 2017
The following revision refers to this bug: https://chromium.googlesource.com/chromium/src.git/+/362df6f9c79116139df5f89867599d4e0f61bbb9

commit 362df6f9c79116139df5f89867599d4e0f61bbb9
Author: Jennifer Apacible <apacible@chromium.org>
Date: Thu May 11 17:52:08 2017

Update TestExpectations for failing LayoutTests.

To unblock cq. See crbug for more details.

BUG=720511
TBR=thakis@chromium.org

Review-Url: https://codereview.chromium.org/2874213003
Cr-Commit-Position: refs/heads/master@{#471003}

[modify] https://crrev.com/362df6f9c79116139df5f89867599d4e0f61bbb9/third_party/WebKit/LayoutTests/TestExpectations
,
May 11 2017
Unfortunately, restarting the server doesn't fix the issue. Some of the tests that failed in the first run still failed in the retries after the http server restarted.
,
May 11 2017
The bot seems to lie that the failed tests succeeded without patch. Filed bug 721466 for it.
,
May 11 2017
Maybe apache was updated on the affected bots? I reenabled the v8-autoroller.
,
May 11 2017
,
May 11 2017
Hmm, are you sure just enabling the V8 roller is the right call? Do we know yet at all if V8 was culpable or not? I'd like to avoid being in the same situation tomorrow again... Has anyone tried to repro any of this locally yet with the revisions in question?
,
May 11 2017
Issue 721525 has been merged into this issue.
,
May 11 2017
OK, here are my points:
- I tried to reproduce these test failures locally, using the same sequence as on the try bot, with different GN build args (debug vs release), and running these tests thousands of times in random order, without any luck.
- blink_trusty_rel still sometimes fails with the same errors, even though V8 was reverted.
- The first time this error appeared was with an older version of V8.
- The error is actually an internal error of the Apache server during a run of PHP code; I can't imagine how a V8 roll could affect that. In any case it looks more like an infra problem than a real V8 problem.

I prefer to roll V8, because since this failure is an Apache server error I'm not sure how it could affect real Chrome users. Yes, it's risky because we still don't know the real cause, and I hope the current sheriffs are working on a fix.
,
May 11 2017
kozyatinskiy: please don't. the cq is still broken and has been for 2.5 days now. Let's get it working first, and then add more causes for problems.
,
May 11 2017
non-utf8-header-name.php is still failing on CQ even though it's not marked as failing in TestExpectations. Examples: https://build.chromium.org/p/tryserver.chromium.linux/builders/linux_chromium_rel_ng/builds/452242 https://build.chromium.org/p/tryserver.chromium.linux/builders/linux_chromium_rel_ng/builds/452250 apacible is moving the test to Skip instead: https://codereview.chromium.org/2879693002/
,
May 11 2017
Let's postpone the V8 roll. I'm only worried that once the infra is fixed we will need to roll a huge number of V8 commits at once, which would make further triaging super hard if something goes wrong; the longer we block the V8 roll, the more commits pile up.
,
May 11 2017
The following revision refers to this bug: https://chromium.googlesource.com/chromium/src.git/+/c3446d5d8214478acc171910990e3562eb0bee50

commit c3446d5d8214478acc171910990e3562eb0bee50
Author: Jennifer Apacible <apacible@chromium.org>
Date: Thu May 11 23:08:02 2017

Skip failing LayoutTest.

Still failing after updating the expectation to Failure Pass Timeout. Skipping the test for now. See crbug for more details.

BUG=720511
R=thakis@chromium.org

Review-Url: https://codereview.chromium.org/2879693002
Cr-Commit-Position: refs/heads/master@{#471110}

[modify] https://crrev.com/c3446d5d8214478acc171910990e3562eb0bee50/third_party/WebKit/LayoutTests/TestExpectations
,
May 12 2017
Any recommendation here? Now that tests are skipped, rolling should be safe again I assume?
,
May 12 2017
Since the tests are skipped, I'll slowly bring back the V8 rolls and in the meantime try to repro more stuff locally. Will start by relanding version 6.0.191, which had already landed once.
,
May 12 2017
Reland of latest V8 roll: https://codereview.chromium.org/2878883002/
,
May 12 2017
,
May 12 2017
,
May 12 2017
,
May 12 2017
Preliminary findings: Issue 721700 makes analysis hard, because only chromium_linux_rel uses swarming and 6 shards for layout tests, while the CI bots and blink trybots don't.

Attempts to run all this locally were rather sad. On my machine I get tons of timeouts in revisions supposed to be good. It seems to be very sensitive to the test filters; most problems don't repro when just running single tests. Is there a cheat sheet with hints beyond https://chromium.googlesource.com/chromium/src/+/master/docs/testing/layout_tests.md somewhere?

In https://codereview.chromium.org/2874933006 I bisected into V8 revisions and pinned Chromium to the known-to-be-good revision 4184757eab. I ran chromium_linux_rel 3 times for each patchset to rule out some flakiness. Findings:
- https://chromium.googlesource.com/v8/v8/+/918c23643bbd0 clearly causes issue 719837
- https://chromium.googlesource.com/v8/v8/+/dccfe5dbbe853 clearly fixes issue 719837
- There are also flakes, in particular once in https://build.chromium.org/p/tryserver.chromium.linux/builders/linux_chromium_rel_ng/builds/452944 on version dccfe5dbbe853

The trybots in patch 5 are currently running what just rolled into chromium again (V8 6.0.191), to also test this with an older chromium version that didn't skip the tests yet. I monitored the layout tests of chromium_linux_rel for 3 hours since the reland of 6.0.191 and there's nothing suspicious yet. Whatever it is seems highly flaky, and I don't really know how to proceed with unreliable local reproduction.
,
May 12 2017
More findings:
- With V8 dccfe5dbbe853 the set of tests failed in 1 out of 3 runs.
- With V8 c836a95e87a the set of tests seems to fail reliably, 4 out of 4 runs.

Still hard to conclude anything. It's possible that something between 5c40f75123 and c836a95e87a made those failures more likely, but with my testing methodology it might take a while to find out. My workflow for https://codereview.chromium.org/2874933006, if anybody is interested: https://paste.googleplex.com/6009542080987136

Making this available again, since I'm off for today. Reducing prio, since the problematic tests are currently skipped.
,
May 15 2017
removing from sheriff queue ("since the problematic tests are currently skipped")
,
May 15 2017
It looks like the check in Apache that results in the internal server error is from Apache 2.5 and was backported somewhere after 2.4.23 and before 2.4.25. In any case, both Apache versions are way newer than anything we usually run (2.4.10 or 2.4.7).
,
May 15 2017
The following revision refers to this bug: https://chromium.googlesource.com/chromium/src.git/+/cf2365ebd905b0dea55dc811a042ae148bdd1338

commit cf2365ebd905b0dea55dc811a042ae148bdd1338
Author: Michael Achenbach <machenbach@chromium.org>
Date: Mon May 15 12:53:08 2017

Layout tests: Log apache version

BUG=720511
R=jochen@chromium.org, tandrii@chromium.org

Review-Url: https://codereview.chromium.org/2884653002
Cr-Commit-Position: refs/heads/master@{#471739}

[modify] https://crrev.com/cf2365ebd905b0dea55dc811a042ae148bdd1338/third_party/WebKit/Tools/Scripts/webkitpy/layout_tests/port/base.py
,
May 15 2017
The added logging step claims it's always Apache version 2.4.7. In https://codereview.chromium.org/2879153002 I tried to stress the trybots a bit, adding the tests back. Strangely, in patch 3, the first 8 attempts in a row passed, then 7 in a row failed with the full list of unskipped tests failing. My second batch of attempts was around 1 hour later than the first. In the meantime, Chromium ToT changed a bit, but nothing that looks like e.g. a change of the test order: https://chromium.googlesource.com/chromium/src/+log/30e66e1ff6..481b55a2f
,
May 15 2017
Now checking whether at least the swarming results are consistent across different attempts.

Retried failed shard: https://chromium-swarm.appspot.com/task?id=3625408d27e8e610&refresh=10&show_raw=1
Retries:
https://chromium-swarm.appspot.com/task?id=36257523a375fc10&refresh=10&show_raw=1
https://chromium-swarm.appspot.com/task?id=362575646db5b510&refresh=10&show_raw=1
https://chromium-swarm.appspot.com/task?id=36257597190cd810&refresh=10&show_raw=1

This is running, will check back later.
,
May 15 2017
All swarming retries show the same failure. At least there we seem to be consistent. Next exercise would be to repro this locally which so far didn't work out.
,
May 16 2017
Taking back what I said in 78. I triggered more retries now, and they passed:
https://chromium-swarm.appspot.com/task?id=362a2588d2b84510&refresh=10&show_raw=1
https://chromium-swarm.appspot.com/task?id=362a1888d50e7510&refresh=10&show_raw=1
https://chromium-swarm.appspot.com/task?id=362a186394c43f10&refresh=10&show_raw=1

I also tried to trigger the same swarming task through the swarming client command line in various ways. So far no luck.
,
May 16 2017
Managed to trigger it once on swarming with one test only and let it fail: https://chromium-swarm.appspot.com/task?id=362a5cd339fd7110&refresh=10&show_raw=1 No clue what makes this bot different to any of the others though. Should all have the same image. Maybe the webserver starts up in a bad state from which it doesn't recover?
,
May 16 2017
Retriggered it a bit more. It either always fails: https://chromium-swarm.appspot.com/task?id=362a7961a7812510&refresh=10&show_raw=1 or always succeeds: https://chromium-swarm.appspot.com/task?id=362a794fdd361d10&refresh=10&show_raw=1 Here is my command for luci-py: https://paste.googleplex.com/5967683061284864
,
May 16 2017
OK, so the offending patch (CVE-2016-8743) is part of apache2 (2.4.7-1ubuntu4.15) but not part of apache2 (2.4.7-1ubuntu4.13).
I guess the bots differ by which package they have exactly.
The changelog says:
* WARNING: The fix for CVE-2016-8743 introduces a behavioural change and
may introduce compatibility issues with clients that do not strictly
follow specifications. A new configuration directive,
"HttpProtocolOptions Unsafe" can be used to re-enable some of the less
strict parsing restrictions, at the expense of security.
I guess we need to check for the presence of this patch, and apply that option.
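If the package version were reliably queryable on the bots (which the earlier logging finding suggests it isn't: the added logging always reported plain 2.4.7), one way to tell which bots carry the backport would be to compare the Ubuntu revision suffix rather than the upstream version. A hypothetical sketch; the version-string shape and the first-patched revision are assumptions taken from the comment above:

```python
import re

# First Ubuntu revision of apache2 2.4.7 said above to carry the
# CVE-2016-8743 backport (assumption from this thread, not verified).
FIRST_PATCHED = ("2.4.7", 15)

def parse_pkg_version(v: str):
    """Split e.g. '2.4.7-1ubuntu4.15' into the upstream version and
    the trailing Ubuntu revision number. The shape is an assumption."""
    m = re.fullmatch(r"([\d.]+)-\d+ubuntu\d+\.(\d+)", v)
    if not m:
        raise ValueError(f"unrecognized version string: {v}")
    return m.group(1), int(m.group(2))

def has_backport(v: str) -> bool:
    upstream, rev = parse_pkg_version(v)
    return upstream == FIRST_PATCHED[0] and rev >= FIRST_PATCHED[1]

print(has_backport("2.4.7-1ubuntu4.15"))  # True:  strict header parsing
print(has_backport("2.4.7-1ubuntu4.13"))  # False: old lenient behavior
```

Note that the fix that eventually landed feature-detects the behavior instead of comparing versions, which is more robust than trusting the version string.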
,
May 16 2017
The following revision refers to this bug: https://chromium.googlesource.com/chromium/src.git/+/3c5da4ae54598b8ebed887a52ee9b2763abb0047

commit 3c5da4ae54598b8ebed887a52ee9b2763abb0047
Author: Tsuyoshi Horo <horo@chromium.org>
Date: Tue May 16 14:57:44 2017

Add non-utf8-header-name.php of off-main-thread-fetch virtual test to TestExpectations

BUG=443374, 722774, 720511
TBR=falken

Review-Url: https://codereview.chromium.org/2887753002
Cr-Commit-Position: refs/heads/master@{#472105}

[modify] https://crrev.com/3c5da4ae54598b8ebed887a52ee9b2763abb0047/third_party/WebKit/LayoutTests/TestExpectations
,
May 16 2017
proposed fix: https://chromium-review.googlesource.com/c/505494
,
May 17 2017
The following revision refers to this bug: https://chromium.googlesource.com/chromium/src.git/+/d3e42cee3d26f0fc4730bd5d9e861a43e0fc739d

commit d3e42cee3d26f0fc4730bd5d9e861a43e0fc739d
Author: Jochen Eisinger <jochen@chromium.org>
Date: Wed May 17 04:39:50 2017

Feature-detect whether apache2 needs HttpProtocolOptions Unsafe to run

After CVE-2016-8743.patch, this option is required to simulate broken headers for layout tests. However, the option that was introduced to allow the old behavior is not supported by previous versions, so we can't just add it by default.

R=dpranke@chromium.org, machenbach@chromium.org
BUG=720511
Change-Id: I5bb77874e425b3e1869a6b15658325ec1ebae890

Reviewed-on: https://chromium-review.googlesource.com/505494
Reviewed-by: Dirk Pranke <dpranke@chromium.org>
Commit-Queue: Jochen Eisinger <jochen@chromium.org>
Cr-Commit-Position: refs/heads/master@{#472326}

[modify] https://crrev.com/d3e42cee3d26f0fc4730bd5d9e861a43e0fc739d/third_party/WebKit/Tools/Scripts/webkitpy/layout_tests/port/base.py
[modify] https://crrev.com/d3e42cee3d26f0fc4730bd5d9e861a43e0fc739d/third_party/WebKit/Tools/Scripts/webkitpy/layout_tests/port/base_unittest.py
[modify] https://crrev.com/d3e42cee3d26f0fc4730bd5d9e861a43e0fc739d/third_party/WebKit/Tools/Scripts/webkitpy/layout_tests/servers/apache_http.py
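The shape of that feature detection, as a sketch (assumed helper structure, not the actual webkitpy change): write a throwaway config containing the directive, ask httpd to syntax-check it, and only add the directive when the check passes. Only httpd builds with the CVE-2016-8743 backport recognize the directive, so older builds fail the syntax check and keep the old behavior.

```python
import os
import subprocess
import tempfile

DIRECTIVE = "HttpProtocolOptions Unsafe"

def supports_unsafe_protocol_options(run=None) -> bool:
    """Return True if this httpd accepts 'HttpProtocolOptions Unsafe'.
    `run` is injected for testing; by default it invokes the syntax
    check (binary name is an assumption; Debian calls it apache2)."""
    if run is None:
        def run(conf_path):
            return subprocess.run(
                ["httpd", "-t", "-f", conf_path],
                capture_output=True).returncode
    with tempfile.NamedTemporaryFile(
            "w", suffix=".conf", delete=False) as f:
        f.write(DIRECTIVE + "\n")
        path = f.name
    try:
        return run(path) == 0  # 0 means the directive is recognized
    finally:
        os.unlink(path)
```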
,
May 17 2017
,
May 18 2017
,
May 18 2017
The following revision refers to this bug: https://chromium.googlesource.com/chromium/src.git/+/c7a7012ae4f5ea85d17fd9ba9fb0892e080e877c

commit c7a7012ae4f5ea85d17fd9ba9fb0892e080e877c
Author: machenbach <machenbach@chromium.org>
Date: Thu May 18 12:03:11 2017

Unskip previously failing tests

The test failures were caused by the impact of: http://crbug.com/723721

BUG=720511

Review-Url: https://codereview.chromium.org/2879153002
Cr-Commit-Position: refs/heads/master@{#472772}

[modify] https://crrev.com/c7a7012ae4f5ea85d17fd9ba9fb0892e080e877c/third_party/WebKit/LayoutTests/TestExpectations
,
May 18 2017
The following revision refers to this bug: https://chromium.googlesource.com/chromium/src.git/+/9ad27e95181462f0187240acd01a2737139aa94c

commit 9ad27e95181462f0187240acd01a2737139aa94c
Author: machenbach <machenbach@chromium.org>
Date: Thu May 18 12:20:01 2017

Change bug for some layout test expectations

BUG=720511, 724027
NOTRY=true

Review-Url: https://codereview.chromium.org/2894573002
Cr-Commit-Position: refs/heads/master@{#472777}

[modify] https://crrev.com/9ad27e95181462f0187240acd01a2737139aa94c/third_party/WebKit/LayoutTests/TestExpectations
,
May 18 2017
Closing this, as the root cause was issue 723721. Some of the skipped tests got reenabled; the remaining ones migrated to issue 724027. Some of the other issues marked as blocking address various problems detected during the investigation here. Thanks a lot to everyone who spent/wasted time on this!
,
Sep 12 2017
Comment 1 by apaci...@google.com, May 10 2017