
Issue 870816

Starred by 6 users

Issue metadata

Status: Fixed
Owner:
Last visit > 30 days ago
Closed: Aug 22
Cc:
Components:
EstimatedDays: ----
NextAction: ----
OS: Chrome
Pri: 1
Type: Bug




Update Failure on Scarlet

Reported by willg...@gmail.com, Aug 3

Issue description

UserAgent: Mozilla/5.0 (X11; CrOS aarch64 10922.0.0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/70.0.3505.0 Safari/537.36
Platform: Scarlet

Steps to reproduce the problem:
1. Open About Chrome OS
2. Trigger update
3. 

What is the expected behavior?
System updates

What went wrong?
It downloads and then stops at 100%.

Did this work before? N/A 

Chrome version: 70.0.3505.0  Channel: canary
OS Version: 10922.0.0
Flash Version: 

See attached update_engine logs.
 
update_engine.log.pdf
147 KB Download
Components: Internals>Installer
I will add my log
update_engine.log
23.6 KB View Download
Could someone please comment on whether this is a "my end" problem or a problem with the update process itself? Would I suffer issues if I run a recovery and then update to Canary?
I did try a recovery and can confirm that I am unable to update to Canary. 

Comment 5 Deleted

Looking through the update_engine logs, it looks like the failure is local. The OS appears to download correctly. The error I see is a firmware update failure; once the firmware update fails, the whole update is invalidated. Both sets of logs posted here have the same firmware update failure. The two devices are different, so they won't have the same firmware, and since one is on Beta and the other on Canary there are other differences as well. Since there aren't many reports, these may be isolated incidents.
Components: OS>Firmware
Labels: ReleaseBlock-Dev

Comment 8 Deleted

Labels: M-70
Labels: Hotlist-ConOps-CrOS

Comment 11 Deleted

As additional info to the OP's platform:
Platform: 10895.5.0 (Official Build) dev-channel elm
Firmware: Google_Elm.8438.140.0
Channel: Currently on canary

Comment 13 Deleted

Comment 14 Deleted

Hi, is this still a Dev blocker? If so, please update here with the plan to address this. Thanks.
Any updates?
Updating still fails.
I put the rootfs into rw mode, removed the /root/.force_firmware_update file, and added a /root/.leave_firmware_alone file. The update failed because it requires the updated firmware.
I next did another recovery back to stable. I tried an update to Canary but it failed again (logs attached). The update completed except for the firmware; the problems begin at line 576 of the log. Once the firmware fails, the update is rolled back.
I then updated to Dev Channel which worked as expected. 
I don't expect this is widespread but it is absolutely a problem for my device. I am willing to do whatever is necessary to track this down.
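For anyone wanting to try the same marker-file swap, a minimal sketch follows. It assumes a dev-mode device whose rootfs verification has already been removed and whose root filesystem is remounted read-write; the `skip_firmware_update` helper name and its `root` prefix parameter are mine, for illustration only.

```shell
# Hypothetical helper illustrating the marker-file swap described above.
# The root prefix is parameterized so the logic can be exercised against
# a scratch directory instead of a live device.
skip_firmware_update() {
  local root="${1:-}"
  # Drop the flag that forces a firmware update...
  rm -f "${root}/root/.force_firmware_update"
  # ...and add the flag asking the updater to leave the firmware alone.
  touch "${root}/root/.leave_firmware_alone"
}
```

Note that, as reported above, this did not help here: the OS update genuinely required the new firmware, so the updater refused to proceed without it.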
I'm not going to try to post a log, but I'm experiencing the exact same issue on my Acer R13. I tried to update while on the Canary channel, then I tried to update *to* Canary from Dev, Beta, and stable. All failed.
I would post the log.

/var/log/update_engine.log

I just copy the logs to ~/Downloads and then upload them.
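In practice that copy step looks something like the sketch below; the paths are the ones mentioned in this thread, and the `stage_log` wrapper name is mine, for illustration.

```shell
# Copy the update_engine log somewhere the Files app can upload from.
# Run from crosh -> shell on the affected device.
stage_log() {
  local src="${1:-/var/log/update_engine.log}"
  local dest_dir="${2:-$HOME/Downloads}"
  mkdir -p "$dest_dir"
  cp "$src" "$dest_dir/"
}
# stage_log   # then attach ~/Downloads/update_engine.log to the bug
```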
This might be a duplicate of crbug.com/872394, which is now closed. Can you pull in recent changes and see if this is still happening?
This report preceded crbug.com/872394 by 5 days. It is still an issue on ELM (Acer R13).
update_engine.log
941 KB View Download
Echoing the above, still seeing this issue on ELM
crrev.com/c/1176112 should hopefully fix the issue.
Many thanks. Do you have any idea when this will happen? Anxious to get back to Canary so I can find other bugs. ;-) 
This is the longest I have been away from Canary in two years.
It's moving through the submit process already. I'm not sure when the next CQ run will be, but I'd expect it to submit sometime this evening if everything goes well.
Ya I watched the crrev link. Thanks again. 
Project Member

Comment 27 by bugdroid1@chromium.org, Aug 16

The following revision refers to this bug:
  https://chromium.googlesource.com/chromiumos/platform/mosys/+/2f0e8a8a99b1ea9c0fa697caf01bc70d93a64863

commit 2f0e8a8a99b1ea9c0fa697caf01bc70d93a64863
Author: Samantha Miller <samanthamiller@google.com>
Date: Thu Aug 16 07:30:10 2018

mosys: Add pipe to fix chromeos-firmwareupdate on elm

chromeos-firmwareupdate on elm needs pipe system call.
Add it to the arm seccomp filter.

BUG= chromium:870816 
TEST=none

Change-Id: I8846254b524ac078ae8a304324e7e039a01b762e
Reviewed-on: https://chromium-review.googlesource.com/1176112
Commit-Ready: Samantha Miller <samanthamiller@google.com>
Tested-by: Samantha Miller <samanthamiller@google.com>
Reviewed-by: Jason Clinton <jclinton@chromium.org>

[modify] https://crrev.com/2f0e8a8a99b1ea9c0fa697caf01bc70d93a64863/seccomp/mosys-seccomp-arm.policy

Labels: -Pri-2 Pri-1
Owner: samanthamiller@chromium.org
Status: Assigned (was: Unconfirmed)
I don't know if the changes made it to the latest Chrome OS update but the latest attempt at an update failed.  Partial log is as follows:

Updated kernel 2 with Successful = 0 and NumTriesLeft = 6
Checking /mnt/stateful_partition/unencrypted permission.
Permission is ok.
Starting firmware updater (//usr/sbin/chromeos-firmwareupdate --mode=autoupdate)
Command: //usr/sbin/chromeos-firmwareupdate --mode=autoupdate
Starting Google_Elm firmware updater v4 (autoupdate)...
 - Updater package: [Google_Elm.8438.140.0 / EC:elm_v1.1.4818-e120dd6]
 - Current system:  [RO:Google_Elm.8438.19.0 , ACT:Google_Elm.8438.140.0 / EC:elm_v1.1.4818-e120dd6]
 - Write protection: Hardware: ON, Software: Main=ON
 * invoke: flashrom -p host -r _current/bios.bin
Firmware update available: Google_Elm.8438.140.0. 
 * invoke: flashrom -p host --fast-verify -w bios.bin -i RW_SECTION_A
Child process did not exit normally.
Failed to read VBNV from flash.
mosys invocation was: ["nvram", "vboot", "read"]
Application error: Subcommand execution finished with error -1
waitpid() or mosys error
Parameter fw_try_next is read-only
ERROR: cannot SET crossystem property: fw_try_next=A
ERROR: Execution failed: ./updater4.sh (error code = 1)
Finished after 7 seconds.
Failed Command: //usr/sbin/chromeos-firmwareupdate --mode=autoupdate - Exit Code 1
Firmware update failed (error code: 1).
Rolling back update due to failure installing required firmware.
Successfully updated GPT with all settings to rollback.
PostInstall Failed
A manual try at a firmware update
localhost / # chromeos-firmwareupdate -d -m autoupdate
 (DEBUG) /tmp/tmp.wC4Mi0WAuU/bin/crossystem works fine.
 (DEBUG) Using programs in /tmp/tmp.wC4Mi0WAuU/bin.
 (DEBUG) cros_acquire_lock: Set lock file to /tmp/chromeos-firmwareupdate-running.
 (DEBUG) No PD firmware bundled in updater, ignored.
Starting Google_Elm firmware updater v4 (autoupdate)...
 - Updater package: [Google_Elm.8438.140.0 / EC:elm_v1.1.4818-e120dd6]
 - Current system:  [RO:Google_Elm.8438.19.0 , ACT:Google_Elm.8438.140.0 / EC:elm_v1.1.4818-e120dd6]
 (DEBUG) preserved HWID as: ELM D4A-C2A-G3A-A2B.
 - Write protection: Hardware: ON, Software: Main=ON
 (DEBUG) cros_check_compatible_platform image=Google_Elm platform=Google_Elm
 (DEBUG) No keysets folder.
 (DEBUG) preparing main firmware images...
 (DEBUG) trying to read main firmware from system EEPROM...
 * invoke: flashrom -p host -r _current/bios.bin
 (DEBUG) cros_compare_file(_current/main/VBLOCK_B, _target/main/VBLOCK_B): 2cd0888bc3b5939d0177bbcb6c49092a, 2cd0888bc3b5939d0177bbcb6c49092a
Firmware update available: Google_Elm.8438.140.0. 
 (DEBUG) preparing main firmware images...
 (DEBUG) prepare_main_current_image: Use existing cache.
 (DEBUG) cros_compare_file(_gk1_strip, _gk2_strip): b836ad2b24e871b05064340dbbec984e, b836ad2b24e871b05064340dbbec984e
 (DEBUG) tpm_fwver: 65537
 (DEBUG) data_key_version: 1
 (DEBUG) firmware_version: 1
 (DEBUG) fw_key_version: 65537
 (DEBUG) preamble_flags: 0
 (DEBUG) invoking: crosfw_update_main(RW_SECTION_A)
 * invoke: flashrom -p host --fast-verify -w bios.bin -i RW_SECTION_A
 (DEBUG) cros_compare_file(_current/main/RW_LEGACY, _target/main/RW_LEGACY): 2fdd6851b32ae931637d4845c037b550, 2fdd6851b32ae931637d4845c037b550
 (DEBUG) RW_LEGACY not changed.
Firmware update (autoupdate) completed.
The commit Samantha made hasn't landed yet:

https://storage.googleapis.com/chromium-find-releases-static/2f0.html#2f0e8a8a99b1ea9c0fa697caf01bc70d93a64863

You can use the commit tool found in https://omahaproxy.appspot.com/ to check when it lands.
Thanks for the update.
I'm not really familiar with the update/release process. When would we know if this is fixed? Alternatively, is there an elm device I can lock? That way I could build the image manually and load it to test.
This likely didn't completely fix the issue, but I expect crrev.com/c/1178966 will. This at least solves the problem on hana.
If someone could give me a little direction, I would be happy to apply changes manually. 
If you're able to modify /usr/share/policy/mosys-seccomp.policy, you can just add lines to that. I haven't been able to do that, though. I've been testing by emerging mosys and doing 'cros deploy' onto the target machine. I'm not sure what the equivalent would be for you, but maybe that'll point you in the right direction?
/usr/share/policy/mosys-seccomp.policy does not exist on my machine. What would I need to add?
On the machine you're running the firmware update on? It should be there.

The things to add would just be the changes in the CLs. Since Scarlet is arm, mosys/seccomp/mosys-seccomp-arm.policy is installed to /usr/share/policy/mosys-seccomp.policy. The recent lines have been:

pipe: 1
statfs: 1
prctl: 1
sigreturn: 1

This just whitelists those system calls so mosys is allowed to use them.
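A sketch of appending those entries by hand follows. The `add_syscalls` helper is mine, for illustration; the policy path and syscall names are the ones given above, and on a real device the rootfs must be writable first.

```shell
# Append seccomp policy entries of the form "name: 1", skipping any
# syscall that is already listed, so repeated runs stay idempotent.
add_syscalls() {
  local policy="$1"; shift
  local sc
  for sc in "$@"; do
    grep -q "^${sc}:" "$policy" || echo "${sc}: 1" >> "$policy"
  done
}
# On the device, once / is read-write:
# add_syscalls /usr/share/policy/mosys-seccomp.policy pipe statfs prctl sigreturn
```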
localhost / # ls -l /usr/share/policy/
total 48
-rw-r--r--. 1 root root  824 Jul 26 01:37 apk-cache-cleaner-seccomp.policy
-rw-r--r--. 1 root root 1210 Jul 26 01:37 arc-adbd-seccomp.policy
-rw-r--r--. 1 root root 1493 Jul 26 01:45 arc-oemcrypto-seccomp.policy
-rw-r--r--. 1 root root  917 Jul 26 01:21 conntrackd-seccomp.policy
-rw-r--r--. 1 root root 1424 Jul 26 01:35 cupsd-seccomp.policy
-rw-r--r--. 1 root root  817 Jul 26 01:35 cupstestppd-seccomp.policy
-rw-r--r--. 1 root root 1083 Jul 26 01:37 ippusb-manager-seccomp.policy
-rw-r--r--. 1 root root 1447 Jul 26 01:32 ippusbxd-seccomp.policy
-rw-r--r--. 1 root root  606 Jul 26 01:35 lpadmin-seccomp.policy
-rw-r--r--. 1 root root 1591 Jul 26 01:38 midis-seccomp.policy
-rw-r--r--. 1 root root 1270 Jul 26 01:50 nfqueue-seccomp.policy
-rw-r--r--. 1 root root  909 Jul 26 01:26 sslh-seccomp.policy
localhost / # find / -name *.policy
/opt/google/touch/policies/rmi4update.update.policy
/opt/google/touch/policies/wacom_flash.update.policy
/opt/google/touch/policies/rmi4update.query.policy
/opt/google/touch/policies/wacom_flash.query.policy
/opt/google/cros-disks/avfsd-seccomp.policy
/opt/google/containers/android/rootfs/root/system/etc/seccomp_policy/mediacodec-seccomp.policy
/opt/google/containers/android/rootfs/root/system/etc/seccomp_policy/mediaextractor-seccomp.policy
/opt/google/imageloader/imageloader-helper-seccomp.policy
/opt/google/imageloader/imageloader-seccomp.policy
/opt/google/mtpd/mtpd-seccomp.policy
/usr/share/policy/ippusbxd-seccomp.policy
/usr/share/policy/arc-adbd-seccomp.policy
/usr/share/policy/ippusb-manager-seccomp.policy
/usr/share/policy/arc-oemcrypto-seccomp.policy
/usr/share/policy/conntrackd-seccomp.policy
/usr/share/policy/cupstestppd-seccomp.policy
/usr/share/policy/apk-cache-cleaner-seccomp.policy
/usr/share/policy/lpadmin-seccomp.policy
/usr/share/policy/midis-seccomp.policy
/usr/share/policy/nfqueue-seccomp.policy
/usr/share/policy/cupsd-seccomp.policy
/usr/share/policy/sslh-seccomp.policy
Maybe that's why your attempt in #30 doesn't have the same failure. You shouldn't be seeing these mosys errors if you don't have the seccomp policy.
#30 was a separate firmware update not part of the OS update.
The update_engine.log in the first post here shows the same failure.
https://bugs.chromium.org/p/chromium/issues/detail?id=872394 shows the same. There is another link somewhere that references several platforms with firmware update issues elm, scarlet, hana and one more if I remember. The OS update does fail at the firmware update stage. The firmware update is obviously necessary since I tried changing the /root/.force . . . to /root/.leave . . .  and it failed with the reason something like "firmware needed". There is at least one other elm machine reporting the failure here. 
I would love to just get back to working, but that doesn't help if this problem moves to other releases. I would really like to help track down the issue. Tell me what I can do to help.
update_engine logs from the latest attempt.
update_engine.log
1.7 MB View Download
I imagine crrev.com/c/1178966 will fix the issue once it goes in. I don't have an elm to test on, so I can't promise that it will, but it fixes exactly the same issue on other boards (specifically hana). Now, we're just waiting for that to pass through the CQ.

crbug.com/872394 is tracking the same issue, if you want more information.

I see several release numbers in those logs: 10895.5.0, 10962.0.0, 10965.0.0, 10968.0.0, 10971.0.0, and 10974.0.0. The first should have neither the seccomp policy nor the problem. The rest should have both the policy file and this issue.
Are you saying that mosys-seccomp.policy file is part of the OS update? If I were to copy that file to the policy folder, would that be an unofficial test? 
It looks like the policy file is part of the update. It was definitely added to Chrome OS between those versions (10924.0.0, to be exact).

I don't really know how the updates are working, but if you can replace whatever policy file that's coming from the updated versions with your own, that should help. I generally can't write to that part of the system because it's marked as read-only.

You're free to try to play around with things, but I expect the errors to go away once that CL makes it through CQ, so that's my main focus right now.
I saw the change has been pushed. 
ChromeOS bot Change has been successfully pushed.
I could put the fs into rw mode but I guess I will wait. 
Just to update this, I did change the RW status on the file system and added the file. The update failed again but I expected that. The updater would be looking at the file in the other slot, not on the current install. I will do a recovery and put the fs back to RO.
I'm on the dev channel on elm. Once the fix is working, what will average joes like me need to do before attempting to update? Power wash? Switch back to stable channel?
If you enabled Canary, it should just update. Going up the chain (stable -> beta -> dev -> canary) you just update. Going the other way will force a powerwash.
Right, but I'm just wondering if there's some extra step I need to take in order to get the benefit of whatever is being fixed by the devs.
If the next update fixes the firmware updater, it happens as part of the update. Most updates don't include a firmware update. The firmware doesn't change that often. 
If the patch was included in today's Chrome update, it did not fix the issue. The firmware updater still failed. This is an elm device (Acer R13).
Logs attached.
update_engine.log
334 KB View Download
10981.0.0 update failed as well.

update_engine.log
70.1 KB View Download
The CL hasn't been able to pass through the CQ yet.
Thanks for the update. Any idea why it has not passed through?
Something else was causing the CQ to fail Friday and then the CL didn't get marked as innocent so didn't automatically get sent through again. This happens reasonably regularly.
Thanks. I shall wait, impatiently patient, then.

Do you know why this bug took so long to get attention? When I report bugs, they usually get triaged and assigned reasonably quickly. I wasn't sure whether this was a bug or my machine, so I started a post on the Chromium OS group. At some point, Will G posted his bug to the mailing list and I added my info to this bug. On the 7th I got some help from Mike Frysinger, who "slightly triaged it", but it still took a while to be picked up. I do understand everyone is overloaded, vacation time, etc., but let me know if there are things I should be doing differently.

I think everyone's been busy? I'm just an intern, so I don't know much about how bugs get assigned. I just deal with the bugs that get sent to me.

I think the early logs didn't really point to the problem. The first clue that this was mosys related was that it looked a lot like crbug.com/872394, which included the mosys errors. That's how I originally found the bug. The proof that this was actually the problem for this bug came in comment #21. I don't know why your earlier logs don't have the mosys errors.

I can't say how different this is than normal. As stated, I'm just an intern.
Thanks.
For reference update for ELM 10987.0.0/70.0.3524.2 failed. Logs attached
update_engine.log
379 KB View Download
Status: Fixed (was: Assigned)
Using version 10991.0.0 on elm, I can confirm that the firmware updater now completes. I'm closing this bug.
Will it update off the broken build that I originally reported? Or will I need to get off this channel and then update back? Because it isn't updating...
It doesn't look like 10991 is live yet. 
Right after I posted that 10991 wasn't available, it became available. I had moved back to Dev, and just rebooted into the current Canary. I would try again.
Samantha,
Thank you for your efforts. 
Confirmed working on ELM
Works now, I had to bounce down to Dev and update back to Canary for it to work. Thanks.
Chell update failed with same symptoms from 70.0.3538.41
Post your /var/log/update_engine.log. That will help us see whether it is a similar issue.
