Build a model that estimates power usage from CPU usage and idle wakeups.
Issue description

Basically, perform an analysis using BattOr to get real power numbers from test pages that stress CPU usage and idle wakeups. Then build a model that lets us estimate real power draw based on CPU usage and idle wakeups.
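A minimal sketch of what such a model could look like: a least-squares fit of measured power against CPU usage and idle wakeups. The sample numbers and the `estimate_power` helper below are hypothetical placeholders, not real BattOr measurements.

```python
import numpy as np

# Hypothetical BattOr samples from stress-test pages: each row is
# (cpu_fraction, idle_wakeups_per_s, measured_power_watts).
samples = np.array([
    [0.05,  50, 0.9],
    [0.10, 100, 1.3],
    [0.30, 150, 2.4],
    [0.50, 120, 3.2],
    [0.80, 300, 5.1],
])

# Least-squares fit of: power ~ a*cpu + b*wakeups + c (baseline draw).
X = np.column_stack([samples[:, 0], samples[:, 1], np.ones(len(samples))])
y = samples[:, 2]
(a, b, c), *_ = np.linalg.lstsq(X, y, rcond=None)

def estimate_power(cpu_fraction, wakeups_per_s):
    """Estimate power draw (watts) from trace-derived metrics."""
    return a * cpu_fraction + b * wakeups_per_s + c
```

Once fitted on real BattOr data, `estimate_power` could be applied to trace-derived CPU and wakeup counts without needing the hardware attached.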
Dec 16 2016
Oh, yes, sorry, did not mean to start the trek down the rabbit hole. Several thoughts:

* If we can't model this using a *very* simple linear equation, I don't think it's worth doing. I'm sure there are PhD students who spend years working on this.
* I specifically wanted to know an approximation for the cost of CPU usage and idle wakeups, because those are easily measurable from traces. Powering up a 3G radio [which will hit 3W+ power draw] is obviously going to use a lot of power, and will be very hard to nail down.
* I was mostly hoping to get estimates like:
  * waking up the CPU costs 0.22J
  * running a sandy lake core i7 [or whatever we have on the perf bots] at 10% for 1s costs 0.8J

If this would take more than a couple of days, I don't think we should do it, and we should just keep using BattOr as the "source of truth".

* The main goal is to get some granularity into where that power draw is going in the lab.
* And to give us some numbers we can start collecting from the wild. [although maybe you're right that network radio is the most relevant measurement...]
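To make the shape of the proposed approximation concrete, here is a sketch using per-event costs like the ones floated above. Both constants are hypothetical placeholders (the 0.22J and "10% for 1s costs 0.8J" figures were illustrative guesses, not measurements), and `estimate_energy_j` is an assumed helper name.

```python
# Hypothetical per-event costs, echoing the illustrative numbers above:
JOULES_PER_WAKEUP = 0.22      # guessed cost of one idle wakeup
JOULES_PER_CPU_SECOND = 8.0   # from "10% for 1s costs 0.8J" -> ~8J per CPU-second

def estimate_energy_j(cpu_seconds, idle_wakeups):
    """Rough trace-based energy estimate in joules."""
    return JOULES_PER_CPU_SECOND * cpu_seconds + JOULES_PER_WAKEUP * idle_wakeups

# e.g. a trace slice with 0.1 CPU-seconds and 5 idle wakeups:
# estimate_energy_j(0.1, 5)  ->  0.8 + 1.1 ≈ 1.9 J
```

The appeal of this form is exactly what the comment argues for: both inputs come straight from traces, so the estimate can be computed in the lab and in the wild without BattOr hardware.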
Dec 16 2016
SGTM, thanks for clarifying. For the estimates you're thinking of gathering, I think we will want to sample a reasonable number of devices, to make sure the numbers are at least approximately comparable.
Dec 29 2016
Not much to say, other than that I generally agree with the sentiment that we should know the relative cost of performing different actions like waking up the CPU or powering up the 3G radio. This seems incredibly important for prioritizing power work.
Comment 1 by tdres...@chromium.org, Dec 16 2016