adapt_reason does not include CPU if the input resolution is reduced to match the encode target
Reported by email@example.com, Mar 22 2014
If a webrtc client can scale the input more efficiently than webrtc itself can, that client may want to observe adapt_reason and, when the adapter downscales due to CPU limitation, downscale the input to match. However, this creates a paradox: downscaling the input to match the encode resolution causes adapt_reason to become zero. Instead, adapt_reason should continue to indicate that encoding is CPU-limited until the CPU monitor determines that webrtc is ready to encode at the previous resolution again.

This requires some care to avoid "ping-pong" between resolutions, since reducing the encode resolution may trigger a chain of events that causes other CPU consumption to decrease. Perhaps webrtc should record any decrease in CPU utilization associated with a decrease in input resolution, so that it does not set adapt_reason to zero until a corresponding increase could be tolerated.
Mar 25 2014,
To clarify, the "ping-pong" problem can in principle also be avoided, for any given scenario, simply by lowering the adapt-up threshold.
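For illustration, the hysteresis behavior described above can be sketched as follows. This is not WebRTC's actual API: the class name, threshold values, and utilization numbers are all hypothetical. The key point is that the adapt-up threshold sits well below the adapt-down threshold, so adapt_reason keeps reporting CPU limitation inside the band between them, even after the input has been downscaled to match the encode target.

```cpp
// Hypothetical sketch: once the adapter has downscaled due to CPU, keep
// reporting CPU-limited until measured utilization drops far enough below
// the overuse threshold that the previous resolution could be tolerated.
enum AdaptReason { kAdaptReasonNone = 0, kAdaptReasonCpu = 1 };

class CpuAdaptSignal {
 public:
  // kHighThreshold triggers a downscale; the adapt-up threshold
  // kLowThreshold is comfortably lower to provide hysteresis and
  // avoid resolution ping-pong. Values chosen for illustration only.
  static constexpr double kHighThreshold = 0.85;
  static constexpr double kLowThreshold = 0.55;

  AdaptReason OnUtilizationSample(double utilization) {
    if (utilization > kHighThreshold) {
      cpu_limited_ = true;   // overuse: request a downscale
    } else if (utilization < kLowThreshold) {
      cpu_limited_ = false;  // enough headroom to try adapting up again
    }
    // Between the two thresholds the previous state is kept, so an input
    // that was downscaled to match still reports CPU-limited.
    return cpu_limited_ ? kAdaptReasonCpu : kAdaptReasonNone;
  }

 private:
  bool cpu_limited_ = false;
};
```

With this shape, lowering kLowThreshold widens the hysteresis band, which is exactly the "lower the adapt-up threshold" workaround mentioned above; the cost is that the stream stays at the reduced resolution longer than strictly necessary.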
Nov 4 2015,
This bug hasn't been modified for more than a year. Is this still a valid open issue?
Dec 10 2015,
Åsa, is this still an issue?
Dec 1 2016,
Åsa, Anything to do here or shall we close?
Aug 4 2017,
[Bulk edit] This issue was created more than a year ago and hasn't been modified in the last six months -> archiving. If this is still a valid issue that should be open, please reopen it.