In the following code in webrtc::VideoQualityObserver::OnDecodedFrame:
  rtc::Optional<int> avg_interframe_delay =
      interframe_delays_.Avg(kMinFrameSamplesToDetectFreeze);
  // Check if it was a freeze.
  if (num_frames_decoded_ > kMinFrameSamplesToDetectFreeze &&
      interframe_delay_ms >=
          std::max(3 * *avg_interframe_delay,
                   *avg_interframe_delay + kMinIncreaseForFreezeMs)) {
The optional avg_interframe_delay can be empty (undefined), but it is still dereferenced in the arguments to std::max. This leads to an out-of-bounds stack read, and to an assert in a debug build.
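For illustration only, here is a small standalone sketch of the missing has-value check, using std::optional in place of rtc::Optional; the constant values, main() scaffolding, and printed messages are made up for the example and are not the upstream code or the actual fix:

#include <algorithm>
#include <cstdint>
#include <iostream>
#include <optional>

int main() {
  // Placeholder values, not the real WebRTC constants.
  constexpr int kMinFrameSamplesToDetectFreeze = 5;
  constexpr int64_t kMinIncreaseForFreezeMs = 150;

  // Simulates Avg() returning no value, e.g. because too few
  // interframe-delay samples have been collected.
  std::optional<int64_t> avg_interframe_delay;
  int num_frames_decoded = 30;
  int64_t interframe_delay_ms = 500;

  // Guarded version: only dereference the optional after confirming it
  // holds a value. Without the has_value() check, *avg_interframe_delay
  // reads an unset value (undefined behavior with std::optional, a DCHECK
  // in a WebRTC debug build).
  if (num_frames_decoded > kMinFrameSamplesToDetectFreeze &&
      avg_interframe_delay.has_value() &&
      interframe_delay_ms >=
          std::max(3 * *avg_interframe_delay,
                   *avg_interframe_delay + kMinIncreaseForFreezeMs)) {
    std::cout << "freeze detected\n";
  } else {
    std::cout << "no freeze (or not enough data)\n";
  }
}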
To reproduce:
1) Apply new.patch to a webrtc tree and build video_replay
2) Call video_replay --input_file ./oob with the attached files in the same directory
I think this issue probably does not have a security impact, but I am filing it as a security issue just in case.
Attachments (deleted):
  new.patch (22.4 KB)
  oob_rtpdump (2.1 MB)
  oob_config (1.9 KB)
Comment 1 by mea...@chromium.org, Jun 5 2018
Merged into: 847809
Status: Duplicate (was: Unconfirmed)