Gaps when recording using MediaRecorder API (audio/webm opus)
Reported by raj...@verbit.ai, Aug 14
Issue description
Chrome Version: 68.0.3440.106 (Official Build) (64-bit)
URLs (if applicable):
Other browsers tested: NONE
Add OK or FAIL, along with the version, after other browsers where you
have tested this issue:
Safari:
Firefox:
Edge:
Problem Description
I have an issue with the MediaRecorder API (https://www.w3.org/TR/mediastream-recording/#mediarecorder-api).
I'm using it to record speech from a web page in Chrome and to save the audio as chunks. I need to be able to play the recording both while and after it is being recorded, so it's important to keep those chunks.
Here is the code that records the data:

navigator.mediaDevices.getUserMedia({ audio: true, video: false }).then(function(stream) {
  recorder = new MediaRecorder(stream, { mimeType: 'audio/webm; codecs="opus"' })
  recorder.ondataavailable = function(e) {
    // Read the Blob from `e.data`, base64 it and send it to the server
  }
  recorder.start(1000) // emit a chunk roughly every 1000 ms
})
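For local testing it may be easier to keep the chunks in memory instead of uploading them; a minimal sketch of that variant (the chunks array name is just an assumption, reused in the sketches further below):

var chunks = []
recorder.ondataavailable = function(e) {
  chunks.push(e.data) // each e.data is a Blob containing a WebM/Opus fragment
}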
The issue is that the WebM file I get when I concatenate all the parts is (rarely) corrupted. I can play it as WebM, but when I try to convert it to something else (with ffmpeg), I get a file with shifted timings.
At the beginning of the file the audio is the same, but after about 2 minutes the WebM file plays with a small delay.
After some research, I found out that it also does not play correctly with the browser's MediaSource API, which I use for playing the chunks.
I tried two ways of playing those chunks:
1) When I just merge all the parts into a single blob, it works fine.
2) When I append them via a SourceBuffer object, there are gaps (I can see them by inspecting the buffered property) -- see the sketch below.
I was able to reproduce the issue on https://jsfiddle.net/96uj34nf/4/
To see the problem, click the "Print buffer zones" button and it will display the buffered time ranges: 0 - 136.349, 141.388 - 195.439, 197.57 - 198.589. You can see that there are two gaps:
136.349 - 141.388
195.439 - 197.57
So, as you can see, there are roughly 5-second and 2-second gaps.
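For reference, a minimal sketch of the MediaSource playback path described in point 2 (not the exact jsfiddle code; it assumes the chunks array from above and Blob.arrayBuffer() support):

var audio = document.querySelector('audio')
var mediaSource = new MediaSource()
audio.src = URL.createObjectURL(mediaSource)

mediaSource.addEventListener('sourceopen', async function() {
  var sourceBuffer = mediaSource.addSourceBuffer('audio/webm; codecs="opus"')
  for (var chunk of chunks) {
    sourceBuffer.appendBuffer(await chunk.arrayBuffer())
    // Wait for each append to finish before queueing the next one
    await new Promise(function(resolve) {
      sourceBuffer.addEventListener('updateend', resolve, { once: true })
    })
  }
  // "Print buffer zones": a gap shows up as a new range starting after the
  // previous one ended (e.g. 0 - 136.349, then 141.388 - 195.439)
  for (var i = 0; i < sourceBuffer.buffered.length; i++) {
    console.log(sourceBuffer.buffered.start(i) + ' - ' + sourceBuffer.buffered.end(i))
  }
})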
I would be happy if someone could shed some light on why this is happening or how to avoid this issue.
Thank you
Aug 16
rajesh@ Thanks for the issue. I was able to reproduce this issue on Mac OS 10.13.3, Windows 10 and Ubuntu 17.10 on the reported version 68.0.3440.106 and on the latest Canary 70.0.3523.0, following the original comment. This is a non-regression issue, as the behavior is observed from M-60 Chrome builds onward. Hence marking this as Untriaged for further updates from the dev team. Thanks.
Aug 29
Can we reproduce this error in Chrome before 60?
Aug 29
Or maybe there is a workaround?
Dec 19
I'm having the same issue. Is there any workaround?
Jan 4
rajesh@, for further triaging, am I right to understand this:
- When concatenating the recorded Blobs and saving to disk, the resulting file can be played back correctly (e.g. with VLC), since you said "I can play it as WebM".
- It's only when we feed the individual Blobs to a MediaSource that we see the strange gaps?
I took a look at the jsfiddle, but it doesn't use MediaRecorder, so I'm a bit lost as to how to further triage this issue.
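(As a point of reference, the concatenation described in the first point presumably amounts to something like this sketch, reusing the assumed chunks array of recorded Blobs:)

// Merge all recorded chunks into one Blob and download it for offline playback
var merged = new Blob(chunks, { type: 'audio/webm; codecs="opus"' })
var a = document.createElement('a')
a.href = URL.createObjectURL(merged)
a.download = 'recording.webm'
a.click()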
Jan 8
Hi! On the jsfiddle you can see two ways of playing a recorded file: as a single blob (in that case it _seems_ to play fine because it jumps over those zones) and via MediaSource (where it stops playing just before such a zone). When playing the file with VLC, those zones are jumped over too.
There is no MediaRecorder in the jsfiddle because the bug is not always reproducible, so a jsfiddle for the recorder would not help. But I posted the code which I use in the first message.
My issue is that:
1) I cannot play such a file with MediaSource without hacks.
2) The timings are really weird in the "blob" solution too, and I cannot trust them.*
So MediaRecorder produces a file which I cannot properly use without performing backend transcoding on it.
* Try setting the time to 135 s and playing from there. You'll hear "this study is" at around 2:18. But if you set the time to 142 s, you'll hear the same text at 2:23. It's because of this "jumping over the missing part".
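One example of the kind of hack meant above (a sketch only; it papers over the symptom rather than fixing the timestamps, and the 10-second threshold is an assumption): whenever playback stalls at the edge of a buffered range, jump currentTime over the gap into the next range.

audio.addEventListener('waiting', function() {
  var buffered = audio.buffered
  for (var i = 0; i < buffered.length; i++) {
    // If playback is stuck just before the next buffered range, skip into it
    if (audio.currentTime < buffered.start(i) &&
        buffered.start(i) - audio.currentTime < 10 /* assumed max gap, in seconds */) {
      audio.currentTime = buffered.start(i) + 0.01
      break
    }
  }
})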
Jan 9
Issue 796474 has been merged into this issue.
Jan 9
Some thoughts... Quoting https://crbug.com/796474#c12: "We took a look and saw that audio frames aren't being dropped, instead multiple frames were generated consecutively with the same timestamp. The total length of these additional frames correspond to the audio gap heard in other browsers/players."
MediaRecorder's AudioTrackEncoder encodes audio as fast as it is provided to it, and provides a capture timestamp [1] which then ripples through to the WebmMuxer [2] (since video and audio might come from different timestamp origins, we do the std::max() you see there).
I can imagine the audio being provided in batches to the AudioTrackEncoder in situations of CPU overload; in this case the |capture_time_of_first_sample| in [1] should still convey the original capture time (and intended playback time). (If an encoded video frame had for any reason provided a more advanced |most_recent_timestamp_| in [2], then all the encoded audio chunks would be written with the same timestamp in the file -- but the recording in the bug description doesn't have video, hence I don't think the latter is the case.)
If the audio in these batches has a timestamp that does not reflect the capture/intended presentation time for some reason, though, then the produced WebM file would have an incongruence between the audio frame timestamps and the amount of audio samples provided, which players would need to reconcile: it seems the single-Blob file player plays it according to the amount of encoded samples provided [3] (probably VLC as well), whereas MediaSource doesn't like those nearly identical timestamps and skips some of them (and ffmpeg seems to dislike them as well).
[1] https://cs.chromium.org/chromium/src/content/renderer/media_recorder/audio_track_opus_encoder.cc?sq=package:chromium&dr=CSs&g=0&l=184
[2] https://cs.chromium.org/chromium/src/media/muxers/webm_muxer.cc?sq=package:chromium&dr=CSs&g=0&l=356
[3] https://bugs.chromium.org/p/chromium/issues/detail?id=796474#c14
Jan 9
MSE trusts frame timestamps. It must, because at its core MSE gives apps the flexibility to scatter-shot the media timeline with frames (to enable splicing, adaptation, etc. scenarios). There are potential workarounds, but none of them is, IMHO, as good as fixing the original media's timestamps.
Jan 11
The NextAction date has arrived: 2019-01-11