Issue 693978

Starred by 1 user

Issue metadata

Status: Fixed
Owner:
Closed: Mar 2017
Cc:
Components:
EstimatedDays: ----
NextAction: ----
OS: Mac
Pri: 2
Type: Bug




AudioContext resume() promise resolves before currentTime starts updating

Reported by iamcraig...@gmail.com, Feb 19 2017

Issue description

UserAgent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_2) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/56.0.2924.87 Safari/537.36

Steps to reproduce the problem:
1. Go to this page https://jsfiddle.net/craigiam/0jjkb4y0/1/
2. Press the play button and notice how the timer starts ticking up immediately
3. Press the stop button (which will suspend the AudioContext)
4. Press the play button again

What is the expected behavior?
The timer should start ticking up at the same time as the state changes back to running.

According to the spec, calling resume() should:

> Resumes the progression of the BaseAudioContext's currentTime when it has been suspended.

What went wrong?
There is a delay between the time the state changes to “running” and the time AudioContext.currentTime starts updating.
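To make the observation concrete, here is a rough sketch of the kind of measurement the fiddle is doing (my own reduction, not the exact fiddle code):

  // Rough sketch, not the exact fiddle code.
  const ctx = new AudioContext();

  async function measureResumeGap() {
    const before = ctx.currentTime;        // frozen while the context is suspended
    const t0 = performance.now();
    await ctx.resume();                    // state reads "running" once this resolves
    console.log('resume() resolved after', (performance.now() - t0).toFixed(0),
                'ms, state =', ctx.state);

    // Poll until currentTime actually moves past its pre-resume value.
    const poll = setInterval(() => {
      if (ctx.currentTime > before) {
        console.log('currentTime advancing after',
                    (performance.now() - t0).toFixed(0), 'ms');
        clearInterval(poll);
      }
    }, 10);
  }

On the affected setup, the second timestamp lands about half a second to a second after the first.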

Did this work before? N/A 

Chrome version: 56.0.2924.87  Channel: stable
OS Version: OS X 10.12.2
Flash Version: Shockwave Flash 24.0 r0

The behavior seems to be the same in Firefox and Safari, so I wonder if this has something to do with the macOS audio drivers in Sierra. I don’t have other versions of macOS or OS X to check on.

I attached a screen-capture GIF of what I am seeing. This is a new late-2016 MacBook Pro, if that makes any difference.

 
Attachment: audiocontextbug.gif (134 KB)
Oops. The demo page should actually be https://jsfiddle.net/craigiam/0jjkb4y0/2/
Labels: Needs-Triage-M56

Comment 3 by hdodda@chromium.org, Feb 20 2017

Cc: hdodda@chromium.org
Labels: Needs-Feedback
Tested on macOS 10.12.2 using Chrome M56 (56.0.2924.87) and observed the behavior shown in the attached screencast.

Attached screencast for reference.

@iamcraigcampbell: Could you please check the attached screencast and let us know if we missed anything in reproducing the issue, or provide a screencast of the expected result, which would help us triage the issue better.

Thanks!
Attachment: 693978.mp4 (827 KB)
@hdodda thanks for your reply. What you are seeing in your screencast is the expected behavior.

If you open the GIF I attached, you can see the delay: it says “running” about a half second to a second before the time starts ticking up.

I just realized something that I did not notice when making the ticket. This only happens when the audio output is set to an AirPlay device rather than Internal Speakers. Since AirPlay works over the network, this delay is very likely intended to keep the timecode in sync with what users are hearing. Curiously, when using the `<audio>` tag with AirPlay, the currentTime is not delayed and updates immediately, even though what you hear is delayed.
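Roughly, the comparison amounts to something like the following sketch (the element lookup is a placeholder, not the exact test page):

  // Placeholder sketch of the <audio> vs. AudioContext comparison.
  const audioEl = document.querySelector('audio');  // <audio> playing over AirPlay
  const ctx = new AudioContext();

  setInterval(() => {
    // With an AirPlay output selected, audioEl.currentTime advances right after
    // play(), while ctx.currentTime stalls for a moment after resume().
    console.log('<audio>:', audioEl.currentTime.toFixed(2),
                'AudioContext:', ctx.currentTime.toFixed(2));
  }, 250);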

Perhaps rtoy@chromium.org can confirm my suspicion, and then feel free to close this ticket!
Attachment: Screen Shot 2017-02-20 at 8.30.04 AM.png (58.7 KB)
Even if this is the expected behavior, I think it would be better if the promise resolution were delayed to match the actual audio playback (when using AirPlay), because that would allow a web application to show some sort of loading state to communicate the delay to the user before they hear the audio.
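To illustrate what that would enable, a sketch (showSpinner/hideSpinner are hypothetical UI hooks, not real APIs):

  // Sketch only; showSpinner()/hideSpinner() are placeholder UI functions.
  async function playWithFeedback(ctx) {
    showSpinner();        // let the user know audio is starting
    await ctx.resume();   // if this resolved only once audio actually flows...
    hideSpinner();        // ...the spinner would cover the AirPlay latency
  }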
Hmm, one more strange thing: when using AirPlay but not calling AudioContext.suspend(), the timecode begins ticking up immediately as well.
Feel free to update the title of this ticket to “AudioContext resume() promise resolves before currentTime starts updating when using AirPlay”
Components: -Blink Blink>WebAudio

Comment 9 by rtoy@chromium.org, Feb 24 2017

Labels: -Needs-Feedback
Status: Available (was: Unconfirmed)
From the code it does appear that we set the state to running as soon as resume() is called.  We should probably change this so that the state is set when the context actually starts running, which, as you show, can take some time.

Comment 10 by rtoy@chromium.org, Feb 25 2017

Labels: Needs-Feedback
I tried this at home with my AVR that supports AirPlay.  Using Chrome 57 (beta) on macOS Sierra and the jsfiddle in c#1, I am unable to reproduce what you show.

I press stop, wait a bit, then press play.  It shows running and the current time incrementing right away.  I modified your fiddle a bit to add an oscillator so I can hear things.  When I press stop, it takes about 3 sec for the audio to stop.  And after pressing play, it takes about 3 sec before I hear audio again.

I have a patch to set the running state only when the promise is resolved (which happens when the audio device starts requesting data).  This doesn't seem to change anything either.

Could you try with Chrome 57 (or canary) to see if things have changed?
I just tested with Canary 58.0.3022.3 and the behavior for me is the same as I am seeing with stable (where it switches to running about a half second before the context.currentTime starts incrementing).

One interesting thing though. If you look at the screen recording gif I added in the original description, you can see the time DOES jump up when the state changes to running, but it doesn’t start actually ticking up smoothly until about a half second later.

The amount of delay also differs depending on which AirPlay device I use. Going directly to my AirPort Express has a delay, as does going to my Raspberry Pi (using shairport-sync), but when I select my Apple TV there is no delay.

Comment 12 by rtoy@chromium.org, Mar 6 2017

I tried this out again, but this time using speakers attached to an AirPort Express base station.  I see what you see: pressing play causes the state to change immediately to running, but the context time doesn't start updating for half a second or so.  However, when the state changes to running, I hear a small click.  I think that means the audio output has been restarted.  Then about a half second later, the context time starts running. Some time after that I can hear audio.

My CL didn't change this behavior so it will take a bit of time to understand what's happening and how to fix it.

Comment 13 by rtoy@chromium.org, Mar 7 2017

For the record, here is a slightly modified fiddle that plays a tone:  https://jsfiddle.net/e8oem50h/

Comment 14 by rtoy@chromium.org, Mar 7 2017

Oops.  Make that https://jsfiddle.net/e8oem50h/1/
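(For readers without the fiddle handy, it amounts to roughly the following; this is a reconstruction with guessed element ids, not the actual fiddle source.)

  // Rough reconstruction of the tone fiddle; ids and layout are guesses.
  const ctx = new AudioContext();
  const osc = ctx.createOscillator();
  osc.connect(ctx.destination);
  osc.start();

  document.querySelector('#play').onclick = () => ctx.resume();
  document.querySelector('#stop').onclick = () => ctx.suspend();

  setInterval(() => {
    document.querySelector('#time').textContent =
        ctx.state + ' ' + ctx.currentTime.toFixed(2);
  }, 100);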

Comment 15 by rtoy@chromium.org, Mar 8 2017

I tried again today and can no longer reproduce the issue with any of the repro cases.  Pressing play causes "running" to be displayed and the context time to start incrementing without pauses.

I believe the new code is correct.  The promise is delivered only when the audio context has restarted and the audio HW is requesting data, and the context state is updated at the point the promise is delivered.
Comment 16 by bugdroid1@chromium.org (Project Member), Mar 9 2017

The following revision refers to this bug:
  https://chromium.googlesource.com/chromium/src.git/+/dc4cda062ada67cd3c845149d58a4dc7545a00c1

commit dc4cda062ada67cd3c845149d58a4dc7545a00c1
Author: rtoy <rtoy@chromium.org>
Date: Thu Mar 09 21:23:49 2017

Set "Running" state only when actually running

When resuming an AudioContext, the state should be set to "running"
when the context has actually started running again.  Previously, this
was set when resume() was called.  Depending on the device, it may
take a while before the device actually starts running, so we were
prematurely setting the state to "running".

This can't be tested in a layout test with offline context.

BUG= 693978 
TEST=none

Review-Url: https://codereview.chromium.org/2717613007
Cr-Commit-Position: refs/heads/master@{#455867}

[modify] https://crrev.com/dc4cda062ada67cd3c845149d58a4dc7545a00c1/third_party/WebKit/Source/modules/webaudio/AudioContext.cpp
[modify] https://crrev.com/dc4cda062ada67cd3c845149d58a4dc7545a00c1/third_party/WebKit/Source/modules/webaudio/BaseAudioContext.cpp
[modify] https://crrev.com/dc4cda062ada67cd3c845149d58a4dc7545a00c1/third_party/WebKit/Source/modules/webaudio/BaseAudioContext.h

Comment 17 by rtoy@chromium.org, Mar 11 2017

Owner: rtoy@chromium.org
Status: Started (was: Available)
Ok.  That didn't actually fix it.  And I can reproduce the issue pretty reliably now.  It seems the key is to have two sources playing to the AirPlay speakers.  In my case, Chrome beta using the URL from c#14 and Chromium ToT (with the CL) running the same URL.

What I see is that when play is pressed, "running" is displayed AND the context time is changed from what it was before (context time incremented by about 25 ms).  Then it stops updating for about half a second and resumes again.

If macOS is really resuming and then pausing, there's not really much we can do about it.

It would be great if you could try out Chrome Canary and let us know what you get.
I am seeing the same behavior as you just described in canary. Do you think a bug should be filed with Apple/Safari then? 

The same behavior is present in Safari too, but I assumed that Safari borrowed most of its Web Audio API code from the WebKit changes made by the Chrome team before WebKit was forked, so I wasn’t sure whether it was an OS issue or a browser issue.

Comment 19 by rtoy@chromium.org, Mar 11 2017

So resuming shows the time update, then pause for a bit, and then resume?

It can't hurt to file an issue with Apple.  The last time I looked, Safari's WebAudio implementation was really quite out-of-date compared to the current spec. Can't really say anything about the internal implementation, though.

And you got the heritage wrong.  Chrome's WebAudio was identical to Safari's until Blink forked from WebKit.

I'm going to do a little more investigation, but if what I see is really true, then there's not much we can do.
Okay thanks! I will see about filing a ticket there. 

Also, I don’t think I got it wrong; I think my wording was just confusing :)

I meant that the original implementation of the Web Audio API in Safari was copied from the Chrome implementation in WebKit, before Chrome forked WebKit as Blink.
Also, yes: it shows the time update immediately, then pause for about half a second, and then start ticking up smoothly.

Comment 22 by rtoy@chromium.org, Mar 12 2017

Chrome and Safari shared the WebAudio implementation because WebKit was used for both browsers.

Anyway, I did a bit more testing and here is a partial log:

resolvePromisesForResume: currentFrame = 190208, time = 1489341530.92562
currentFrame = 190208, time = 1489341530.92565
currentFrame = 190336, time = 1489341530.92567
currentFrame = 190464, time = 1489341532.29049
currentFrame = 190592, time = 1489341532.29051
currentFrame = 190720, time = 1489341532.29643
currentFrame = 190848, time = 1489341532.29645
currentFrame = 190976, time = 1489341532.30208

The first line is when the promises for resume are being resolved.  (The context has restarted and the audio thread is posting a task to the main thread to handle the promises.)  The currentFrame is the context's current frame (currentTime * sampleRate).  The time is the value of gettimeofday.

This shows that when the current frame goes from 190336 to 190464, about 1.36 sec of real time elapses. I think this means that when we restart the audio thread, the Mac audio system is restarted; it runs for three render quanta, then pauses for 1.36 sec before continuing smoothly after that.
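For reference, working out the numbers (assuming the default 44.1 kHz sample rate, which the log doesn't state):

  190464 - 190336 = 128 frames, i.e. one render quantum
  128 / 44100 ≈ 2.9 ms of audio time
  1489341532.29049 - 1489341530.92567 ≈ 1.36 s of wall-clock time

So one render quantum of context time spanned over a second of real time, which points at the audio stream stalling rather than the context clock misbehaving.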

I'm inclined not to do anything about this.  The context did start as requested and we delivered the promises and updated the state as expected.  Not much we can do if the audio drivers decide to pause rendering for whatever reason.

Comment 23 by rtoy@chromium.org, Mar 15 2017

Status: Fixed (was: Started)
Closing as fixed.  If you think this is incorrect, please re-open or file a new bug (with component Blink>WebAudio).

