
Issue 625208 link

Starred by 9 users

Issue metadata

Status: Duplicate
Merged: issue 703122
Owner:
Closed: Sep 12
Cc:
Components:
EstimatedDays: ----
NextAction: ----
OS: Windows
Pri: 1
Type: Feature

Blocked on:
issue 651800




WebRtc: Need support for getContributingSources in Chrome

Reported by rajsripe...@gmail.com, Jul 1 2016

Issue description

UserAgent: Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/51.0.2704.103 Safari/537.36

Steps to reproduce the problem:
The current WebRTC implementation gives no indication of which speakers are contributing to the mixed audio received by the client. The proposal, which is part of the WebRTC 1.0 specification (http://w3c.github.io/webrtc-pc), is to extend RTCRtpReceiver in the following way.

The getContributingSources method extends the RTCRtpReceiver object, providing information on the sources contributing to an incoming media stream.
partial interface RTCRtpReceiver {
    sequence<RTCRtpContributingSource> getContributingSources ();
};

interface RTCRtpContributingSource {
    // Time of reception of the most recent RTP packet containing the source.
    readonly attribute DOMHighResTimeStamp timestamp;
    readonly attribute unsigned long       source;
    // The audio level of the contributing source. Value is between 0 and -127,
    // representing the contributing source dBov value, as described in [RFC6465].
    readonly attribute byte?               audioLevel;
    // Whether the last RTP packet received from this source contains voice
    // activity (true) or not (false).
    readonly attribute boolean?            voiceActivityFlag;
};

What is the expected behavior?
This will enable applications built on WebRTC to do things like the following in a conference:
1) In multi-view, show a visual indication of the current active speakers.
2) In dominant-speaker single-view, display the name of the user whose video is currently shown.
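The use cases above could be sketched roughly as follows. This is an illustrative sketch, not code from the proposal: the function name, thresholds, and defaults are assumptions, and it interprets audioLevel as the dBov value between 0 and -127 described in the IDL above (0 loudest, -127 silence).

```javascript
// Hypothetical helper: given the array returned by
// receiver.getContributingSources(), return the CSRCs of the current
// active speakers, loudest first. Thresholds are illustrative.
function pickActiveSpeakers(sources, { maxAgeMs = 2000, minDbov = -50 } = {}) {
  if (sources.length === 0) return [];
  // Timestamps share a local clock, so compare each one to the newest.
  const newest = Math.max(...sources.map((s) => s.timestamp));
  return sources
    .filter((s) => newest - s.timestamp <= maxAgeMs)                 // recently heard
    .filter((s) => s.audioLevel != null && s.audioLevel >= minDbov)  // loud enough
    .sort((a, b) => b.audioLevel - a.audioLevel)                     // loudest first
    .map((s) => s.source);                                           // CSRC identifiers
}

// In a real page this would be driven by a timer, e.g.:
//   setInterval(() => {
//     updateSpeakerUI(pickActiveSpeakers(receiver.getContributingSources()));
//   }, 250);
```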

What went wrong?
Lack of this support will make it difficult for applications to do the above.

Did this work before? No 

Chrome version: 51.0.2704.103  Channel: canary
OS Version: 10.0
Flash Version: Shockwave Flash 22.0 r0
 

Comment 1 by b...@chromium.org, Jul 1 2016

Components: Blink>WebRTC
Status: Untriaged (was: Unconfirmed)
Marking the above issue as Untriaged as this is a feature request.

Dev team will take a call on the above request.


Thank you!
Cc: hta@chromium.org
Lack of support for CSRC events will require the app to create expensive workarounds involving transmitting this information on the signalling channel. We are currently debating whether to implement this workaround or wait for Chrome to implement this feature. It will be helpful if you can give us some idea on when we can expect this feature to light up. 

Comment 5 by hta@chromium.org, Jul 15 2016

Labels: -Pri-2 Pri-3
If you need a predictable timeline, I recommend doing the workaround.

The feature is on the list of features to support, but since it only affects interoperation with MCUs (meshes, SFUs and transport relays are not affected, since they don't use contributing sources), it has not received a high priority so far, and we have not committed to a schedule for implementation.

The workaround is not very straightforward, as the signalling channel is not real-time and the CSRC information can get stale by the time it reaches the client. The workaround is also costly, as it involves changing multiple entities in the path. We will start implementing the workaround, but it would be great if you could give this request a higher priority.
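The client side of the workaround described above might look roughly like this. The message shape, class name, and staleness window are illustrative assumptions, not details from the thread; the point is that the client must stamp each update on arrival and discard stale CSRC data, which is exactly the fragility being complained about.

```javascript
// Sketch of the signalling-channel workaround: the MCU periodically sends
// active-speaker (CSRC) updates over the signalling channel, and the client
// stamps each update with its local arrival time so stale data can be
// dropped. All names and the staleness window are illustrative.
class SpeakerTracker {
  constructor(maxAgeMs = 1000, now = Date.now) {
    this.maxAgeMs = maxAgeMs;
    this.now = now;      // injectable clock, useful for testing
    this.latest = null;  // last update received, with local arrival time
  }

  // Called when a message like { csrcs: [1, 8] } arrives on the channel.
  onSignallingMessage(msg) {
    this.latest = { csrcs: msg.csrcs, receivedAt: this.now() };
  }

  // Returns the current speakers, or [] if the last update is too old;
  // stale CSRC data is worse than none.
  activeSpeakers() {
    if (!this.latest) return [];
    if (this.now() - this.latest.receivedAt > this.maxAgeMs) return [];
    return this.latest.csrcs;
  }
}
```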

Comment 7 by guidou@chromium.org, Oct 26 2016

Components: -Blink>WebRTC Blink>WebRTC>Network
Blockedon: 651800
Cc: deadbeef@chromium.org
Labels: WebRTCTriaged
Status: Available (was: Untriaged)

Comment 9 by reniar...@gmail.com, Jul 17 2017

Is it normal for the timestamps to be of an integer type, not a decimal? (see attached screenshot, the last value is from performance.now())

How can these timestamps be compared to know how many seconds/milliseconds have passed? From the WebRTC 1.0 documentation, the DOMHighResTimeStamp type leads to https://www.w3.org/TR/hr-time-2/, which suggests the value should be a decimal, but as you can see in the screenshot, it comes back as an integer, and on a completely different scale.

Attachment: sources.PNG (28.4 KB)
DOMHighResTimeStamp is in units of milliseconds, and we happen to be using a clock value with only millisecond precision, so it's always an integer (for now).

This timestamp is not guaranteed to be in units relative to "performance.now" or even since Jan 1 1970, it's only guaranteed to use "a local clock". So I think we're standards-compliant here. You can compare the times to see, for example, that audio was received from CSRC 1 180ms more recently than CSRC 8.
I see. How does one determine how much time has passed since this timestamp? I understand it's a local timestamp, but relative to what then? If I want to avoid a nasty unreliable hack for comparing it, it limits its usage quite a lot if I don't know how much time has passed.
Could you explain more about what you're trying to do? Is it not enough to assume that the most recent timestamp represents a packet that arrived pretty recently? I can try raising the issue with the WebRTC working group, but they're pretty resistant to changes at this point, so the more information I have to make a case the better.
What I'm trying to do is make it more reactive - when a participant stops talking, at the moment, it's kept in the list of contributing sources for another 10 seconds (goal is to reduce that). I'd like to compare the timestamp to a current value, to see how many msec ago the source was contributing (so instead of 10 seconds, it can be easily and reliably configured to something lower), which is hard/hacky to do without knowing the reference point of the timestamp (getting current value of the timestamp). I'd like to treat each contributing source individually, not compare the timestamps between each other.
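One clock-agnostic way to achieve what the exchange above is after: poll getContributingSources() and treat a source whose timestamp has stopped advancing for N polls as no longer talking. Each source's timestamp is only compared with its own earlier values, so the clock's epoch never matters. This is a sketch under assumptions; the class name and the poll budget are illustrative.

```javascript
// Hypothetical watcher: feed it the result of
// receiver.getContributingSources() once per poll interval; a CSRC whose
// timestamp has not advanced for `quietPolls` consecutive polls is dropped
// from the active list. Epoch-independent by construction.
class ContributionWatcher {
  constructor(quietPolls = 3) {
    this.quietPolls = quietPolls;
    this.state = new Map(); // csrc -> { lastTimestamp, stale }
  }

  poll(sources) {
    for (const s of sources) {
      const prev = this.state.get(s.source);
      if (!prev || s.timestamp > prev.lastTimestamp) {
        // New source, or its timestamp advanced: it is still contributing.
        this.state.set(s.source, { lastTimestamp: s.timestamp, stale: 0 });
      } else {
        prev.stale += 1; // no new RTP packet since the last poll
      }
    }
    return [...this.state.entries()]
      .filter(([, v]) => v.stale < this.quietPolls)
      .map(([csrc]) => csrc);
  }
}
```

With a 250 ms poll interval and quietPolls = 3, a participant would drop out of the list roughly 750 ms after they stop talking, instead of the 10 seconds mentioned above.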

Comment 14 by hta@webrtc.org, Oct 3 2017

These 2 places are the only references in the WebRTC spec to "clock". How much effort would it take to make these clocks return the same clock as performance.now() (and make the corresponding change to the spec)?

Comment 15 by hta@webrtc.org, Oct 3 2017

Alternatively, make it equivalent to the getStats() timestamp, which is the DOMHighResTimeStamp version of Unix gettimeofday(2).
Components: -Blink>WebRTC>Network Blink>WebRTC>PeerConnection
Labels: -Pri-3 Pri-1
Owner: hbos@chromium.org
Status: Assigned (was: Available)
RTCRtpContributingSources has already landed, but the audioLevel field is missing.
Assigning to hbos@ to complete this.
Mergedinto: 703122
Status: Duplicate (was: Assigned)
This bug is a bit old and the spec has changed, so it's better to split this up:
- getContributingSources(): Already shipped a long time ago, https://crbug.com/703122.
- getSynchronizationSources(): Started, https://crbug.com/883287.
- RTCRtpContributingSource.audioLevel: Filed https://crbug.com/883288.
- RTCRtpSynchronizationSource.voiceActivityFlag: Filed https://crbug.com/883289.

I'm going to close this as a duplicate. See the bugs linked above for further updates.
