Refactor viz::GLHelper to have an async YUV readback component from texture
Issue description

This idea came up in issue 727385.
- We shouldn't do it using viz::GLHelper, but rather move the functionality into a smaller component.
- This component should read back YUV. That would be an improvement over ARGB: the conversion drops from 32 bpp to 12/20 bpp.
- It requires some more piping for the alpha channel and flipped origin from SkImage.
- Beyond the performance benefit, another reason to proceed with YUV readback is color space conversion: libyuv currently uses BT.601 conversion factors, but the media video pipeline assumes BT.709 (the difference between the old standard-definition TV standard and HDTV).
Jan 12 2018
We had a discussion with miu@ and concluded to extract the code that reads a TEXTURE into CPU-readable YUV or ARGB buffers into a smaller component. Ideally it could live inside the renderer process, be constructed with the context, and be used by multiple clients; a couple of WebRTC use cases already exist. We can start by forking the existing code in GLHelper. Note that if there were a Skia alternative for this, e.g. SkImage::readYUVPixelsAsync(), all of this would be unnecessary. bsalomon@, do you have any input on the feasibility of that? Also, junov@, would this be useful for any Blink calls as well?
Jan 12 2018
Something like that is certainly possible. If we did something like this, what would be the mechanism for the client to know that the result is ready? A GLsync object returned from the call?
Jan 12 2018
I am not sure about native callbacks between Skia and Chrome. But we could define an AsyncReadClient interface on the Skia side; an instance in Chrome would be passed with the call, and AsyncReadClient::OnPixelsRead() would let the Chrome side know that the data is ready.
Comment 1 by laforge@google.com, Nov 8 2017