Add support for IOSurface implementation of WebGL with alpha:false
Issue description

I'm breaking this out of Issue 581777 because there is going to be a lot of discussion, and that issue is cluttered with other IOSurface-related bugs.

The current implementation of WebGL supports alpha:false by attaching a texture with internal format GL_RGB to the back buffer. This doesn't work for IOSurfaces because IOSurfaces don't appear to work correctly with internal format GL_RGB.

-------------- Explanation of the broken behavior of IOSurfaces --------------

Quoting the OpenGL 4.0 spec, section 4.3.1: https://www.opengl.org/registry/doc/glspec40.core.20100311.pdf

"""
When READ_FRAMEBUFFER_BINDING is non-zero, the red, green, blue, and alpha values are obtained by first reading the internal component values of the corresponding value in the image attached to the selected logical buffer. Internal components are converted to an RGBA color by taking each R, G, B, and A component present according to the base internal format of the buffer (as shown in table 3.11). If G, B, or A values are not present in the internal format, they are taken to be zero, zero, and one respectively.
"""

If we try this out, we see that a texture with standard allocation has the right behavior (A=1), but a texture backed by an IOSurface has the wrong behavior (A=0). https://bugs.chromium.org/p/chromium/issues/detail?id=581777#c26

If we read directly from the IOSurface, we see that A=0. There is no spec for IOSurfaces, so it seems unlikely Apple will change this behavior. More worrying for us, quoting ccameron, once again from Issue 581777:

"""
So it appears that they're programming the hardware to zero out all alpha bits it touches when drawing to a GL_RGB-ified BGRA IOSurface.
"""

Furthermore, both ccameron@ and I have had problems with the Core Animation side of GL_RGB IOSurfaces. They just appear to behave incorrectly. See the opening post on Issue 581777.
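For concreteness, a minimal readback sketch (setup assumed, not taken from the linked test) of the behavior being compared:

"""
// Allocate an ordinary GL_RGB texture and attach it to a framebuffer.
GLuint tex, fbo;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 1, 1, 0, GL_RGB, GL_UNSIGNED_BYTE, nullptr);
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, tex, 0);

unsigned char rgba[4] = {0};
glReadPixels(0, 0, 1, 1, GL_RGBA, GL_UNSIGNED_BYTE, rgba);
// Per the spec text above, rgba[3] should read back as 255 (A=1) because the
// attachment has no alpha component. That holds for a normally allocated
// texture, but a texture backed by a BGRA IOSurface bound with internal format
// GL_RGB reads back rgba[3] == 0 instead.
"""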
Mar 19 2016
Interestingly a similar problem has come up on other platforms -- see http://crbug.com/449150 and https://codereview.chromium.org/1700433002/ . It seems fine to me to have slightly undefined behavior in this case. These textures will never be exposed directly to the web, so web developers will never see the difference. Chromium developers might need to be a bit careful, but the IOSurfaces should always be zeroed out, so there is no security implication of displaying random garbage.
Mar 24 2016
I have what appears to be a working implementation of GL_RGB emulation: https://codereview.chromium.org/1830453002/ I'm breaking it into smaller CLs, and I also need to fix some plumbing along the way.

According to the documentation for CreateGpuMemoryBufferImageCHROMIUM as well as IOSurface (CGLTexImage2D...), the "internalformat" must be either GL_RGB or GL_RGBA. But callers of CreateGpuMemoryBufferImageCHROMIUM on OS X currently pass in GL_BGRA! That's confusing, so I wrote a CL to fix this: https://codereview.chromium.org/1832923002/

This CL runs into some test failures. After doing some digging: ResourceProvider::LazyCreateImage calls gl->CreateImageCHROMIUM, passing internalformat=GLInternalFormat(resource->format). On Mac, when resource->format is BGRA_8888, internalformat gets set to GL_BGRA.

There appears to be a lot of confusion and inconsistency in the handling of CHROMIUM images. As I see it:

1. Every image has an associated pixel format. (Henceforth known as "format".)
2. When an image is bound to a texture, an internal format must be specified. (Henceforth known as "internalformat".)
3. Chromium extensions to GLES (such as CreateGpuMemoryBufferImageCHROMIUM and CreateImageCHROMIUM) take an "internalformat", but not a "format". Since the former needs to create an image, there exists a function that maps from internalformat -> format. This mapping currently lives in ImageFactory::DefaultBufferFormatForImageFormat; it converts from GLenum -> gfx::BufferFormat.
4. cc passes around resources that only include a "format". When the resource needs to be bound to a texture, ResourceProvider calls gl->CreateImageCHROMIUM and uses the function GLInternalFormat to convert from cc::ResourceFormat -> GLenum (to get an "internalformat").
5. [For the cases I'm interested in] when an IOSurface is bound to a texture, its "format" is GL_BGRA but its "internalformat" is GL_RGBA. Neither of the conversion functions in (3) or (4) performs this conversion, which is why the existing code is really confusing. CreateGpuMemoryBufferImageCHROMIUM gets passed internalformat=GL_BGRA, which only bears loose resemblance to either the "format" or the "internalformat", both of which get determined by gl_image_io_surface.mm. [The logic in the anonymous functions TextureFormat() and DataFormat() is a dead giveaway that something weird is going on.]

-----------------------------------

[Will follow up with proposed solutions]
Mar 25 2016
Under the current interface, BindTexImage2DCHROMIUM does not accept an "internalformat" parameter, but the implementation requires one, so the state must already live on the Image. This is currently done in CreateGpuMemoryBufferImageCHROMIUM and CreateImageCHROMIUM, both of which accept an "internalformat" parameter, but not a "format" parameter.

As far as I can tell, the only reason that CreateImageCHROMIUM takes an "internalformat" is to keep it around for an eventual call to BindTexImage2DCHROMIUM. [From a semantics perspective, the ClientBuffer already has a pixel format, so there is no point in passing in the same parameter again. This currently gets DCHECKed with IsImageFormatCompatibleWithGpuMemoryBufferFormat, but failing to pass in the same parameter should actually be an error.]

CreateGpuMemoryBufferImageCHROMIUM is similar to CreateImageCHROMIUM, but also infers the "format" from the "internalformat" using ImageFactory::DefaultBufferFormatForImageFormat.

-------------------------------

My suggestion is (see the sketch below):

1. Change BindTexImage2DCHROMIUM() to accept an "internalformat" parameter.
2. Change CreateImageCHROMIUM() to accept neither "format" nor "internalformat".
3. Change CreateGpuMemoryBufferImageCHROMIUM() to accept a "format" parameter.

kbr, reveman: Does this sound reasonable?
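To make the suggestion concrete, here is an illustrative sketch of what the adjusted entry points might look like (parameter names and exact types are guesses, not an agreed-upon API):

"""
// (1) internalformat moves to bind time.
void glBindTexImage2DCHROMIUM(GLenum target, GLint imageId, GLenum internalformat);

// (2) Image creation takes neither "format" nor "internalformat"; the pixel
//     format is implied by the ClientBuffer itself.
GLuint glCreateImageCHROMIUM(ClientBuffer buffer, GLsizei width, GLsizei height);

// (3) Buffer-backed image creation takes a pixel format ("format") rather than
//     an "internalformat".
GLuint glCreateGpuMemoryBufferImageCHROMIUM(GLsizei width, GLsizei height,
                                            GLenum format, GLenum usage);
"""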
Mar 25 2016
reveman@ would be the best person to evaluate these changes. They seem OK to me but I don't know this extension well. To be sure: BindTexImage2DCHROMIUM only needs the internalformat, and so the needed information will be available from images created both via CreateImageCHROMIUM and CreateGpuMemoryBufferImageCHROMIUM? The image's original format wasn't needed?
Mar 25 2016
> The image's original format wasn't needed?

It is needed, but the information is already present on the GLImage.
Mar 25 2016
The following revision refers to this bug: https://chromium.googlesource.com/chromium/src.git/+/4d8dc295cccc146320088799dd9af2f90d71232d commit 4d8dc295cccc146320088799dd9af2f90d71232d Author: erikchen <erikchen@chromium.org> Date: Fri Mar 25 04:37:08 2016 Clean up calls to CreateGpuMemoryBufferImageCHROMIUM. This CL should not induce any behavior changes. IOSurfaces expect a format of GL_BGRA, and an internal format of GL_RGBA. Callers of CreateGpuMemoryBufferImageCHROMIUM were incorrectly passing in GL_BGRA as |internalformat|, and the implementation of CreateGpuMemoryBufferImageCHROMIUM was incorrectly allowing this. This CL updates callsites to pass in GL_RGBA, and updates the implementation of CreateGpuMemoryBufferImageCHROMIUM to not accept GL_BGRA. BUG= 595948 CQ_INCLUDE_TRYBOTS=tryserver.chromium.win:win_optional_gpu_tests_rel;tryserver.chromium.mac:mac_optional_gpu_tests_rel Review URL: https://codereview.chromium.org/1832923002 Cr-Commit-Position: refs/heads/master@{#383243} [modify] https://crrev.com/4d8dc295cccc146320088799dd9af2f90d71232d/content/browser/compositor/gpu_process_transport_factory.cc [modify] https://crrev.com/4d8dc295cccc146320088799dd9af2f90d71232d/gpu/command_buffer/client/gles2_implementation.cc [modify] https://crrev.com/4d8dc295cccc146320088799dd9af2f90d71232d/gpu/command_buffer/service/image_factory.cc [modify] https://crrev.com/4d8dc295cccc146320088799dd9af2f90d71232d/gpu/command_buffer/tests/gl_copy_tex_image_2d_workaround_unittest.cc [modify] https://crrev.com/4d8dc295cccc146320088799dd9af2f90d71232d/gpu/command_buffer/tests/gl_iosurface_readback_workaround_unittest.cc [modify] https://crrev.com/4d8dc295cccc146320088799dd9af2f90d71232d/third_party/WebKit/Source/platform/graphics/Canvas2DLayerBridge.cpp
Mar 25 2016
I'm not following exactly what the problem is here, but let me reiterate some basics about format and internalformat first so there's no confusion as to what they are used for.

gfx::BufferFormat "format" - this is a pixel format, and it describes the exact layout of pixels in a buffer. This needs to be known to anyone trying to access the buffer using the CPU.

GLenum "internalformat" - this describes how a buffer behaves when introduced to the GL, i.e. what the result of sampling out of the buffer is when it is bound to a texture.

There's not necessarily a 1-to-1 mapping between format and internalformat: an RGBA_8888 buffer could technically be introduced to the GL both as GL_RGBA and GL_RGB. The latter would make sampling out of the buffer ignore the alpha channel and always return alpha=1.

gfx::BufferFormat is an implementation detail in the CreateGpuMemoryBufferImageCHROMIUM case, as the resulting buffer cannot be accessed by the CPU. We can use whatever is best on a specific platform when implementing this extension, as long as we respect the internalformat, which defines how the GL behaves.

To more explicitly respond to the suggestions above:

> 1. Change BindTexImage2DCHROMIUM() to accept an "internalformat" parameter.
> 2. Change CreateImageCHROMIUM() to accept neither "format" nor "internalformat".

This would make the implementation more complicated, as it means that an image could be bound to different textures using different internalformats. Is this really needed? I'd prefer to keep internalformat as an immutable part of the image state.

> 3. Change CreateGpuMemoryBufferImageCHROMIUM() to accept a "format" parameter.

Passing a gfx::BufferFormat to CreateGpuMemoryBufferImageCHROMIUM is confusing, as it gives the client control over an implementation detail that it should not be aware of. What would be the gain of this?
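A small desktop-GL sketch of the 1-to-1 point above (illustrative only; GLES 2.0 would not allow the mismatched formats in the second call):

"""
// The same RGBA_8888 pixel data introduced to the GL under two different
// internal formats.
const unsigned char pixel[4] = {0x10, 0x20, 0x30, 0x40};

// internalformat GL_RGBA: sampling returns the stored alpha (0x40).
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 1, 1, 0, GL_RGBA, GL_UNSIGNED_BYTE, pixel);

// internalformat GL_RGB: the alpha channel is ignored and sampling always
// returns alpha = 1.
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 1, 1, 0, GL_RGBA, GL_UNSIGNED_BYTE, pixel);
"""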
Mar 25 2016
reveman: I think you may have missed comment #3, where I specify definitions for "format" and "internalformat" which mostly align with yours [although I think mine are more precise]. I also indicate several problems with the existing code/API.

Before I respond to your questions, I need to clarify a point: "format" describes the pixel layout in a buffer, but is not necessarily an instance of gfx::BufferFormat. For example, cc::ResourceFormat is also used to describe pixel layout.

To explicitly respond to your questions:

"""
There's not necessarily a 1-to-1 mapping between format and internalformat, a RGBA_8888 buffer could technically be introduced to the GL both as GL_RGBA and GL_RGB. The latter would make sampling out of the buffer ignore the alpha channel and always return alpha=1.
"""

I agree that there /should not/ be a 1-to-1 mapping [I think you mean a bijection, or 1:1 correspondence] between format and internalformat. In fact I need to do exactly your example case, where sometimes I require the pairing (format=BGRA, internalformat=RGBA), and sometimes the pairing (format=BGRA, internalformat=RGB). The current APIs assume the existence of a bijection, and there currently exist two different functions that perform this mapping.

1. The function CreateGpuMemoryBufferImageCHROMIUM takes an "internalformat" as input, but needs to both create a buffer and bind it to a texture. It accomplishes this by using the function ImageFactory::DefaultBufferFormatForImageFormat to map from (GLenum "internalformat" -> gfx::BufferFormat "format").

2. The function CreateImageCHROMIUM requires an "internalformat" parameter, even though the result will be unused unless BindTexImage2DCHROMIUM is called. This means that when ResourceProvider::LazyCreateImage calls CreateImageCHROMIUM, it needs to make up an "internalformat", even though the result may never be bound to a texture. Even if it will be bound, there is no way for ResourceProvider to know what "internalformat" the image will be bound as. The function GLInternalFormat() is used to map from (cc::ResourceFormat "format" -> GLenum "internalformat").

"""
Passing a gfx::BufferFormat to CreateGpuMemoryBufferImageCHROMIUM is confusing as it gives the client control over an implementation detail that it should not be aware of. What would be the gain of this?
"""

I am not passing a gfx::BufferFormat. I am passing a GLenum that describes the pixel format of the underlying image. This is analogous to texImage2D, which takes both a "format" and an "internalformat" parameter. As I described in (1), unless you want to assume the existence of a mapping from "internalformat" -> "format", we need to pass a "format" parameter.
Mar 25 2016
I'm OOO so I haven't been able to properly read all the comments here yet. Trying to cut to the point and provide some feedback before I get back: I don't think emulating RGB using RGBA behind the CHROMIUM_image interface is appropriate. I'd rather not see that complexity added to the implementation of this extension. This seems like something the compositor should instead take care of by ignoring the alpha channel in RGBA buffers. I'm happy to have a more detailed discussion about this when back in NYC end of next week.
Mar 25 2016
There are two separate issues:

1) Whether we should modify CHROMIUM_image to support internalformat = GL_RGB for IOSurfaces. It currently does not. [Emulation would be an implementation detail, not exposed to the caller.]

2) The current interface for CHROMIUM_image is broken. I'm not sure how else to describe this. Quoting you:

"""
There's not necessarily a 1-to-1 mapping between format and internalformat, a RGBA_8888 buffer could technically be introduced to the GL both as GL_RGBA and GL_RGB. The latter would make sampling out of the buffer ignore the alpha channel and always return alpha=1.
"""

Yet there currently exist two different 1-to-1 mappings that do exactly what you just said should not be done. The current interface for the CHROMIUM_image functions FORCES this mapping to exist, in both directions.
Mar 25 2016
Maybe split this into two separate bugs?

What are the 1-to-1 mappings that you refer to in 2)? There's https://code.google.com/p/chromium/codesearch#chromium/src/gpu/command_buffer/service/image_factory.cc&l=54 but there's nothing wrong with doing that, as it's only used to implement CHROMIUM_gpu_memory_buffer_image and not exposed as part of the interface. When a client is creating an image using CHROMIUM_gpu_memory_buffer_image, the implementation of this extension can choose whatever GMB format it prefers, as long as it can satisfy the internalformat request of the client. DefaultBufferFormatForImageFormat is currently how we decide what GMB format to use.

What's the other case where we maintain a 1-to-1 mapping between internalformat and gfx::BufferFormat?
Mar 25 2016
See the paragraph in comment #9 that begins with "2." for the other 1:1 mapping. No consumer of CreateImageCHROMIUM should ever request an "internalformat" of GL_BGRA (because that's not a real "internalformat", it's a pixel format), but the current API forces clients (like cc::ResourceProvider) to do this, and forces the existence of two mappings between "format" <-> "internalformat".

Note that the version of DefaultBufferFormatForImageFormat you linked looks reasonable only because I just fixed it: https://codereview.chromium.org/1832923002/ That CL fixes the API problems enough for me to implement RGB "internalformat" emulation.

I wanted to fix the entire suite of CHROMIUM Image APIs while I was at it, to prevent this class of problems in the future, but there seems to be some type of impedance mismatch. Filed a new bug: https://bugs.chromium.org/p/chromium/issues/detail?id=598050
Mar 28 2016
I agree that passing GL_BGRA to CreateImage is a bit confusing. FWIW, it used to be that only GL_RGB and GL_RGBA were valid values, but we changed that to be more consistent with GLES.

The reason for the current behavior is that GLES 2.0 forces the use of pixel formats for internalformat, as a result of not allowing the data format and internalformat to be different when defining a texture. I think that's confusing in general, but I think it's still best to keep CHROMIUM_image consistent with that.

I understand that CGLTexImageIOSurface2D is different. That's because it's an OpenGL extension rather than a GLES/EGL extension. However, the Chromium command buffer and our Chromium-specific extensions are supposed to be written against GLES rather than OpenGL.
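For illustration, the GLES 2.0 pattern being described (a sketch; the buffer contents and dimensions here are made up):

"""
// In GLES 2.0, internalformat must equal format when defining a texture, so
// the pixel-format enum doubles as the internalformat. GL_BGRA_EXT comes from
// EXT_texture_format_BGRA8888.
const GLsizei width = 4, height = 4;
unsigned char pixels[4 * 4 * 4] = {};  // BGRA byte order
glTexImage2D(GL_TEXTURE_2D, 0, GL_BGRA_EXT, width, height, 0,
             GL_BGRA_EXT, GL_UNSIGNED_BYTE, pixels);
// Desktop OpenGL (and CGLTexImageIOSurface2D) instead separates the two, e.g.
// internalformat GL_RGBA with data format GL_BGRA.
"""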
Apr 4 2016
reveman, kbr: Comments requested on my proposed solutions.

I have a CL that implements client-side GL_RGB emulation: https://codereview.chromium.org/1856933002/ It passes all but one WebGL test locally when I run with --enable-webgl-image-chromium.

copyTexSubImage2D needs to fail when the back buffer has format GL_RGB [emulated] and the destination texture has format GL_RGBA. There is currently no mechanism for the client to determine the format of the destination texture, and the service side doesn't see any problems since the back buffer has [real] format GL_RGBA. I see several possible solutions (a sketch of option 2a follows this comment):

1. Keep a map of all texture formats client side. [This does not seem ideal, but keeps all logic out of the service side.]
2. Tell the service about the "emulated" nature of the back buffer. This will require an API modification. Proposals:
2a. Modify either CreateGpuMemoryBufferImageCHROMIUM or BindTexImage2DCHROMIUM to take an additional parameter (GLboolean: emulatesRGB). Store this flag on GLImage.
2b. Modify CreateGpuMemoryBufferImageCHROMIUM to allow format GL_RGB on OS X. This sets a flag on GLImage like (2a), but still creates a texture with GL_RGBA format. (2b) doesn't require an API modification, but I think it is more confusing.
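A sketch of what option 2a might look like at the API level (the extra parameter and the flag name are hypothetical, not an agreed-upon interface):

"""
// The client declares up front that this image emulates GL_RGB.
GLuint glCreateGpuMemoryBufferImageCHROMIUM(GLsizei width, GLsizei height,
                                            GLenum internalformat, GLenum usage,
                                            GLboolean emulatesRGB);

// The flag would be stored on the GLImage so the service can validate
// copyTexImage2D / copyTexSubImage2D as if the back buffer's format were
// GL_RGB, even though the real IOSurface-backed texture is GL_RGBA.
"""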
Apr 5 2016
Why don't we know the format of the destination texture on the client side?
Apr 5 2016
We don't record that information because it hasn't been necessary. I take it that you would prefer we duplicate that information in the client?
Apr 5 2016
No - the client side (WebGLRenderingContextBase) used to track this information redundantly, and keeping it in sync with the service side is strongly undesirable. Doing so requires all of the texture validation code to be duplicated on the client side. zmo@ just removed all of this tracking code, and we do not want to put it back in just for this purpose.

Let's tell the service side about the emulated semantics of the back buffer so it can generate the necessary error (INVALID_OPERATION? doesn't seem to be specified in the man pages, and I haven't checked the spec). This seems to be option 2a. Does this sound OK?
Apr 6 2016
In an attempt to forestall further confusion: most of the GL_RGB emulation will be done by the client, but some minimal service-side support is necessary.

Client changes: https://codereview.chromium.org/1856933002/
Service changes: https://codereview.chromium.org/1870483003/
Apr 12 2016
I just chatted with reveman@. He's okay with this, conditional on:

1. I check to see what Safari does.
2. I finish filing the bug against Apple. https://bugs.chromium.org/p/chromium/issues/detail?id=581777#c26
3. I update the documentation in the spec and on GLImage to make it clear that the EmulatingRGB() function is a workaround to deal with this Apple bug, and is not intended to be a permanent part of the interface.

As an aside, reveman prefers keeping the CHROMIUM image interface intact.
Apr 12 2016
Starting to dig into WebKit:
This looks familiar:
"""
// FIXME: remove this section when GL driver bug on Mac AND the GLES driver bug
// on QC is fixed, i.e., when alpha is off, readPixels should
// set alpha to 255 instead of 0.
if (!m_framebufferBinding && !m_context->getContextAttributes().alpha) {
"""
Alas, it was added by zmo@ and reviewed by kbr@ to work around an unspecified "driver bug", so no additional insight there.
https://bugs.webkit.org/show_bug.cgi?id=34718
Digging further, we see
"""
if (m_attrs.alpha) {
m_internalColorFormat = GL_RGBA8;
colorFormat = GL_RGBA;
} else {
m_internalColorFormat = GL_RGB8;
colorFormat = GL_RGB;
}
...
::glTexImage2D(GL_TEXTURE_2D, 0, m_internalColorFormat, width, height, 0, colorFormat, GL_UNSIGNED_BYTE, 0);
"""
https://trac.webkit.org/browser/trunk/Source/WebCore/platform/graphics/opengl/GraphicsContext3DOpenGL.cpp#L172
Disappointingly, they don't use an IOSurface as storage.
Will dig more to see what they do to get this texture to screen.
Apr 12 2016
They copy from the texture in:
Source/WebCore/platform/graphics/mac/WebGLLayer.mm
"""
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, _context->platformTexture());
glBegin(GL_TRIANGLE_FAN);
glTexCoord2f(0, 0);
glVertex2f(-1, -1);
glTexCoord2f(1, 0);
glVertex2f(1, -1);
glTexCoord2f(1, 1);
glVertex2f(1, 1);
glTexCoord2f(0, 1);
glVertex2f(-1, 1);
glEnd();
glBindTexture(GL_TEXTURE_2D, 0);
glDisable(GL_TEXTURE_2D);
// Call super to finalize the drawing. By default all it does is call glFlush().
[super drawInCGLContext:glContext pixelFormat:pixelFormat forLayerTime:timeInterval displayTime:timeStamp];
"""
Apr 16 2016
reveman: Response requested. piman@ requested that I remove the #ifdef in DrawingBuffer.cpp that sets the CHROMIUM image texture target to RECTANGLE for Mac. Is there a way of querying this information outside of tests? If not, I'm going to add a capability. https://codereview.chromium.org/1870483003/#msg26 https://code.google.com/p/chromium/codesearch#chromium/src/third_party/WebKit/Source/platform/graphics/gpu/DrawingBuffer.cpp&q=drawingbuffer.cpp&sq=package:chromium&type=cs&l=341
Apr 16 2016
The following revision refers to this bug: https://chromium.googlesource.com/chromium/src.git/+/6e4845661eba8e56f36aa32c788f859ebafc8403 commit 6e4845661eba8e56f36aa32c788f859ebafc8403 Author: erikchen <erikchen@chromium.org> Date: Sat Apr 16 01:23:39 2016 Add command buffer support for GL_RGB CHROMIUM image emulation. The CHROMIUM image is always created with internal format GL_RGBA. The caller of CreateGpuMemoryBufferImageCHROMIUM is responsible for most emulation. The only support that the command buffer provides is that calls to copyTexImage2D and copyTexSubImage2D will perform parameter validation as if the internal format were RGB. BUG= 595948 CQ_INCLUDE_TRYBOTS=tryserver.chromium.linux:linux_optional_gpu_tests_rel;tryserver.chromium.mac:mac_optional_gpu_tests_rel;tryserver.chromium.win:win_optional_gpu_tests_rel Review URL: https://codereview.chromium.org/1870483003 Cr-Commit-Position: refs/heads/master@{#387779} [modify] https://crrev.com/6e4845661eba8e56f36aa32c788f859ebafc8403/content/renderer/webgraphicscontext3d_provider_impl.cc [modify] https://crrev.com/6e4845661eba8e56f36aa32c788f859ebafc8403/content/renderer/webgraphicscontext3d_provider_impl.h [modify] https://crrev.com/6e4845661eba8e56f36aa32c788f859ebafc8403/gpu/GLES2/extensions/CHROMIUM/CHROMIUM_gpu_memory_buffer_image.txt [modify] https://crrev.com/6e4845661eba8e56f36aa32c788f859ebafc8403/gpu/command_buffer/common/capabilities.h [modify] https://crrev.com/6e4845661eba8e56f36aa32c788f859ebafc8403/gpu/command_buffer/service/framebuffer_manager.cc [modify] https://crrev.com/6e4845661eba8e56f36aa32c788f859ebafc8403/gpu/command_buffer/service/framebuffer_manager.h [modify] https://crrev.com/6e4845661eba8e56f36aa32c788f859ebafc8403/gpu/command_buffer/service/gles2_cmd_decoder.cc [modify] https://crrev.com/6e4845661eba8e56f36aa32c788f859ebafc8403/gpu/command_buffer/service/gles2_cmd_decoder_unittest_base.h [modify] https://crrev.com/6e4845661eba8e56f36aa32c788f859ebafc8403/gpu/command_buffer/service/gles2_cmd_decoder_unittest_textures.cc [modify] https://crrev.com/6e4845661eba8e56f36aa32c788f859ebafc8403/gpu/command_buffer/service/texture_manager.cc [modify] https://crrev.com/6e4845661eba8e56f36aa32c788f859ebafc8403/gpu/command_buffer/service/texture_manager.h [modify] https://crrev.com/6e4845661eba8e56f36aa32c788f859ebafc8403/gpu/command_buffer/service/texture_manager_unittest.cc [modify] https://crrev.com/6e4845661eba8e56f36aa32c788f859ebafc8403/gpu/ipc/common/gpu_command_buffer_traits_multi.h [modify] https://crrev.com/6e4845661eba8e56f36aa32c788f859ebafc8403/gpu/ipc/common/gpu_memory_buffer_support.cc [modify] https://crrev.com/6e4845661eba8e56f36aa32c788f859ebafc8403/third_party/WebKit/Source/platform/graphics/Canvas2DLayerBridgeTest.cpp [modify] https://crrev.com/6e4845661eba8e56f36aa32c788f859ebafc8403/third_party/WebKit/Source/platform/graphics/DEPS [modify] https://crrev.com/6e4845661eba8e56f36aa32c788f859ebafc8403/third_party/WebKit/Source/platform/graphics/gpu/DEPS [modify] https://crrev.com/6e4845661eba8e56f36aa32c788f859ebafc8403/third_party/WebKit/Source/platform/graphics/gpu/DrawingBufferTest.cpp [modify] https://crrev.com/6e4845661eba8e56f36aa32c788f859ebafc8403/third_party/WebKit/public/platform/WebGraphicsContext3DProvider.h [modify] https://crrev.com/6e4845661eba8e56f36aa32c788f859ebafc8403/ui/gfx/mac/io_surface.cc [modify] https://crrev.com/6e4845661eba8e56f36aa32c788f859ebafc8403/ui/gl/BUILD.gn [modify] https://crrev.com/6e4845661eba8e56f36aa32c788f859ebafc8403/ui/gl/gl.gyp [add] 
https://crrev.com/6e4845661eba8e56f36aa32c788f859ebafc8403/ui/gl/gl_image.cc [modify] https://crrev.com/6e4845661eba8e56f36aa32c788f859ebafc8403/ui/gl/gl_image.h [modify] https://crrev.com/6e4845661eba8e56f36aa32c788f859ebafc8403/ui/gl/gl_image_io_surface.h [modify] https://crrev.com/6e4845661eba8e56f36aa32c788f859ebafc8403/ui/gl/gl_image_io_surface.mm
Apr 18 2016
The following revision refers to this bug: https://chromium.googlesource.com/chromium/src.git/+/89c999c6a291fd15e3030faee393611703f6a9a3 commit 89c999c6a291fd15e3030faee393611703f6a9a3 Author: perkj <perkj@chromium.org> Date: Mon Apr 18 13:50:10 2016 Initialize Capabilities.chromium_image_rgb_emulation for gles2 This is missed in https://codereview.chromium.org/1870483003 "Add command buffer support for GL_RGB CHROMIUM image emulation." and leads to lots of ASAN problem in MSAN. See https://build.chromium.org/p/chromium.webkit/builders/WebKit%20Linux%20MSAN/builds/9525 BUG= 595948 TBR=erikchen@chromium.org CQ_INCLUDE_TRYBOTS=tryserver.chromium.linux:linux_optional_gpu_tests_rel;tryserver.chromium.mac:mac_optional_gpu_tests_rel;tryserver.chromium.win:win_optional_gpu_tests_rel Review URL: https://codereview.chromium.org/1896833002 Cr-Commit-Position: refs/heads/master@{#387898} [modify] https://crrev.com/89c999c6a291fd15e3030faee393611703f6a9a3/gpu/command_buffer/common/capabilities.cc
Apr 20 2016
I'm going to file a radar for "Multisample renderbuffers don't work correctly with glColorMask() and glClear().". I ran into this bug while trying to land client side RGB emulation. We can work around it pretty easily.
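A rough repro sketch of what the radar will describe (setup assumed; the sizes and sample count here are arbitrary):

"""
GLuint rb, fbo;
glGenRenderbuffers(1, &rb);
glBindRenderbuffer(GL_RENDERBUFFER, rb);
glRenderbufferStorageMultisample(GL_RENDERBUFFER, 4, GL_RGBA8, 64, 64);
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, rb);

// With alpha masked off, a clear must leave the alpha channel untouched.
glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_FALSE);
glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
glClear(GL_COLOR_BUFFER_BIT);
// After resolving and reading back, alpha should be unchanged from its
// previous value; on the affected GPUs it is not.
"""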
Apr 20 2016
Filed radar 25837054.
Apr 27 2016
My second plan for multisampled alpha=false was to make a GL_RGB renderbuffer and blit from that into a GL_RGBA IOSurface-backed texture. This appears to work, but the blit call also returns GL_INVALID_OPERATION.

According to the OpenGL 4.0 spec:

"""
If SAMPLE_BUFFERS for either the read framebuffer or draw framebuffer is greater than zero, no copy is performed and an INVALID_OPERATION error is generated if the dimensions of the source and destination rectangles provided to BlitFramebuffer are not identical, if the formats of the read and draw framebuffers are not identical, or if the values of SAMPLES for the read and draw buffers are not identical.
"""

The first two conditions would explain the failure. The last one makes no sense, since it implies that no resolution is possible. Looking at the original multisample spec: https://www.opengl.org/registry/specs/EXT/framebuffer_multisample.txt

"""
Furthermore, if SAMPLE_BUFFERS for either the read framebuffer or draw framebuffer is greater than zero, no copy is performed and an INVALID_OPERATION error is generated if the dimensions of the source and destination rectangles provided to BlitFramebuffer are not identical, or if the formats of the read and draw framebuffers are not identical."
"""

This one makes more sense - we can't perform an explicit resolve between GL_RGB and GL_RGBA.
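A sketch of the failing resolve described above. multisampleFBO wraps the multisampled GL_RGB renderbuffer and textureFBO wraps the GL_RGBA IOSurface-backed texture; both, along with width and height, are assumed to be set up already:

"""
glBindFramebuffer(GL_READ_FRAMEBUFFER, multisampleFBO);
glBindFramebuffer(GL_DRAW_FRAMEBUFFER, textureFBO);
glBlitFramebuffer(0, 0, width, height, 0, 0, width, height,
                  GL_COLOR_BUFFER_BIT, GL_NEAREST);
// glGetError() reports GL_INVALID_OPERATION: the read framebuffer is
// multisampled and its format (GL_RGB) differs from the draw framebuffer's
// format (GL_RGBA).
"""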
Apr 27 2016
This leaves us with two options:

1) Perform the resolve to an intermediary renderbuffer with internal format GL_RGB, and then blit from there to the destination texture [requires an extra blit].
2) Use client-side glColorMask() and glClear() for an RGBA multisample renderbuffer [emulating RGB]. Disable IOSurface for WebGL with no-alpha + multisample on ATI GPUs (should be rare?).

I guess we could do (2) most of the time, and only (1) when we detect an ATI GPU, if we were willing to put up with the additional complexity. I'll try (1) for now, and we can add (2) as an optimization if we deem it necessary.
Apr 28 2016
I'm not clear on the current state of the RGB emulation code. I thought it was already allocating an RGBA format texture to emulate an RGB format texture, therefore basically doing (2). I think the reason that blitting from an RGB renderbuffer to an RGBA destination texture fails is that it would have to synthesize the alpha channel -- it's OK to drop channels during the copy, but not to have to create new ones. (2) should work portably in my experience. Does it fail on ATI GPUs?
Apr 28 2016
Non-multisampled RGB emulation is working fine. (2) is the preferred solution for multisampled RGB emulation, but doesn't work on ATI GPUs due to a driver bug (Comment #27). I implemented (1) as the general solution. Depending on how we feel, we may wish to also implement (2) [not very difficult] and use (1) as a fallback.
May 4 2016
The following revision refers to this bug: https://chromium.googlesource.com/chromium/src.git/+/7e3811e8e71885af851e4b5a444a2715756de96e commit 7e3811e8e71885af851e4b5a444a2715756de96e Author: erikchen <erikchen@chromium.org> Date: Wed May 04 03:02:01 2016 WebGL GL_RGB emulation to support IOSurface back buffers on Mac. This CL creates a client side implementation of GL_RGB emulation of the WebGL back buffer. The emulation is necessary because GL_RGB IOSurfaces don't work correctly. The emulation is composed of several simple steps: 1. The service side is told that the back buffer has format GL_RGB, but it will create a GL_RGBA texture as per the chromium_image_rgb_emulation capability. 2. When the back buffer is first created, its alpha channel is cleared to 1. 3. Any time the back buffer would be written to, the alpha channel of the color mask is set to 0. 4. Queries about the alpha channel of the back buffer are modified to return the correct result. Explicit multisample renderbuffer resolve takes a different path. 1. The back buffer is still created with emulated format GL_RGB. 2. The renderbuffer is given format GL_RGB. 3. During resolution, the renderbuffer is blitted to an intermediate non-multisample renderbuffer with format GL_RGBA. 4. The intermediate renderbuffer is then blitted to the destination texture. The multisample path uses an extra blit. Theoretically, it should be possible to only use a single blit by using glColorMask() just like the non-multisample case. Driver bugs in ATI gpus make this solution not universally applicable. BUG= 595948 CQ_INCLUDE_TRYBOTS=tryserver.chromium.win:win_optional_gpu_tests_rel;tryserver.chromium.mac:mac_optional_gpu_tests_rel Review-Url: https://codereview.chromium.org/1856933002 Cr-Commit-Position: refs/heads/master@{#391428} [modify] https://crrev.com/7e3811e8e71885af851e4b5a444a2715756de96e/third_party/WebKit/Source/modules/webgl/WebGL2RenderingContextBase.cpp [modify] https://crrev.com/7e3811e8e71885af851e4b5a444a2715756de96e/third_party/WebKit/Source/modules/webgl/WebGLRenderingContextBase.cpp [modify] https://crrev.com/7e3811e8e71885af851e4b5a444a2715756de96e/third_party/WebKit/Source/modules/webgl/WebGLRenderingContextBase.h [modify] https://crrev.com/7e3811e8e71885af851e4b5a444a2715756de96e/third_party/WebKit/Source/platform/BUILD.gn [modify] https://crrev.com/7e3811e8e71885af851e4b5a444a2715756de96e/third_party/WebKit/Source/platform/graphics/gpu/DrawingBuffer.cpp [modify] https://crrev.com/7e3811e8e71885af851e4b5a444a2715756de96e/third_party/WebKit/Source/platform/graphics/gpu/DrawingBuffer.h
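A rough sketch of the non-multisampled emulation steps described in the commit message above; the helper names are made up and this is not DrawingBuffer's actual code:

"""
void ClearAlphaToOne() {
  // Step 2: when the back buffer is first created, pin its alpha channel to 1.
  glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_TRUE);
  glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
  glClear(GL_COLOR_BUFFER_BIT);
}

void PrepareToDrawToEmulatedRGBBackBuffer() {
  // Step 3: the backing texture is really GL_RGBA, so mask alpha off for any
  // draw that targets the back buffer; its alpha stays at 1.
  glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_FALSE);
}

GLboolean BackBufferHasAlpha() {
  // Step 4: queries about the back buffer report the emulated GL_RGB format.
  return GL_FALSE;
}
"""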
May 5 2016
The following revision refers to this bug: https://chromium.googlesource.com/chromium/src.git/+/04a443352b363a4fdecee6c243ff5c02566a6795 commit 04a443352b363a4fdecee6c243ff5c02566a6795 Author: erikchen <erikchen@chromium.org> Date: Thu May 05 09:16:57 2016 Add plumbing for two gpu workarounds, soon to be used by DrawingBuffer. The workaround disable_multisample_renderbuffer_color_mask is needed because glColorMask does not work for multisample renderbuffers on old AMD GPUs. The workaround disable_rgb_multisample_renderbuffer is needed because multisample renderbuffers with format GL_RGB8 have performance issues on Intel GPUs. BUG= 595948 , 607130 CQ_INCLUDE_TRYBOTS=tryserver.chromium.linux:linux_optional_gpu_tests_rel;tryserver.chromium.mac:mac_optional_gpu_tests_rel;tryserver.chromium.win:win_optional_gpu_tests_rel Review-Url: https://codereview.chromium.org/1947983003 Cr-Commit-Position: refs/heads/master@{#391780} [modify] https://crrev.com/04a443352b363a4fdecee6c243ff5c02566a6795/gpu/command_buffer/common/capabilities.h [modify] https://crrev.com/04a443352b363a4fdecee6c243ff5c02566a6795/gpu/command_buffer/service/gles2_cmd_decoder.cc [modify] https://crrev.com/04a443352b363a4fdecee6c243ff5c02566a6795/gpu/config/gpu_driver_bug_list_json.cc [modify] https://crrev.com/04a443352b363a4fdecee6c243ff5c02566a6795/gpu/config/gpu_driver_bug_workaround_type.h [modify] https://crrev.com/04a443352b363a4fdecee6c243ff5c02566a6795/gpu/ipc/common/gpu_command_buffer_traits_multi.h
May 5 2016
The following revision refers to this bug: https://chromium.googlesource.com/chromium/src.git/+/824f8b72a0ecde812abe61c0582f0f57bd230d6e commit 824f8b72a0ecde812abe61c0582f0f57bd230d6e Author: erikchen <erikchen@chromium.org> Date: Thu May 05 21:31:45 2016 Clean up multisampled color mask workaround for DrawingBuffer. On old AMD GPUs, glColorMask() doesn't work correctly for multisampled renderbuffers. The old workaround used a GL_RGB renderbuffer, resolved to a GL_RGBA renderbuffer, and then blitted to the destination texture (GL_RGBA). The new workaround uses glColorMask() with a GL_RGBA renderbuffer. On GPUs where glColorMask() doesn't work correctly, the alpha channel of the destination texture is cleared to 1 after the resolve. On all other GPUs, no additional clear is necessary. BUG= 595948 Review-Url: https://codereview.chromium.org/1953623002 Cr-Commit-Position: refs/heads/master@{#391902} [modify] https://crrev.com/824f8b72a0ecde812abe61c0582f0f57bd230d6e/third_party/WebKit/Source/platform/graphics/gpu/DrawingBuffer.cpp [modify] https://crrev.com/824f8b72a0ecde812abe61c0582f0f57bd230d6e/third_party/WebKit/Source/platform/graphics/gpu/DrawingBuffer.h
Mar 27 2017
Re comment #27, Apple claims to have actually fixed (!) this issue in 10.12.4 (GM):

"""
We believe this issue has been addressed in the latest macOS 10.12.4 GM release. Please test with the latest GM release. If you still have issues, please provide any relevant logs or information that could help us investigate. Please refer to the release notes for complete installation instructions.
"""

radar 25837054