Texture internal format / format / type and texture completeness handling is fairly inconsistent for ES3
Issue description

So, background first. In ES2, texture completeness was based on matching (internal_format, format, type) triplets. internal_format is always unsized, and always matches format. Depending on whether the level image was created by glTexImage2D, glCopyTexImage2D or glGenerateMipmap, format and type were not strictly defined, but with the format == internal_format constraint and squinting enough, we could kind of find a reasonable value for all 3 (in practice, glCopyTexImage2D picks format = internal_format and type = GL_UNSIGNED_BYTE, and glGenerateMipmap picks the internal_format/format/type of the base level). Extensions against ES2 (e.g. GL_EXT_texture_format_BGRA8888, GL_EXT_texture_rg, GL_EXT_sRGB, GL_OES_texture_float/half_float) kept the format == internal_format constraint. GL_EXT_texture_storage introduced sized internal formats, but they were only accepted in glTexStorage*EXT, and were explained in terms of glTexImage2D using a derived format+type (and the passed internal_format gets replaced by the format).

In ES3, the rules changed in 3 significant ways:
- internal_format does not have to match format any more.
- Unsized internal_formats, as well as format and type, become just an API construct, and are not a property of the level image any more. Instead an "effective internal format" was introduced, which is always sized and is a property of the level image. There are specific rules about how to pick the effective internal format from an unsized internal_format, depending on the API call (e.g. based on format+type in glTexImage2D, based on the read framebuffer in glCopyTexImage2D, etc.).
- Texture completeness, and some other matching rules (e.g. in glCopyTexImage2D), are always based on the effective internal format.

What this means in the code is a bit unfortunate, mixing a bit of the ES2 logic and the ES3 logic. For example, completeness is still based on the ES2 rules (matching internal_format/format/type triplet), but the internal_format we keep in the level info is sometimes sized and sometimes unsized (depending on which API was used to create the level image). In some places we liberally use 'format' and 'internal_format' interchangeably, which made sense in ES2, but not in ES3 any more. In some places, we reconstruct some info about what the effective internal format should have been, based on format+type - but that assumes format and type are set correctly, which is non-obvious in some cases for ES3.

https://codereview.chromium.org/2146603005/ fixes an example of a bug caused by this inconsistency, and the fix is not necessarily very satisfying. We can probably generate other cases where we behave inconsistently with the ES3 rules - e.g. if you make 2 levels using glTexImage2D, with GL_RGBA/GL_UNSIGNED_BYTE as format/type, one level with an internal_format of GL_RGBA and the second one with an internal_format of GL_RGBA8, we will probably reject the texture as incomplete, even though the ES3 rules would accept it as complete (because the effective internal format for GL_RGBA/GL_RGBA/GL_UNSIGNED_BYTE is GL_RGBA8).

My suggestion is to rework everything to be primarily based on sized effective internal formats - even for ES2. I convinced myself that every valid format/type combination in ES2 uniquely corresponds to one sized internal format (although those may not be a valid input for the unextended API, e.g. GL_LUMINANCE8_ALPHA8), so the completeness rules would still be consistent (that was the intent for ES3 anyway).
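For illustration, here is a minimal sketch of resolving a level's internal_format/format/type to a sized effective internal format in the spirit of ES3 table 3.12 (the helper names and the tiny table subset are mine, not the actual TextureManager code); it makes the GL_RGBA vs. GL_RGBA8 example above compare equal for completeness:

    // Illustrative only; not the real TextureManager code. Resolves a level's
    // (internal_format, format, type) to a sized effective internal format,
    // covering only a handful of formats.
    #include <GLES3/gl3.h>

    static bool IsUnsizedInternalFormat(GLenum internal_format) {
      switch (internal_format) {
        case GL_ALPHA:
        case GL_LUMINANCE:
        case GL_LUMINANCE_ALPHA:
        case GL_RGB:
        case GL_RGBA:
          return true;
        default:
          return false;
      }
    }

    GLenum EffectiveInternalFormat(GLenum internal_format,
                                   GLenum format,
                                   GLenum type) {
      if (!IsUnsizedInternalFormat(internal_format))
        return internal_format;  // Already sized: it is the effective format.
      // Unsized: derive the sized format from format+type (table 3.12 subset;
      // luminance/alpha would map to extension enums such as
      // GL_LUMINANCE8_ALPHA8_EXT and are omitted here).
      if (format == GL_RGBA && type == GL_UNSIGNED_BYTE)
        return GL_RGBA8;
      if (format == GL_RGBA && type == GL_UNSIGNED_SHORT_4_4_4_4)
        return GL_RGBA4;
      if (format == GL_RGBA && type == GL_UNSIGNED_SHORT_5_5_5_1)
        return GL_RGB5_A1;
      if (format == GL_RGB && type == GL_UNSIGNED_BYTE)
        return GL_RGB8;
      if (format == GL_RGB && type == GL_UNSIGNED_SHORT_5_6_5)
        return GL_RGB565;
      return GL_NONE;  // Combination outside this sketch.
    }

    // Both levels from the example above then resolve to the same value:
    //   EffectiveInternalFormat(GL_RGBA,  GL_RGBA, GL_UNSIGNED_BYTE) == GL_RGBA8
    //   EffectiveInternalFormat(GL_RGBA8, GL_RGBA, GL_UNSIGNED_BYTE) == GL_RGBA8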
Basically, we would only store a single effective_internal_format in LevelInfo, making sure we pick a valid and consistent one (following the ES3 rules, including in the presence of extensions). In some places we would still need a way to extract a valid format/type (e.g. when pre-creating texture levels with glTexImage2D for security or workarounds, or when calling the driver if it is ES2), but that can be done uniquely from sized internal formats (see TextureManager::ExtractFormatFromStorageFormat / TextureManager::ExtractTypeFromStorageFormat). This should also let us get rid of the GetTexInternalFormat magic in ui/gl/gl_gl_api_implementation.cc (which aims at reconstructing the effective internal format).

This is a fair amount of work, but I think it will simplify the code in the end, and make it much more consistent with the spec(s).
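A similarly minimal sketch of the reverse direction - recovering a format/type from a sized internal format - using standalone helper names of my own (the real logic is TextureManager::ExtractFormatFromStorageFormat / ExtractTypeFromStorageFormat and covers many more formats):

    // Illustrative only: given a sized internal format, recover a format/type
    // pair that is valid to pass to glTexImage2D on an ES2 driver.
    #include <GLES3/gl3.h>

    GLenum FormatFromSizedInternalFormat(GLenum sized) {
      switch (sized) {
        case GL_RGBA8:
        case GL_RGBA4:
        case GL_RGB5_A1:
          return GL_RGBA;
        case GL_RGB8:
        case GL_RGB565:
          return GL_RGB;
        default:
          return GL_NONE;  // Formats outside this sketch.
      }
    }

    GLenum TypeFromSizedInternalFormat(GLenum sized) {
      switch (sized) {
        case GL_RGBA8:
        case GL_RGB8:
          return GL_UNSIGNED_BYTE;
        case GL_RGBA4:
          return GL_UNSIGNED_SHORT_4_4_4_4;
        case GL_RGB5_A1:
          return GL_UNSIGNED_SHORT_5_5_5_1;
        case GL_RGB565:
          return GL_UNSIGNED_SHORT_5_6_5;
        default:
          return GL_NONE;  // Formats outside this sketch.
      }
    }

Because the mapping from sized internal format back to a canonical format/type is unique for these formats, nothing beyond the effective internal format needs to be stored per level.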
Jul 14 2016
"In the case of CopyTexImage, we really need the source texture's original unsized format for validation purpose" why? From section "3.8.5 Alternate Texture Image Specification Commands": """ If internalformat is sized, the internal format of the new texel array is internalformat, and this is also the new texel array’s effective internal format. If the component sizes of internalformat do not exactly match the corresponding component sizes of the source buffer’s effective internal format, described below, an INVALID_OPERATION error is generated. If internalformat is unsized, the internal format of the new texel array is determined by the following rules, applied in order. If an effective internal format exists that has 1. the same component sizes as, 2. component sizes greater than or equal to, or 3. component sizes smaller than or equal to those of the source buffer’s effective internal format (for all matching components in internalformat), that format is chosen for the new image array, and this is also the new texel array’s effective internal format. When matching formats that involve a luminance component, a luminance component is considered to match with a red component. If multiple possible matches exist in the same rule, the one with the closest component sizes is chosen. Note that the above rules disallow matches where some components sizes are smaller and others are larger (such as RGB10_- A2). The effective internal format of the source buffer is determined with the following rules applied in order: • If the source buffer is a texture or renderbuffer that was created with a sized internal format then the effective internal format is the source buffer’s sized internal format. • If the source buffer is a texture that was created with an unsized base internal format, then the effective internal format is the source image array’s effective internal format, as specified by table 3.12, which is determined from the format and type that were used when the source image array was specified by TexImage*. • Otherwise the effective internal format is determined by the row in table 3.17 or table 3.18 where Destination Internal Format matches internalformat and where the Source Red Size, Source Green Size, Source Blue Size, and Source Alpha Size are consistent with the values of the source buffer’s FRAMEBUFFER_RED_SIZE, FRAMEBUFFER_GREEN_- SIZE, FRAMEBUFFER_BLUE_SIZE, and FRAMEBUFFER_ALPHA_SIZE, respectively. Table 3.17 is used if the FRAMEBUFFER_ATTACHMENT_- ENCODING is LINEAR and table 3.18 is used if the FRAMEBUFFER_- ATTACHMENT_ENCODING is SRGB. ”any sized” matches any specified sized internal format. ”N/A” means the source buffer’s component size is ignored. If there are no rows in the appropriate table, 3.17 or 3.18, that match the internalformat and source buffer component sizes, then the source buffer does not have an effective internal format, and an INVALID_OPERATION error is generated. """ As you can see, the original unsized format of the source doesn't matter - only the effective internal format.
Jul 14 2016
I see. Then we can totally discard the unsized format in internal representation. Thanks for clarifying this.
Nov 8 2016
Nov 8 2016
Feb 24 2017
The following revision refers to this bug: https://chromium.googlesource.com/chromium/src.git/+/b1424a997d92b500ccd797d4ee690246e40182b4

commit b1424a997d92b500ccd797d4ee690246e40182b4
Author: dongseong.hwang <dongseong.hwang@intel.com>
Date: Fri Feb 24 02:22:31 2017

Set correct internalformat info for TexStorageEXT for WebGL1 context

WebGL1 context must behave like ES2 context, which crrev.com/2458103002 defines.

BUG=535198,628064
CQ_INCLUDE_TRYBOTS=master.tryserver.chromium.linux:linux_optional_gpu_tests_rel;master.tryserver.chromium.mac:mac_optional_gpu_tests_rel;master.tryserver.chromium.win:win_optional_gpu_tests_rel

Review-Url: https://codereview.chromium.org/2548733002
Cr-Commit-Position: refs/heads/master@{#452718}

[modify] https://crrev.com/b1424a997d92b500ccd797d4ee690246e40182b4/gpu/command_buffer/service/gles2_cmd_decoder.cc
[modify] https://crrev.com/b1424a997d92b500ccd797d4ee690246e40182b4/gpu/command_buffer/service/gles2_cmd_decoder_unittest_textures.cc
Mar 9 2018
This issue has been Available for over a year. If it's no longer important or seems unlikely to be fixed, please consider closing it out. If it is important, please re-triage the issue. Sorry for the inconvenience if the bug really should have been left as Available. If you change it back, also remove the "Hotlist-Recharge-Cold" label. For more details visit https://www.chromium.org/issue-tracking/autotriage - Your friendly Sheriffbot
Mar 9 2018
Nov 27
Jan 15
Jan 15
The following revision refers to this bug: https://chromium.googlesource.com/chromium/src.git/+/aa834208352f248055a3ae36167ca5453b02eabb

commit aa834208352f248055a3ae36167ca5453b02eabb
Author: James Darpinian <jdarpinian@chromium.org>
Date: Tue Jan 15 07:59:51 2019

Fix format/internalformat confusion for ETC1 shared images.

SharedImageBackingFactoryGLTexture was specifying GL_ETC1_RGB8_OES as the format for ETC1 textures. GL_ETC1_RGB8_OES is only valid as an internalformat; the format should be GL_RGB. The format was mostly ignored before but is now being checked after https://chromium-review.googlesource.com/c/chromium/src/+/1368832

This fixes an issue with tabs not drawing in the Android tab switcher.

Bug: 920917, 628064
Change-Id: I25cda74d2a9e9b33d0755f7eda1309275b4fd79e
Reviewed-on: https://chromium-review.googlesource.com/c/1410389
Reviewed-by: Antoine Labour <piman@chromium.org>
Commit-Queue: James Darpinian <jdarpinian@chromium.org>
Cr-Commit-Position: refs/heads/master@{#622772}

[modify] https://crrev.com/aa834208352f248055a3ae36167ca5453b02eabb/components/viz/common/resources/resource_format_utils.cc
[modify] https://crrev.com/aa834208352f248055a3ae36167ca5453b02eabb/gpu/command_buffer/service/shared_image_backing_factory_gl_texture.cc
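For reference, a tiny sketch of the mapping that fix applies (the function name is mine, not the actual SharedImageBackingFactoryGLTexture code):

    // Illustrative only: GL_ETC1_RGB8_OES is valid only as an internalformat;
    // where a separate format enum is needed, the matching base format is GL_RGB.
    #include <GLES2/gl2.h>
    #include <GLES2/gl2ext.h>

    GLenum FormatForCompressedInternalFormat(GLenum internalformat) {
      switch (internalformat) {
        case GL_ETC1_RGB8_OES:
          return GL_RGB;  // ETC1 encodes RGB data; there is no "ETC1 format".
        default:
          return GL_NONE;  // Other compressed formats outside this sketch.
      }
    }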
Yesterday (36 hours ago)
The following revision refers to this bug: https://chromium.googlesource.com/chromium/src.git/+/b270239f4ec4ab1a00332bdf3fda4bb5cb0eaecb

commit b270239f4ec4ab1a00332bdf3fda4bb5cb0eaecb
Author: James Darpinian <jdarpinian@chromium.org>
Date: Tue Jan 22 20:56:19 2019

gpu: Fix format/internalformat confusion

Several code paths in the command decoder were creating textures with format == internalformat. However, many internalformat values are not valid as format values. This was not checked before http://crrev.com/c/1368832, but now it results in errors. This change calls TextureManager::ExtractFormatFromStorageFormat to get the correct format for the specified internalformat.

Fixes video display in the YouTube Android app on some Chrome OS devices.

Bug: 920967, 809237, 628064
Change-Id: I93c6449286094d895f45572fa0db5dbead1797ea
Reviewed-on: https://chromium-review.googlesource.com/c/1425728
Commit-Queue: James Darpinian <jdarpinian@chromium.org>
Reviewed-by: Antoine Labour <piman@chromium.org>
Cr-Commit-Position: refs/heads/master@{#624907}

[modify] https://crrev.com/b270239f4ec4ab1a00332bdf3fda4bb5cb0eaecb/gpu/command_buffer/service/gles2_cmd_decoder.cc
[modify] https://crrev.com/b270239f4ec4ab1a00332bdf3fda4bb5cb0eaecb/gpu/command_buffer/service/shared_image_backing_factory_gl_texture.cc
[modify] https://crrev.com/b270239f4ec4ab1a00332bdf3fda4bb5cb0eaecb/gpu/command_buffer/service/texture_manager.cc
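The call-site pattern that change applies, as a minimal sketch (variable names and the hard-coded GL_RGBA8 case are illustrative; the real code uses TextureManager::ExtractFormatFromStorageFormat and handles many formats):

    // Illustrative only: pass a format derived from the internalformat instead
    // of reusing the internalformat itself, which is invalid for sized values
    // such as GL_RGBA8.
    #include <GLES3/gl3.h>

    void CreateLevelRGBA8(GLsizei width, GLsizei height) {
      const GLint internal_format = GL_RGBA8;
      const GLenum format = GL_RGBA;         // Derived from GL_RGBA8, rather
      const GLenum type = GL_UNSIGNED_BYTE;  // than passing GL_RGBA8 as format.
      glTexImage2D(GL_TEXTURE_2D, /*level=*/0, internal_format, width, height,
                   /*border=*/0, format, type, /*pixels=*/nullptr);
    }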