Chrome incorrectly detecting hardware WebGL 2.0 on Android Emulator with ES 2.0
Issue description: When Chrome (for x86 Android) is run inside the Android Emulator, and the emulator is configured with ES 2.0, Chrome incorrectly detects support for WebGL 2.0 rather than only 1.0. It is not falling back to SwiftShader, either; some code decides that it is OK to enable hardware WebGL 2.0 in GetWebGL2FeatureStatus in src/gpu/config/gpu_util.cc.
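For context, here is a minimal, self-contained sketch (not Chromium's actual gpu_util.cc code) of the kind of decision GetWebGL2FeatureStatus has to make; the enum, function name, and signature below are invented for illustration only:

#include <iostream>

enum class GpuFeatureStatus { kEnabled, kBlacklisted, kDisabled };

// Hypothetical inputs: whether the software blacklist disabled WebGL 2.0 and
// whether the driver actually offers an ES 3.0 capable context.
GpuFeatureStatus GetWebGL2FeatureStatusSketch(bool blacklisted,
                                              bool is_es3_capable) {
  if (blacklisted)
    return GpuFeatureStatus::kBlacklisted;
  // The bug described above: on the emulator's ES 2.0 configuration the
  // capability input is apparently wrong, so the status comes back enabled.
  if (!is_es3_capable)
    return GpuFeatureStatus::kDisabled;
  return GpuFeatureStatus::kEnabled;
}

int main() {
  // Emulator configured with ES 2.0: the expected answer is kDisabled.
  bool ok = GetWebGL2FeatureStatusSketch(/*blacklisted=*/false,
                                         /*is_es3_capable=*/false) ==
            GpuFeatureStatus::kDisabled;
  std::cout << (ok ? "reported disabled, as expected\n"
                   : "incorrectly reported enabled\n");
}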
Jan 17 (5 days ago)
zmo points out: the checks start with is_es3_capable in src/ui/gl/gl_version_info.h. That result becomes part of the gpu_feature_info, which is then checked by ContextGroup and everything else. In the blacklisting we explicitly disable WebGL 2.0, but if it isn't disabled there, the about:gpu page (in src/content/browser/gpu/gpu_internals_ui.cc) has no way of knowing whether it will really work, so the reporting is simply incorrect here. There is a separate and more severe problem where attempting to fetch a WebGL 2.0 context on this configuration crashes the GPU process; I'm going to investigate that first.
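As a rough illustration of where that chain starts, here is a simplified stand-in for the sort of version gate behind is_es3_capable. The struct below is hypothetical; the real gl_version_info.h logic also considers extensions and driver workarounds:

#include <iostream>
#include <string>

// Hypothetical struct, loosely modeled on the idea of GLVersionInfo.
struct GLVersionInfoSketch {
  bool is_es = false;
  unsigned major_version = 0;

  explicit GLVersionInfoSketch(const std::string& version_string) {
    const std::string es_prefix = "OpenGL ES ";
    if (version_string.rfind(es_prefix, 0) == 0 &&
        version_string.size() > es_prefix.size()) {
      // e.g. "OpenGL ES 2.0 (emulator)" -> ES with major version 2.
      is_es = true;
      major_version = version_string[es_prefix.size()] - '0';
    } else if (!version_string.empty()) {
      // Desktop GL strings start with the version, e.g. "4.5 ...".
      major_version = version_string[0] - '0';
    }
  }

  // The property that appears to be violated on the emulator configuration:
  // an ES 2.0 context must never be reported as ES 3 capable. (The desktop
  // threshold here is only a rough proxy.)
  bool is_es3_capable() const {
    return is_es ? major_version >= 3 : major_version >= 4;
  }
};

int main() {
  GLVersionInfoSketch es2("OpenGL ES 2.0 (emulator)");
  std::cout << "ES 2.0 is_es3_capable: " << es2.is_es3_capable() << "\n";  // 0
}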
Jan 17 (5 days ago)
This is where the GPU process crashes, and explains why the Android emulator team was seeing an incorrect use of GL_TEXTURE_3D with ES 2.0 contexts:

Stack Trace:
  RELADDR   FUNCTION  FILE:LINE
  00000000  <unknown>
  03d8d091  gl::GLApiBase::glTexImage3DFn(unsigned int, int, int, int, int, int, int, unsigned int, unsigned int, void const*)  ../../ui/gl/gl_bindings_autogen_gl.cc:5252:3
  v------>  base::ThreadLocalPointer<gl::CurrentGL>::Get()  ../../base/threading/thread_local.h:64:37
  041a3b0e  gpu::gles2::TextureManager::CreateDefaultAndBlackTextures(unsigned int, unsigned int*)  ../../gpu/command_buffer/service/texture_manager.cc:2124:0
  v------>  std::__1::vector<gpu::gles2::FramebufferManager*, std::__1::allocator<gpu::gles2::FramebufferManager*> >::push_back(gpu::gles2::FramebufferManager* const&)  ../../buildtools/third_party/libc++/trunk/include/vector:1634:9
  041a3738  gpu::gles2::TextureManager::AddFramebufferManager(gpu::gles2::FramebufferManager*)  ../../gpu/command_buffer/service/texture_manager.cc:2046:0
  v------>  gpu::gles2::ContextGroup::CheckGLFeatureU(int, unsigned int*)  ../../gpu/command_buffer/service/context_group.cc:653:17
  040c99c9  gpu::gles2::ContextGroup::Initialize(gpu::DecoderContext*, gpu::ContextType, gpu::gles2::DisallowedFeatures const&)  ../../gpu/command_buffer/service/context_group.cc:422:0
  v------>  std::__1::enable_if<(is_move_constructible<gpu::gles2::ContextGroup*>::value) && (is_move_assignable<gpu::gles2::ContextGroup*>::value), void>::type std::__1::swap<gpu::gles2::ContextGroup*>(gpu::gles2::ContextGroup*&, gpu::gles2::ContextGroup*&)  ../../buildtools/third_party/libc++/trunk/include/type_traits:4519:9
  v------>  scoped_refptr<gpu::gles2::ContextGroup>::swap(scoped_refptr<gpu::gles2::ContextGroup>&)  ../../base/memory/scoped_refptr.h:236:0
  v------>  scoped_refptr<gpu::gles2::ContextGroup>::operator=(scoped_refptr<gpu::gles2::ContextGroup>)  ../../base/memory/scoped_refptr.h:228:0
  v------>  scoped_refptr<gpu::gles2::ContextGroup>::operator=(gpu::gles2::ContextGroup*)  ../../base/memory/scoped_refptr.h:224:0
  0410af96  gpu::gles2::GLES2DecoderImpl::Initialize(scoped_refptr<gl::GLSurface> const&, scoped_refptr<gl::GLContext> const&, bool, gpu::gles2::DisallowedFeatures const&, gpu::ContextCreationAttribs const&)  ../../gpu/command_buffer/service/gles2_cmd_decoder.cc:3518:0
  v------>  scoped_refptr<gl::GLContext>::operator bool() const  ../../base/memory/scoped_refptr.h:238:43
  042122e6  gpu::GLES2CommandBufferStub::Initialize(gpu::CommandBufferStub*, GPUCreateCommandBufferConfig const&, base::UnsafeSharedMemoryRegion)  ../../gpu/ipc/service/gles2_command_buffer_stub.cc:302:0
  042143e5  gpu::GpuChannel::OnCreateCommandBuffer(GPUCreateCommandBufferConfig const&, int, base::UnsafeSharedMemoryRegion, gpu::ContextResult*, gpu::Capabilities*)  ../../gpu/ipc/service/gpu_channel.cc:643:5
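The top frame at address 00000000 is consistent with calling a GL entry point that was never resolved: glTexImage3D does not exist on an ES 2.0 driver, so the binding is null and the call jumps to address 0. A standalone sketch of that failure mode and the guard that avoids it (the typedef name and enum constants are written out by hand here, not taken from Chromium's bindings):

#include <cstdio>

// Function-pointer type shaped like glTexImage3D; the alias name is made up.
using TexImage3DFn = void (*)(unsigned target, int level, int internalformat,
                              int width, int height, int depth, int border,
                              unsigned format, unsigned type,
                              const void* pixels);

int main() {
  // On an ES 2.0 driver the dynamic loader has no glTexImage3D symbol to hand
  // back, so the binding stays null.
  TexImage3DFn tex_image_3d = nullptr;

  // Calling it unconditionally reproduces the fault in the trace above. The
  // safe pattern is to gate the call on the context really being ES 3 /
  // WebGL 2 capable.
  bool context_is_es3_capable = false;
  if (context_is_es3_capable && tex_image_3d != nullptr) {
    tex_image_3d(0x806F /* GL_TEXTURE_3D */, 0, 0x1908 /* GL_RGBA */, 1, 1, 1,
                 0, 0x1908 /* GL_RGBA */, 0x1401 /* GL_UNSIGNED_BYTE */,
                 nullptr);
  } else {
    std::puts("skipping glTexImage3D: context is not ES 3.0 capable");
  }
}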
Jan 17 (5 days ago)
This is unexpected. Looking at TextureManager::Initialize(), we only create default 3D textures if feature_info_->IsWebGL2OrES3Context() is true. To my knowledge, we only create a WebGL2 or ES3 context if GLVersionInfo.is_es3_capable is true. So somewhere along these checks there is a bug.
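To make that guard concrete, here is a rough sketch, not the real TextureManager or FeatureInfo, of the check being described: creating the ES3-only default textures is gated on IsWebGL2OrES3Context().

#include <iostream>

enum class ContextType { kWebGL1, kWebGL2, kOpenGLES2, kOpenGLES3 };

struct FeatureInfoSketch {
  ContextType context_type;
  bool IsWebGL2OrES3Context() const {
    return context_type == ContextType::kWebGL2 ||
           context_type == ContextType::kOpenGLES3;
  }
};

void CreateDefaultTextures(const FeatureInfoSketch& feature_info) {
  // 2D defaults are always created; the 3D / 2D-array defaults need
  // glTexImage3D and therefore an ES 3.0 capable driver underneath.
  std::cout << "creating default 2D textures\n";
  if (feature_info.IsWebGL2OrES3Context())
    std::cout << "creating default 3D / 2D-array textures\n";
}

int main() {
  // The surprise in this bug: a WebGL 2 context type reaches this point even
  // though the driver only provides ES 2.0, so the ES3-only branch runs
  // against entry points that do not exist.
  CreateDefaultTextures(FeatureInfoSketch{ContextType::kWebGL2});
}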
Jan 18 (5 days ago)
The initialization logic of FeatureInfo and ContextGroup is contorted, and essentially we're allowing a request for a WebGL 2.0 context to go through even though FeatureInfo knows that the context isn't ES 3.0 capable.
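One way to picture the missing check is an early rejection of the context request. This is only an illustrative sketch of that idea under invented names, not the actual fix in the CL referenced below:

#include <iostream>

enum class ContextType { kWebGL1, kWebGL2 };
enum class ContextResult { kSuccess, kFatalFailure };

// Hypothetical early check: refuse a WebGL 2.0 request when the driver is not
// ES 3.0 capable, instead of letting it reach ContextGroup / TextureManager,
// which assume ES 3 entry points exist once they see a WebGL 2 context type.
ContextResult InitializeContextSketch(ContextType requested,
                                      bool is_es3_capable) {
  if (requested == ContextType::kWebGL2 && !is_es3_capable)
    return ContextResult::kFatalFailure;
  return ContextResult::kSuccess;
}

int main() {
  bool rejected = InitializeContextSketch(ContextType::kWebGL2,
                                          /*is_es3_capable=*/false) ==
                  ContextResult::kFatalFailure;
  std::cout << (rejected ? "WebGL 2.0 request rejected up front\n"
                         : "request allowed through\n");
}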
For the record, while trying to write a unit test for this, I ran into the following crash. With the current framework, it's too difficult to write a unit test for the scenario where the decoder's initialization fails.
Program received signal SIGSEGV, Segmentation fault.
0x0000000000000000 in ?? ()
(gdb) bt
#0 0x0000000000000000 in ?? ()
#1 0x000055555695ef59 in gl::GLApiBase::glTexImage3DFn(unsigned int, int, int, int, int, int, int, unsigned int, unsigned int, void const*) () at ../../ui/gl/gl_bindings_autogen_gl.cc:5252
#2 0x0000555556cd357e in CreateDefaultAndBlackTextures () at ../../gpu/command_buffer/service/texture_manager.cc:2129
#3 0x0000555556cd318a in Initialize () at ../../gpu/command_buffer/service/texture_manager.cc:2077
#4 0x0000555556bfeaa8 in Initialize () at ../../gpu/command_buffer/service/context_group.cc:511
#5 0x0000555555def7bb in InitDecoderWithWorkarounds ()
at ../../gpu/command_buffer/service/gles2_cmd_decoder_unittest_base.cc:252
#6 0x0000555555deeb20 in gpu::gles2::GLES2DecoderTestBase::InitDecoder(gpu::gles2::GLES2DecoderTestBase::InitState const&) () at ../../gpu/command_buffer/service/gles2_cmd_decoder_unittest_base.cc:189
#7 0x0000555555d51090 in SetUp () at ../../gpu/command_buffer/service/gles2_cmd_decoder_unittest.cc:1848
#8 0x000055555647b830 in HandleExceptionsInMethodIfSupported<testing::Test, void> ()
at ../../third_party/googletest/src/googletest/src/gtest-internal-inl.h:934
#9 Run () at ../../third_party/googletest/src/googletest/src/gtest.cc:2517
#10 0x000055555647c52f in Run () at ../../third_party/googletest/src/googletest/src/gtest.cc:2703
#11 0x000055555647ca57 in Run () at ../../third_party/googletest/src/googletest/src/gtest.cc:2825
#12 0x0000555556488e17 in RunAllTests () at ../../third_party/googletest/src/googletest/src/gtest.cc:5227
#13 0x0000555556488996 in HandleExceptionsInMethodIfSupported<testing::internal::UnitTestImpl, bool> ()
at ../../third_party/googletest/src/googletest/src/gtest-internal-inl.h:934
#14 Run () at ../../third_party/googletest/src/googletest/src/gtest.cc:4835
#15 0x000055555669a209 in RUN_ALL_TESTS () at ../../third_party/googletest/src/googletest/include/gtest/gtest.h:2369
#16 Run () at ../../base/test/test_suite.cc:294
#17 0x000055555669c31d in Run () at ../../base/callback.h:99
#18 LaunchUnitTestsInternal () at ../../base/test/launcher/unit_test_launcher.cc:225
#19 0x000055555669c1ba in LaunchUnitTests () at ../../base/test/launcher/unit_test_launcher.cc:575
#20 0x0000555555c98cc5 in main () at ../../gpu/command_buffer/common/unittest_main.cc:37
Jan 18 (4 days ago)
Fix in progress at https://chromium-review.googlesource.com/1419340 .