WebGL vendor name returns Google SwiftShader instead of the actual vendor
Reported by ahsaneja...@gmail.com, May 8 2018
Issue description

UserAgent: Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:52.0) Gecko/20100101 Firefox/52.0

Steps to reproduce the problem: Open the attached file.

What is the expected behavior? The WebGL vendor should return the GPU name.

What went wrong? The vendor is returned as Google SwiftShader; it should return the GPU name.

Did this work before? Yes, in 65.0.3325.183.

Does this work in other browsers? Yes.

Chrome version: 66.0.3359.139 (Official Build) (32-bit)
Channel: stable
OS Version: 10
Flash Version:

Please return the real WebGL vendor. I use it to auto-optimize the game's quality settings according to each user's hardware capability.
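(For context, the kind of check the reporter describes typically looks like the sketch below. getWebGLVendorInfo is a hypothetical helper name, but WEBGL_debug_renderer_info is the standard extension for reading the unmasked vendor/renderer strings.)

    // Create a throwaway WebGL context and read its vendor/renderer strings.
    // WEBGL_debug_renderer_info exposes the unmasked values; without it,
    // gl.VENDOR / gl.RENDERER may only return generic masked strings.
    function getWebGLVendorInfo() {
      const canvas = document.createElement('canvas');
      const gl = canvas.getContext('webgl');
      if (!gl) return null; // WebGL is unavailable entirely
      const ext = gl.getExtension('WEBGL_debug_renderer_info');
      return {
        vendor: gl.getParameter(ext ? ext.UNMASKED_VENDOR_WEBGL : gl.VENDOR),
        renderer: gl.getParameter(ext ? ext.UNMASKED_RENDERER_WEBGL : gl.RENDERER),
      };
    }

    console.log(getWebGLVendorInfo()); // e.g. { vendor: "Google SwiftShader", ... }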
May 8 2018
I believe the real GPU vendor does not matter when SwiftShader is used, because it means the CPU is used instead of the GPU.
May 8 2018
The GPU vendor helps determine the processor generation, and that helps in making adjustments. What is the reason for removing the vendor sub-property? Even if the CPU is used, the generation helps a lot in optimizing. This should be reopened, as it breaks auto-optimization algorithms.
May 8 2018
The GPU vendor is not correlated with the CPU used in that system. I'm not sure what you could optimize based on the unused GPU vendor information, and that optimization would most likely be wrong. SwiftShader's CPU requirements are fairly low, and it runs with somewhat similar performance on most systems. You would get much better results optimizing specifically for SwiftShader when you receive that vendor string, so getting the real GPU vendor would be worse for the purpose of optimizing your game. This is working as intended, and if you use this information, you'll get better performance in your game. Closing, since this is working as intended.
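(To illustrate the suggestion above, a minimal sketch; pickQuality and the tier names are hypothetical, and the vendor-string patterns are assumptions rather than an exhaustive list.)

    // When the vendor/renderer string indicates SwiftShader, rendering runs
    // on the CPU, so drop to the lowest tier regardless of the installed GPU.
    function pickQuality(vendor, renderer) {
      if (/SwiftShader/i.test(vendor + ' ' + renderer)) {
        return 'low';    // software rendering: minimize resolution and effects
      }
      if (/NVIDIA|AMD|Radeon|GeForce/i.test(renderer)) {
        return 'high';   // likely a discrete GPU
      }
      return 'medium';   // integrated or unrecognized GPUs
    }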
May 8 2018
Agreed, the GPU info is not useful here. Reporter: Just to clarify, SwiftShader is a software renderer. It does not use the GPU hardware at all. If you're getting SwiftShader as your vendor string, you have no access to GPU hardware.
May 8 2018
The more interesting question to ask is why your machine falls back to SwiftShader. If you use Chrome Canary and take a look at the about:gpu page there, you will have your answer.
May 8 2018
I just read more about SwiftShader. It seems it maps graphics APIs to the CPU; I had misunderstood what SwiftShader is. However, I have a high-end GPU, so this should not be happening. I'll inform you once I figure out what causes it; as of now there is no reliable way to reproduce it consistently. The behavior changes randomly when restarting Chrome.
May 9 2018
#9 - one more thing that may also help is navigator.hardwareConcurrency, which tells you how many logical processors (or is it cores?) the CPU has. I am not sure whether SwiftShader will actually use multiple logical processors/cores, though.
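(A sketch of how that could feed into the quality logic; the thresholds here are arbitrary assumptions, not recommended values.)

    // navigator.hardwareConcurrency reports the number of logical processors
    // (1 is a safe fallback where the property is unsupported).
    const cores = navigator.hardwareConcurrency || 1;
    // Scale the render resolution down more aggressively on weaker CPUs.
    const renderScale = cores >= 8 ? 0.75 : cores >= 4 ? 0.5 : 0.25;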
May 10 2018
SwiftShader does use multithreading for rendering, but we currently limit the number of threads to 16, so most CPUs can easily handle it.
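(Given that cap, a one-line sketch of the effective parallelism a page could assume; the 16 comes from the comment above and may change.)

    // SwiftShader's render threads are capped at 16 per the comment above,
    // so the usable parallelism is roughly min(logical processors, 16).
    const swiftShaderThreads = Math.min(navigator.hardwareConcurrency || 1, 16);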
Comment 1 by ahsaneja...@gmail.com, May 8 2018
Attachment: 1.2 KB