Experimental `imageSmoothingQuality` property doesn't provide expected quality
Reported by hom...@gmail.com, Jun 8 2016
Issue description

UserAgent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_5) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/51.0.2704.84 Safari/537.36

Example URL: http://jsbin.com/sofigi/edit?output

Steps to reproduce the problem:
1. Go to chrome://flags and enable "Experimental canvas features": chrome://flags/#enable-experimental-canvas-features
2. Open the test page http://jsbin.com/sofigi/edit?output
3. Choose a relatively large image (1024+ pixels wide), for example https://ucarecdn.com/0475ad99-b98e-4bee-bfd2-1212aa0c85ed/photo1447706140685aac3d804acf4.jpeg
4. Move the slider and look at the third canvas (imageSmoothingQuality = 'high'). Its quality changes across different slider values: sometimes the image is sharper, sometimes blurry.

What is the expected behavior?
From the beginning, the image resizing quality of canvas `context.drawImage` was awful. All known browsers used very cheap and extremely poor resampling based on 4 source points. The new Canvas API defines an option to improve the quality: imageSmoothingQuality. It can take the value "low" (the default), "medium", or "high". Finally, developers got a tool for high-quality image resampling on the client. And currently, at least Safari does implement high-quality, convolution-based image resampling with the new API.

What went wrong?
Chrome still doesn't use high-quality resampling even with imageSmoothingQuality set to "high". Instead, Chrome resamples using mipmaps and the old "4 source points" method. The quality of the resulting image is significantly better than with the "4 source points" method alone (the old behaviour), but it still is not "high". The truth is that more or less "high" quality can already be achieved by a series of resizes without `imageSmoothingQuality`: http://stackoverflow.com/a/17862644/253146. That is not true high-quality convolution-based resampling, but at least it doesn't look as blurry as mipmaps.
Look at the attachment comparing three methods: Chrome with imageSmoothingQuality = "high", Safari 9.1 with imageSmoothingQuality = "high", and Chrome with a series of imageSmoothingQuality = "low" resizes.

Does it occur on multiple sites: Yes
Is it a problem with a plugin? No
Did this work before? Yes — before imageSmoothingQuality was implemented :)
Does this work in other browsers? Yes

Chrome version: 51.0.2704.84   Channel: stable
OS Version: OS X 10.11.5
Flash Version: Shockwave Flash 21.0 r0

I believe this is truly a regression, because before imageSmoothingQuality was implemented I was able to use hacks to achieve more or less high-quality resampling. If this feature is stabilised as-is, I will be forced to detect the browser by user agent to switch to the old code. Please don't implement this feature this way just because you need to cover the spec. People ask for `imageSmoothingQuality` for a reason, not just because they want slightly nicer images. I'm using image resampling on the client not just to show the image to the user; I'm resizing images before uploading them to the server to reduce the user's time and traffic costs. You can keep the mipmaps + "4 source points" method for the "medium" value, but please implement high-quality resampling for imageSmoothingQuality = "high".
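For reference, the "series of resizes" workaround linked above can be sketched as follows. This is my own illustration, not code from the Stack Overflow answer: `stepDownSizes` is a hypothetical helper that computes the intermediate widths, halving until one more halving would undershoot the target.

```javascript
// Hypothetical sketch of the step-down resize workaround. Each
// halving step approximates building one mipmap level; the final
// step jumps directly to the target size.
function stepDownSizes(srcWidth, dstWidth) {
  const sizes = [];
  let w = srcWidth;
  while (w / 2 > dstWidth) {
    w = Math.round(w / 2);
    sizes.push(w);
  }
  sizes.push(dstWidth);
  return sizes;
}

// In a browser you would then draw through a scratch canvas once per
// step, e.g. (sketch only, not runnable outside a page):
//   let src = img;
//   for (const w of stepDownSizes(img.width, 136)) {
//     const c = document.createElement('canvas');
//     c.width = w;
//     c.height = Math.round(src.height * w / src.width);
//     c.getContext('2d').drawImage(src, 0, 0, c.width, c.height);
//     src = c;
//   }
```

Each intermediate `drawImage` only ever shrinks by at most 2×, so the cheap 4-source-point sampling never skips input pixels.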
Jun 8 2016
[Attachment: Source files]
Jun 8 2016
Definitely a bug.
Jun 8 2016
Something is wrong with mipmap quality. These examples show both aliasing and over-blurring in downsampled images. Properly implemented mipmaps should solve these problems correctly. The problem is that Chrome relies on the graphics driver for mipmap generation, and the driver implementations don't always produce great results. brianosman@: Do you expect your alternate implementation of mipmap generation to solve this? If so can we enable it as broadly as possible?
Jun 8 2016
I just added a note to another bug about this, but: We made the new mipmap generation opt-in, due to performance regressions. Additionally, the new implementation currently uses the same strategy as our CPU mip-map builder, which is likely to produce over-blurring for many of these test images (it switches to a triangle filter when the input dimensions are odd). I'd like to revisit the filtering anyway (probably switch to using a windowed sinc filter), and if we're concerned about quality in these kinds of applications, that's definitely going to improve the results. As for enabling it broadly - that also depends on how we want to prioritize speed vs. quality... Are we willing to take a performance hit on some mobile GPUs (and even on some desktop GPUs for that matter) to get better quality (and consistent, predictable results across devices)?
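To make the filter trade-off concrete, here is a minimal 1-D sketch (my own illustration, not Skia's actual mip builder): a 2-tap box average when the row width is even, switching to a simple (1, 2, 1)/4 triangle filter when it is odd — the wider triangle support is what tends to over-blur.

```javascript
// Illustration only: compute one mipmap level of a 1-D grayscale row.
// Even widths: 2-tap box average.
// Odd widths: (1, 2, 1)/4 triangle filter, which blurs more.
function mipLevel(row) {
  const out = [];
  if (row.length % 2 === 0) {
    for (let i = 0; i + 1 < row.length; i += 2) {
      out.push((row[i] + row[i + 1]) / 2);
    }
  } else {
    for (let i = 0; i + 2 < row.length; i += 2) {
      out.push((row[i] + 2 * row[i + 1] + row[i + 2]) / 4);
    }
  }
  return out;
}
```

Note how the odd branch reads three input texels per output, so edges get smeared over a wider footprint than with the box filter.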
Jun 9 2016
We could support two levels of mipmap quality, IMHO. Right now kMedium_SkFilterQuality and kHigh_SkFilterQuality behave exactly the same when downsampling. kHigh_SkFilterQuality should map to the highest-quality option, regardless of speed. Also, now that the option is opt-in, I guess that probably means we need to trigger it on ChromeOS, where driver-generated mipmaps are buggy? Can you point me to the setting I would use for that?
Jun 9 2016
The option is now exposed via GrContextOptions::fDoManualMipmapping. (As of https://skia.googlesource.com/skia/+/97e398d98928f9497063594ebe633efe2d0f4968)
Jun 14 2016
Don't use mipmaps, Google.
Jul 21 2016
homm86@ With the hardware I have at my disposal, I don't get results that look quite as bad as what you've posted. Could you attach the contents of your chrome://gpu page to this report so that we can see the details of your graphics configuration?
Jul 21 2016
junov@ Done. Please make sure that: 1) you have enabled Experimental canvas features, 2) you have all other settings at their defaults, and 3) you are comparing output images of the same sizes: 257×257 for the first image, 127×127 for the second, and 136×136 for the third.
Sep 12 2016
Hi, any update on this bug? It's a Q3 OKR to fix it.
Sep 12 2016
It's not as simple as we thought: a lot of work is required to determine which GPUs and driver versions need the workaround. Basically, we need to create a comprehensive entry in the GPU driver bug list to activate the workaround. Also, we may want to fall back to software rendering, or implement our own sampling shaders, in cases where the GPU does not support trilinear or anisotropic filtering. This is not going to be fixed in September.
Sep 12 2016
I have some further questions/observations:
1) I also can't seem to get results that match what homm86@ is seeing. With both CPU and GPU rasterization, I'm seeing pretty standard mipmap filtering results on the test page for both medium and high quality.
2) I think there is still some confusion. The proposed workaround (http://stackoverflow.com/a/17862644/253146) *is* mipmapping. It's a slow, convoluted implementation, but it's mipmapping.
3) Even if we were to enable the new manual mipmapping code in more places, it won't necessarily fix this problem. All it does is ensure that the GPU mipmapping behavior matches the CPU behavior, and my gut is that the behavior we've landed on is not the best for quality.
4) I think that the correct fix (to get good quality on these kinds of extreme downsamples) is a better-quality filter during mipmapping, for both CPU and GPU generation/rasterization.
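A quick way to see point 2 above (that the step-down hack *is* mipmapping): repeatedly applying a 2-tap box halving is algebraically the same as one wide box average over the combined footprint. A tiny 1-D sketch of my own, not code from Chrome or the workaround:

```javascript
// Halve a 1-D row with a 2-tap box filter — the effect of one
// half-size drawImage() step with the cheap default sampling.
function boxHalve(row) {
  const out = [];
  for (let i = 0; i + 1 < row.length; i += 2) {
    out.push((row[i] + row[i + 1]) / 2);
  }
  return out;
}

// Two halvings of an 8-sample row average each group of 4 samples,
// i.e. exactly a 4-tap box filter — a box-filtered mipmap level.
const row = [1, 2, 3, 4, 5, 6, 7, 8];
const twice = boxHalve(boxHalve(row)); // equals [2.5, 6.5]
```

So chaining half-size `drawImage` calls reproduces box-filter mipmap generation, only on the CPU-visible side and much more slowly.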
Jan 23 2017
We're going through top-starred blink bugs to better understand their status and priority - this is currently #10. Justin, can you give us an update on where things are at? Is anyone expecting to work on this at all in Q1?
Feb 7 2017
We are ignoring the star count in the prioritization of this bug. According to information provided by the monorail infrastructure team, we have reasons to believe it got star-spammed. That said, we are still taking this issue seriously since it is a perfectly valid complaint. The main reason it has not been addressed yet is that the resolution is highly complex given the limitations of the GLES2 interface that we use internally for GPU rendering, and our desire to avoid regressing performance.
Feb 8 2017
> our desire to avoid regressing performance

But this property is not about performance; it's about quality. It's a way for a developer to tell the browser: "I need good quality, performance is less important in this case."
Feb 8 2017
Right, but we are only willing to degrade performance within reason. For example, doing a synchronous GPU texture readback and re-upload in order to resample the image on the CPU with a high-quality filter would not be extremely hard to implement, but it would be flirting with the limits of what is unreasonable (mostly because of the synchronous readback). fserb@: Do you have the bandwidth to investigate possible solutions? It would be nice to use a Lanczos filter for mipmap generation.
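For reference, the Lanczos kernel mentioned above is sketched below using the standard textbook definition with the common a = 3 lobes; this is an illustration, not code from Chrome or Skia.

```javascript
// Lanczos windowed-sinc kernel: sinc(x) * sinc(x / a) for |x| < a,
// zero outside. Its wider, negative-lobed support preserves edges
// far better than box or triangle filters when downsampling.
function lanczos(x, a = 3) {
  if (x === 0) return 1;
  if (Math.abs(x) >= a) return 0;
  const px = Math.PI * x;
  return (a * Math.sin(px) * Math.sin(px / a)) / (px * px);
}
```

A resampler would place this kernel at each output sample position, weight the covered input texels by `lanczos(distance)`, and normalize by the sum of the weights.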
May 23 2017
@fserb: The Chromium GPU command buffer now has a driver bug workaround that implements manual mipmap generation for sRGB textures. It might be possible to fix this bug by enabling the workaround for RGB textures as well, on platforms that have sub-par filtering quality.
Comment 1 by schenney@chromium.org, Jun 8 2016