
Issue 729183

Starred by 2 users

Issue metadata

Status: Available
Owner: ----
Cc:
Components:
EstimatedDays: ----
NextAction: ----
OS: ----
Pri: 3
Type: Feature




don't render/load heavily downsampled images

Project Member Reported by ojan@chromium.org, Jun 2 2017

Issue description

It's more common than is good for the web for sites to load very large images and display them on a tiny mobile screen. We should disallow this; it's bad for user experience. Decoding the images makes the page janky, keeping them in memory uses a lot of memory, and downloading the bytes uses up users' data plans, costing them real money.

We should not render large images that are excessively downsampled, and instead show the user an infobar that lets them load the images anyway, like we do for client-side Lo-Fi. As a follow-up, we can look at the Content-Length header and not even download images that are clearly too large for the space they're being put into (e.g. an 8MB image going into a 100x100 space).
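The Content-Length idea could be sketched roughly as follows. This is an illustration, not Chromium code: `kMaxBytesPerPixel` is a made-up tuning constant, and a real threshold would come out of the UMA data discussed below.

```cpp
#include <cstdint>

// Made-up budget: generous even for lossless formats, so that only clearly
// egregious loads (e.g. 8MB into a 100x100 slot) would ever be blocked.
constexpr double kMaxBytesPerPixel = 16.0;

// Decide from the Content-Length header alone whether an image response is
// clearly too large for the layout slot it will be drawn into.
bool IsClearlyOversized(int64_t content_length_bytes,
                        int display_width_px,
                        int display_height_px) {
  if (content_length_bytes <= 0 || display_width_px <= 0 ||
      display_height_px <= 0) {
    return false;  // Size unknown: don't block the load.
  }
  const double budget =
      kMaxBytesPerPixel * display_width_px * display_height_px;
  return static_cast<double>(content_length_bytes) > budget;
}
```

With these numbers, an 8MB image destined for a 100x100 slot (budget 160,000 bytes) would be rejected, while a 40KB image would not.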

First things first, we'll need UMA to measure what a reasonable cutoff would be. My intuition is something like 2x.
 
Labels: LowMemory
Labels: BugSource-Chromium PaintTeamTriaged-20170605
The solution presumably is to use srcset or other techniques for matching image size to media. Has there been an intervention proposal for this?
Labels: -Type-Bug Type-Feature
Cc: chrishtr@chromium.org
So the UMA (Renderer4.ImageDecodeMipLevel) is back with 2 weeks worth of data. And here is the breakdown we see on Android:

1) Miplevel 1 (0-2x) ~94.34%.
2) Miplevel 2 (2x-4x) ~4.59%.
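The buckets above could be derived from the downsample ratio roughly like this. A hedged sketch, not the actual Renderer4.ImageDecodeMipLevel implementation: it assumes level N covers ratios from 2^(N-1) to 2^N, matching the "0-2x" / "2x-4x" labels.

```cpp
#include <cmath>

// Bucket an image by how heavily it is downsampled. downsample_ratio is
// natural size / displayed size; ratios <= 1 (no downsampling, or upscaled)
// fall into the first "0-2x" bucket.
int MipLevelBucket(double downsample_ratio) {
  if (downsample_ratio <= 1.0) return 1;
  return static_cast<int>(std::floor(std::log2(downsample_ratio))) + 1;
}
```

Under this definition, a 3x downsample lands in bucket 2 and a 5x downsample in bucket 3, so the "heavily downscaled" tail in comment 8 is everything at bucket 3 and above.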

Is this worth doing given how rare excessive down-sampling is? There is also an ongoing effort to implement decoding directly to the scaled size (issue 742586), skipping the allocation for a decode at the original size, which should address the memory concerns better.
I don't think it's worth it for this infrequent of a problem.
1 in 20 images actually seems like a *huge* impact to me. Even if we only focus on the 1% of massive images it seems like a big deal. How can I help make this happen?

Comment 8 by vmp...@chromium.org, Aug 15 2017

Just to set expectations, I'm thinking that Miplevel 2 would _not_ be considered heavily downscaled, so really we're talking about the "everything else" case, which would be around 1.07% based on the numbers in #5 (still probably worth it, as you mention). Of course what we consider too downscaled or not is up for debate.

I think the main work here is to figure out the UI story. We need to somehow inform the user that the image is not shown because of some flag, or because we decided not to show it for memory/efficiency reasons, and provide some sort of UI overlay to force loading it.

Is there initial work here already, or should we start this up?
LoFi++ does something similar: it shows placeholders for images along with a UI the user can tap to trigger fetching. Maybe that could be re-used here?
Cc: khushals...@chromium.org
Owner: ----
Status: Available (was: Assigned)
I won't be able to get to this anytime soon. Marking it available in case anyone else can take this up.
Re comment 7: there is an effort in progress to decode images at the scale they will be displayed on screen (the nearest power of 2, actually). This will address memory waste from rendering huge images at a small scale. It will not affect whether *encoded* image sizes are huge and therefore waste bandwidth and memory for storage, though, and Khushal's data doesn't say what that waste is. If it's considered a problem, we will need new UMAs for it.
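The decode-to-scale idea could look something like this sketch. Names and structure are illustrative assumptions, not the actual implementation tracked in issue 558070: it picks the smallest power-of-2 downscale of the encoded size that still covers the display size, and decodes directly at that size rather than allocating a full-resolution bitmap first.

```cpp
struct Size {
  int width;
  int height;
};

// Halve the encoded dimensions while the next halving would still cover the
// display size, yielding the cheapest power-of-2 decode target that doesn't
// lose quality for this slot.
Size ChooseDecodeSize(Size encoded, Size display) {
  Size result = encoded;
  while (result.width / 2 >= display.width &&
         result.height / 2 >= display.height) {
    result.width /= 2;
    result.height /= 2;
  }
  return result;
}
```

For example, a 4000x3000 source destined for a 100x100 slot would be decoded at 250x187 (a 16x reduction per axis was one halving too far), cutting the decoded bitmap's memory by roughly 256x; the encoded bytes still have to be downloaded in full, which is the waste this bug is about.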

Decode to scale is tracked in issue 558070.

I'd like to propose closing this issue in favor of 558070. WDYT?
I think it makes sense to close this in favor of 558070.

Comment 13 by ojan@chromium.org, Aug 22 2017

Cc: rsch...@chromium.org dk...@chromium.org bengr@chromium.org
It might make sense not to do this from the memory perspective. But I think we still want it for data savings and responsiveness. Once we stop rendering these images, we get a few benefits:

1. Web authors are pushed to make sites that are data-use friendly in terms of images.
2. We can avoid downloading these images entirely in many cases.
3. These large images impose a large responsiveness/animation cost as the image decode uses a lot of CPU and janks the raster.

Our goal should be to improve the 99.9th-percentile bad user experiences, so that the web doesn't produce unexpectedly terrible ones (e.g. "I went to this site and half my data plan got used up"). So improving the 1% of really egregious image loads seems quite significant to me.

I suppose we could render regardless of downsampling and just avoid *downloading* large images that are heavily downsampled, but I think that makes image loading confusing and unpredictable, since there's a race between starting the download and knowing what size the image will be rendered at.
I am wondering if this is a good fit as a data saver feature then -- a less severe version of Lo-Fi. If we assume that users that care about data are likely enabling data saver on their phones...
One additional comment: for the data saver aspect, we'd also need new UMA data to measure the impact. It's not clear from the data I know of how wasteful the encoded image sizes are.
Cc: -rsch...@chromium.org
