Starred by 23 users
Status: Available
OS: Android, Windows, Chrome, Mac
Pri: 3
Type: Feature

Support <img src="*.mp4">
Reported by Project Member, Dec 4
Chrome Version: 62.0.3202.94 

What steps will reproduce the problem? 

(1) Attempt to supply a video `src` to the `<img>` tag.
(2) Chrome will fetch the resource but does not support video supplied as an `<img src>`.

What is the expected result?

We're able to interpret and correctly render video sources supplied to the `<img>` tag.
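A minimal repro, assuming a hypothetical `animation.mp4` served with `Content-Type: video/mp4`:

```html
<!-- Chrome fetches the resource but renders a broken image;
     Safari Technology Preview plays it like a GIF. -->
<img src="animation.mp4" alt="Looping animation">
```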

What happens instead? 

This is unsupported in Chrome at present. Safari Tech Preview has implemented support for `<img src="animation.mp4">`.

Notes from the post: 

- GIFs are awesome but terrible for quality and performance 
- Replacing GIFs with `<video>` is better but has performance drawbacks: videos are not preloaded and use range requests 
- Now you can use `<img src="*.mp4">` in Safari Technology Preview 
- Early results show mp4s in `<img>` tags display 20x faster and decode 7x faster than the GIF equivalent – in addition to being 1/14th the file size!
- Background CSS video & Responsive Video can now be a “thing”. 
- Finally cinemagraphs without the downsides of GIFs!
- The above post is 46MB on Chrome but 2MB in Safari TP

Developers appear interested in this feature:

- Author of ImageOptim:
Components: Internals>Images>Codecs
Status: Available
Where does mp4 live from a decoding standpoint? Could Blink just plumb it in as a multi-frame image source and let the decoders handle the rest?
Labels: -Type-Bug Type-Feature
... why? I thought we discussed this with Apple and agreed that <video muted> was sufficient for all use cases. Also, what are they doing about the async nature of <video> vs <img>?
Components: Internals>Media>Video
The attached post details the use-cases:
* Optimization services can transcode animated GIFs to more efficient video formats without modifying the HTML.
* `<img>` is discovered earlier and downloads in a more efficient manner than video.
* Animated background images are becoming a thing. Videos in image contexts solves that without CSS changes.

I previously presented & discussed this at BlinkOn 7. I failed to find someone who knows the systems involved end-to-end, though.
#1 is reasonable; #2 and #3 are questionable. E.g.,
- Why couldn't #2 be true for <video>?
- Sites are already using <video> for background.

As far as who knows what, for the <video> side of things, that would be mlamouri@ and I. The <img> side isn't well known to us though.

I do know img has lots of idiosyncrasies around synchronous loading that video does not have. E.g., what's the fallback path for decode errors, or for selection of video/mp4 with unsupported codecs? For video this is an asynchronous process. What's the guarantee for copying to canvas?

Mounir and I are happy to chat with someone from the <img> side of things to at least flesh out the technicalities here.
Cc: is probably relevant to the sync/async issue here. I.e., video in img would likely need to be async always.
I'm not fully convinced that #1 is a strong long-term use case, as we should expect people to drop GIFs for <video muted>. It's only useful for websites that want to keep animated images and have an optimisation service that knows better.

I find the CSS use cases more interesting (as in allowing `background-image: foo.webm`).

FWIW, most of the points in the blog post would apply to <video muted>.
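The CSS use case mentioned above would presumably look something like this (hypothetical syntax; no shipping browser accepts a video source in `background-image` today):

```html
<style>
  /* Hypothetical: a short video loop used where only image formats
     work today. Shown only to illustrate the use case under discussion. */
  .hero {
    background-image: url("loop.webm");
    background-size: cover;
  }
</style>
<div class="hero"></div>
```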
#1 - Do we have usage stats for animated GIFs? Has usage significantly decreased since the introduction of `<video muted>` support? I suspect the long tail is long and hard to deal with due to legacy software.

#2 - preloading is not necessarily appropriate for videos, but is appropriate for cinemagraphs. For range requests, we'd also need to distinguish cinemagraphs from long-form videos. Can we assume that all muted videos are short cinemagraphs?

#3 - Sites are using `<video>` to emulate background, but it's still part of the content, which is clumsy and probably doesn't support all the use cases people want to target.

I'm happy to discuss the `<img>` side of things. It's been a while since I touched that code, so it would probably be good to have more recent contributors in the discussion as well.

Re. #1:

Lots of services support embedded images but not embedded videos. Big example: Gmail* (and other mail clients). It's not just for optimizing proxies: every hotlinked gif from Imgur, Giphy, Gfycat, etc. could immediately start being served as mp4/webm if Chrome sends the right Accept header.

I investigated this a while back and for random images on a few gif hosting sites, the videos they (already) serve on-site are 12-18x smaller than the hotlinkable gifs. The bandwidth saved here could be incredible on both sides.

* Gmail might require a server-side change because it proxies images, but you get the idea: a _lot_ more websites allow, and currently have, hotlinked gifs than allow hotlinked videos with the right set of attributes. (I think you need <video autoplay loop muted src=…> to get the same behavior, right? That's significantly weirder/less likely for every site to get right vs. <img src=…>.)
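For comparison, the two markups in question side by side (filename illustrative; the `<video>` form is what every hotlinking site would have to get exactly right):

```html
<!-- Works on any site that accepts hotlinked images: -->
<img src="loop.gif">

<!-- The equivalent video markup; omitting any attribute breaks the
     GIF-like behavior (and iOS additionally needs playsinline): -->
<video autoplay loop muted src="loop.mp4"></video>
```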
> Lots of services support embedded images but not embedded videos

As you pointed out, they will need to make changes to accept videos, at which point they could just as well use <video>. Using <video> will offer benefits like being able to play/pause.

> Imgur, Giphy, Gfycat, etc

"GIF" websites have all been serving videos for a long time. They might even sometimes ask. Twitter is another example. A lot of these used to use GIF on mobile and most switched to <video muted> after it was autoplay'able on mobile.

> I think you need <video autoplay loop muted src=…> to get the same behavior, right?

If you want literally the same behaviour as a GIF, yes. If you just want to be allowed to play it, <video muted> is enough or <video> without an audio track will work on Safari (and soon in Chrome).

Using <video> over <img> offers more features and more control to the website, and I'm not sure there is a compelling argument for a website to use <img> over <video> if they had to make the decision today. It's great to see that Apple is allowing videos to play in <img>, and I suggest we see how that goes. If they see strong success, maybe it means it's something worth doing, but otherwise I'm not sure we should offer a footgun.
I think Apple introduced this feature for their HEIF format which is basically MP4 video.

Why not forget WebP and jump forward to a world where every image is interchangeable with a one-or-more-frame H.264/WebM/H.265 video?

Also, picture presentation timing is not critically important for those use cases (they don't require async).

Additionally, <img src=mp4> offers hardware decoding. Since most mobile devices are equipped with H.264 hardware decoders, this would allow image decode and resize to move off the CPU, improving battery life and CPU usage.

This would be applicable for single frame h264 (and hevc) and multi frame.

A quick example of single frame mp4 on Safari Tech Preview:
Minor correction for compatibility on ios, I need playsinline: 
<video playsinline autoplay loop muted src=…>

It's clear that the use case for cinemagraphs is very different from video. The user engagement is different: for example, a video shouldn't restart if you scroll it out of the viewport and back (because it was unloaded over memory concerns), whereas a cinemagraph could restart without worry. The same applies to lazy loading and responsive images (srcset and sizes). For the developer, using <video> is not as natural or convenient for cinemagraph use cases.
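The responsive-image point is concrete: `<img>` features like `srcset`/`sizes` have no `<video>` counterpart. A sketch of what video-in-img would enable (filenames hypothetical; only Safari Technology Preview would play these today):

```html
<img src="cinemagraph-small.mp4"
     srcset="cinemagraph-small.mp4 480w,
             cinemagraph-large.mp4 1280w"
     sizes="(max-width: 600px) 480px, 100vw"
     alt="Cinemagraph">
```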
+1 to the comment that Apple did this to support HEIF; that was my conclusion as well.

@colin: We already use hardware accelerated JPEG decoding on supported platforms, so <img> already benefits from that in some cases.
re #20, dalecurtis@, I'm curious where the hardware path for JPEG decoding is. I wasn't aware that we had this path, and would like to look into it a bit.
is there a way we could compare the pervasiveness of h264 hardware decode v. jpeg hardware decode? I thought only a very few vendors actually did jpeg hardware decode...
@ericrk - I just know of it, not how it's used or where, though.

@colin, I don't expect it's as pervasive as H264 decoding, so you're correct that large images would benefit from having hardware decoding more regularly with 1-frame h264/vp9 files.
colin@, I'm not sure hardware decoding is an argument for enabling video in <img> as one can simply use <video> for this.
HW decode is an argument for third-party services that want to use H.264 (single frame) for images where the content is clearly not <video>. This is a Flywheel/WebP- or CDN-like play, but it leverages hardware decoding for images without mutating markup. 

To be clear, this is adjacent to the cinemagraph use case where the apng/agif would be served as a multi-frame h264 video.