The new Material Design pages use Polymer[1]. This requires pulling in a bit of HTML/CSS/JS. We combine it with a tool named vulcanize[2], then extract the JavaScript with a tool named crisper[3], and lastly minify with uglify[4]. For docs on this process, see here: https://chromium.googlesource.com/chromium/src/+/master/docs/vulcanize.md
We do the best we can to reduce the size of the resources when loaded off disk, but we can do more.
Recently, smaier@ (intern) and agrieve@ added gzip compression to on-disk resources (i.e. after being unpacked), mainly for internals pages on Android. I tweaked the implementation to use the same streaming gunzip that the net stack uses when decoding gzipped network resources. Then I enabled it on the old/mobile history page, and agrieve@ and smaier@ turned it on for a ton of internals pages.
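To illustrate the streaming approach (this is a hypothetical Python sketch, not Chromium's actual C++ filter code): rather than decompressing a whole resource up front, the data is fed through a decompressor chunk by chunk, the way the net stack's gzip filter decodes network responses incrementally.

```python
import gzip
import zlib

def stream_gunzip(data, chunk_size=4096):
    """Decompress gzipped bytes incrementally, yielding output as it
    becomes available instead of materializing the whole resource."""
    # wbits=16+MAX_WBITS tells zlib to expect a gzip header/trailer.
    decomp = zlib.decompressobj(16 + zlib.MAX_WBITS)
    for i in range(0, len(data), chunk_size):
        out = decomp.decompress(data[i:i + chunk_size])
        if out:
            yield out
    tail = decomp.flush()
    if tail:
        yield tail

compressed = gzip.compress(b"<html>hello history page</html>")
assert b"".join(stream_gunzip(compressed)) == b"<html>hello history page</html>"
```

The payoff is that downstream consumers can start working on the head of the document before the tail has been read off disk.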
We should try gzipping Material UIs on desktop for those with slow disks.
Right now, one of the few impediments to this is our $i18n{} preprocessing system, which adds translations before sending content to the renderer. When compression is turned on, this system scans through the compressed content (which is obviously wasteful and doesn't work). So, my plan is to write a streaming $i18n{} processor via net::FilterStream or whatever the modern equivalent is (which is in flux right now).
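A minimal sketch of what that streaming processor could look like (Python for brevity; the class and names here are illustrative, not Chromium's actual API). The tricky part a streaming filter has to handle is a $i18n{...} token split across two chunks, so the filter holds back any trailing text that could still grow into a placeholder:

```python
import re

# Matches a complete $i18n{key} placeholder.
PLACEHOLDER = re.compile(r"\$i18n\{([^}]*)\}")

class I18nStreamFilter:
    """Hypothetical streaming $i18n{} substituter: replaces placeholders
    chunk by chunk, buffering only a possible partial token at each
    chunk boundary instead of the whole document."""

    def __init__(self, translations):
        self.translations = translations
        self.pending = ""  # possible partial "$i18n{..." from last chunk

    def _cut_point(self, text):
        # Longest suffix that could still become a "$i18n{...}" token.
        start = text.rfind("$")
        if start == -1:
            return len(text)
        tail = text[start:]
        if "}" in tail:
            return len(text)  # any placeholder there is already complete
        if "$i18n{".startswith(tail) or tail.startswith("$i18n{"):
            return start      # hold the partial token back
        return len(text)

    def write(self, chunk):
        text = self.pending + chunk
        cut = self._cut_point(text)
        self.pending = text[cut:]
        return PLACEHOLDER.sub(
            lambda m: self.translations.get(m.group(1), m.group(0)),
            text[:cut])

    def close(self):
        out, self.pending = self.pending, ""
        return out

f = I18nStreamFilter({"title": "History"})
# The placeholder is split mid-token across two chunks.
out = f.write("<h1>$i18n{ti") + f.write("tle}</h1>") + f.close()
assert out == "<h1>History</h1>"
```

Unknown keys pass through untouched, and ordinary "$" characters (e.g. prices) are emitted immediately since they can't extend into a placeholder.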
We'll monitor things like History.ResultsRenderedTime to see if this has a positive effect on UMA users' load times.
[1] https://www.polymer-project.org/1.0/
[2] https://github.com/Polymer/vulcanize
[3] https://github.com/PolymerLabs/crisper
[4] https://github.com/mishoo/UglifyJS
Comment 1 by dbeam@chromium.org, Oct 6 2016