Give nice control over the size of chunks to be read for handling large files (possibly by BYOBReader)
Issue description

http://output.jsbin.com/cekequd/quiet/

Give the 2nd input a large file & watch the console. Chrome Canary completes in 7 seconds for a 900 MB file, but Safari Technology Preview completes in 1.2s for the same file. For Firefox you have to use the first input, which uses the FileReader API, but it completes in 500ms for the same file.
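For reference, a minimal sketch of what a "fetch & streams" style read boils down to: wrap the selected File in a Response and pull its body stream to the end while timing it. The selector and the Response(file) wiring are assumptions; the demo itself may be wired differently (e.g. via fetch() on an object URL).

const input = document.querySelector('input[type=file]');
input.addEventListener('change', async () => {
  const file = input.files[0];
  const start = performance.now();
  // The browser picks the chunk size here; the page has no say in it.
  const reader = new Response(file).body.getReader();
  let bytes = 0;
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    bytes += value.byteLength;
  }
  console.log(`read ${bytes} bytes in ${performance.now() - start} ms`);
});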
May 12 2017
Oh, I updated the demo, so I mean the "Fetch & streams" input. FileReader is super-broken if you use progress events: it seems to create a new ArrayBuffer for each progress event, so it churns through a ton of memory and then crashes (https://bugs.chromium.org/p/chromium/issues/detail?id=674903). Without progress events, FileReader is faster than fetch + streams (4.7s on this machine vs 6.1s), but I guess a BYOBReader would be a fairer comparison here.
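For comparison, the FileReader path is a single readAsArrayBuffer() call; the sketch below deliberately attaches no progress listener, since that is what triggers the memory churn described above. This is a generic wrapper, not the demo's exact code.

function readWholeFile(file) {
  return new Promise((resolve, reject) => {
    const reader = new FileReader();
    // One load event, one ArrayBuffer for the whole file.
    reader.onload = () => resolve(reader.result);
    reader.onerror = () => reject(reader.error);
    reader.readAsArrayBuffer(file);
  });
}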
May 12 2017
Also lol "steam". Oops.
Jun 2 2017
Yeah. FileReader.readAsArrayBuffer() is still best for performance if it's OK to read the whole contents at once. Since fetch()+Streams on Chrome reads a file in 32 KB chunks, a 900 MB file produces roughly 29,000 separate chunks, which should be heavy. Yes, a BYOBReader would help by giving control over the size of the chunks.
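A rough sketch of how a BYOB reader would give that control: the caller supplies the buffer, so each read fills up to the caller's chosen size (1 MB here) instead of the browser's fixed 32 KB. This assumes a byte stream that actually supports getReader({ mode: 'byob' }), which fetch/Blob body streams in Chrome did not at the time; process() is a hypothetical consumer.

async function readInChunks(stream, chunkSize = 1024 * 1024) {
  const reader = stream.getReader({ mode: 'byob' });
  let buffer = new ArrayBuffer(chunkSize);
  while (true) {
    // The buffer is transferred into the read; reclaim it via value.buffer.
    const { done, value } = await reader.read(new Uint8Array(buffer));
    if (done) break;
    process(value);        // value views the filled portion of the buffer
    buffer = value.buffer; // reuse the same allocation for the next read
  }
}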
Jun 4 2018
This issue has been Available for over a year. If it's no longer important or seems unlikely to be fixed, please consider closing it out. If it is important, please re-triage the issue. Sorry for the inconvenience if the bug really should have been left as Available. For more details visit https://www.chromium.org/issue-tracking/autotriage - Your friendly Sheriffbot
Jun 5 2018
Comment 1 by ricea@chromium.org, May 8 2017
Summary: Reading a large file as a stream is a little slow (was: Reading a large file as a steam is a little slow)