feat: use compression streams to decompress responses #661
Conversation
@kettanaito you might be interested in https://www.npmjs.com/package/http-encoding. I'm using this to do something very similar, both in Node and in a webpack-bundled web app (decoding, editing & re-encoding data within HTTP Toolkit). In my case, for unrelated reasons, I'm not really worried about streaming here, but it would be a very reasonable addition. That package supports Brotli (via brotli-wasm) and also zstd (rapidly growing in real usage & client support, and not supported by the Compression Streams API AFAIK) and base64 (non-standard & weird, but surprisingly common IME), and it does things like unwrapping multiple chained encodings automatically and dealing with various common-but-technically-incorrect encodings.

Depends on which direction you're going and whether you need this inline here, but if you wanted to avoid duplicating effort and make this functionality more widely available to others, I'd very happily accept a PR to extend that package to support streaming too & to use the Compression Streams API where available.
Hi, @pimterry. Thank you for sharing your insights. The standard Compression Streams API gives us 3 out of 4 encodings we need, and I would love to utilize it. That's why I'm primarily looking for ways to adopt it here. Is http-encoding ESM, by the way? I believe if you find httptoolkit/brotli-wasm#38 beneficial, that should solve the blocker to adopting brotli-wasm.
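For reference, here is a minimal sketch of decompressing a fetch response body with the standard Compression Streams API. It is an illustration only, not the code from this PR; the `decompressResponse` helper name and the single (non-chained) encoding handling are assumptions.

```ts
// A minimal sketch, assuming a hypothetical `decompressResponse` helper
// and a single (non-chained) content-encoding value. Not the PR's code.
async function decompressResponse(response: Response): Promise<Response> {
  const encoding = response.headers.get('content-encoding')?.trim().toLowerCase()

  if (!response.body || !encoding) {
    return response
  }

  // DecompressionStream understands "gzip", "deflate", and "deflate-raw".
  // "br" (Brotli) and "zstd" need a different code path.
  const format = encoding === 'x-gzip' ? 'gzip' : encoding

  if (format !== 'gzip' && format !== 'deflate') {
    return response
  }

  const decompressed = response.body.pipeThrough(new DecompressionStream(format))

  // Note: a complete implementation would also drop the "content-encoding"
  // and "content-length" headers from the decompressed response.
  return new Response(decompressed, {
    status: response.status,
    statusText: response.statusText,
    headers: response.headers,
  })
}
```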
Not ESM yet, but PRs welcome. Personally I'm mainly focused on the HTTP Toolkit use case, where my main audience is application users rather than library users, so ESM hasn't been top of my todo list, but I do agree it's the right time to move over. The http-encoding package itself isn't doing anything complicated at all, so ESM shouldn't be a problem for it in isolation. It's just parsing content-encoding headers and pulling the relevant packages/Node APIs together behind a single consistent cross-platform entrypoint. The main challenge would be ESM-ing brotli-wasm (thanks for your help there) and any changes required for ESM compatibility in zstd-codec (I don't know if anything is required there, and it's not my package, but the maintainer has been very helpful & responsive in the past).

Note that HTTP encodings are still an evolving space, so I wouldn't assume you'll be "done" once you have the small set you're looking at above. Zstandard is genuinely very neat and I'd expect it to become the most popular compression for dynamic content in the near future (it's much faster for good results, so Brotli for static data & Zstd for dynamic data is a really effective model, and it's hitting widespread client support now). And just this week Chrome has shipped shared compression dictionaries (https://www.debugbear.com/blog/shared-compression-dictionaries) in Chrome stable, which defines more new HTTP compression formats on top of Brotli & Zstandard (compression with pre-shared context, e.g. sending just a file diff from the previous version, among many other use cases; this is going to offer huge boosts for the web & APIs). That has public support from FF & Safari too, and will require quite a bit more work once those formats start being widely used.
I've split the decompression logic to be environment-dependent. The same fetch interceptor will load a different decompression implementation depending on the environment. Now that we have the environment separation, we can use the most appropriate decompression APIs in each.
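As a rough illustration of that split, environment-dependent loading could look something like the sketch below. The `./decompress.node` and `./decompress.browser` module names and the runtime check are hypothetical; the actual change may rely on bundler export conditions instead.

```ts
// Hypothetical sketch of the environment split. The module specifiers below
// are illustrative placeholders, not the files added in this PR.
type Decompress = (
  body: ReadableStream<Uint8Array>,
  contentEncoding: string
) => ReadableStream<Uint8Array>

const isNode =
  typeof process !== 'undefined' && process.versions?.node != null

export async function loadDecompressor(): Promise<Decompress> {
  // The Node.js implementation can reach for "node:zlib" (Brotli included),
  // while the browser implementation sticks to the Compression Streams API.
  const module = isNode
    ? await import('./decompress.node')
    : await import('./decompress.browser')

  return module.decompress as Decompress
}
```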
Brotli decompression in the browser

The best course of action is to skip Brotli decompression in the browser. There are two reasons for it.

Reason 1: Low usage

The fetch interceptor is not intended to be used in the browser. MSW relies on the Service Worker, and the fetch interceptor is only used as a fallback mechanism when the worker API is not available. Thus, not providing full feature parity between Node.js and the browser is to be expected.

Reason 2: WASM

Loading a WASM module is complicated. For example, here's an error I get from webpack in our test suite:
This means that I need to configure my compiler to understand that a certain module is supposed to be WASM. I can certainly do that, but I won't ask my users to. They would face similar errors from their compilers, and tweaking these things is everyone's least favorite chore.

Conclusion

Skip Brotli decompression in the browser.
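A sketch of what the browser-side behavior described above could look like, assuming a hypothetical `decompressInBrowser` helper: handle the encodings the Compression Streams API knows about and refuse `br`. Illustrative only, not the PR's implementation.

```ts
// Hypothetical browser-only sketch: decompress what DecompressionStream
// supports and bail out on Brotli. Not the PR's actual implementation.
function decompressInBrowser(
  body: ReadableStream<Uint8Array>,
  contentEncoding: string
): ReadableStream<Uint8Array> {
  // "content-encoding" may list several codings applied in order,
  // e.g. "gzip, br"; they must be undone in reverse order.
  const codings = contentEncoding
    .toLowerCase()
    .split(',')
    .map((coding) => coding.trim())
    .filter(Boolean)
    .reverse()

  let stream = body

  for (const coding of codings) {
    if (coding === 'gzip' || coding === 'x-gzip') {
      stream = stream.pipeThrough(new DecompressionStream('gzip'))
    } else if (coding === 'deflate') {
      stream = stream.pipeThrough(new DecompressionStream('deflate'))
    } else {
      // "br" (and anything else) is intentionally not handled in the browser.
      throw new Error(`Unsupported content-encoding in the browser: "${coding}"`)
    }
  }

  return stream
}
```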
Merged commit a5447cd into Michael/support-fetch-content-encoding
Uses the Compression Streams API to decompress encoded fetch responses.
Why
Using the Compression Streams API has some advantages: it is a web standard, and it is not tied to Node.js, unlike the previous zlib-based approach.

It does, however, have one significant downside: it cannot decompress Brotli (br) responses.
We may consider investing effort into web stream-based Brotli decompression. From what I've seen, it's usually a C package loaded via WASM. None of the packages I found are web stream-friendly, though, so we'd have to abstract on top of them, which isn't ideal (Node.js still helps a lot with that via Readable.toWeb/fromWeb). Loading a WASM module can also impact performance.
Blockers

brotli-wasm is overall amazing, but it has trouble initializing the WASM in Node.js due to two reasons:

- Node's fetch (undici) does not support the file URL scheme (Provide a more descriptive error on fetching "file" scheme, nodejs/undici#3741);
- brotli-wasm attempts a file fetch to begin with, because it always resolves the import export condition to the web build (Incorrect "exports" in package.json, httptoolkit/brotli-wasm#38).