feat: http caching #3562
base: main
Conversation
Would it maybe be better to use jsdoc types rather than separate type definition files? @mcollina
lib/interceptor/cache.js
Outdated
// Dump body
opts.body?.on('error', () => {}).resume()

const result = globalOpts.store.get(opts)
I wouldn't add support just yet for Promises; maybe keep it sync for now and see if it's needed later.
cc @mcollina wdyt?
We really should support promises from the get-go; we should consider adding an fs-based store using promises too.
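One way to support both flavors without complicating the interceptor is to normalize the store's return value. This is a hypothetical helper, not undici's actual API: `Promise.resolve()` passes promises through untouched and wraps plain values, so the caller can always `await` regardless of whether the store is synchronous (in-memory) or promise-based (e.g. a future fs-backed one).

```javascript
// Hypothetical sketch: normalize a cache store whose get() may be
// synchronous or promise-returning. Names (store, opts) are illustrative.
function getFromStore (store, opts) {
  // Promise.resolve() leaves thenables untouched and wraps plain values
  return Promise.resolve(store.get(opts))
}
```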
Having second thoughts on this. Most of the APIs are callback-based; shall we do something similar? e.g. retry allows async ops by passing a callback that triggers (or cancels) the next retry.
I think using type definitions is in line with the rest of the codebase, and therefore the correct implementation. If we want to migrate to a jsdoc-generated world, let's discuss that in another issue!
return this.#handler.onData(chunk)
}

onComplete (rawTrailers) {
We should also invalidate the cache upon a successful status code for non-safe methods (e.g. PUT, POST, PATCH): https://www.rfc-editor.org/rfc/rfc9111.html#name-invalidating-stored-respons
Trying to think of a good way to tell the cache store to purge all of the values for a certain host. Wdyt of just something like store.deleteAllFromHost('localhost')?
Yeah, that could work; I imagine it will require changing the way we store the data?
Hmm, I think this also means we need to always run this interceptor for unsafe methods, and the methods property in the options can just detail what safe methods the user wants to cache. Wdyt?
Implements bare-bones http caching as per rfc9111 Closes nodejs#3231 Closes nodejs#2760 Closes nodejs#2256 Closes nodejs#1146 Co-authored-by: Carlos Fuentes <[email protected]> Co-authored-by: Robert Nagy <[email protected]> Co-authored-by: Isak Törnros <[email protected]> Signed-off-by: flakey5 <[email protected]>
headers could be null or undefined
@ronag
Should we use fastTimers?
const nextPart = directives[j]
if (nextPart.endsWith('"')) {
  foundEndingQuote = true
  headers.push(...directives.splice(i + 1, j - 1).map(header => header.trim()))
horrible! why??
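For context on why a splice like that shows up at all: splitting a Cache-Control value on commas breaks quoted directive arguments such as `no-cache="set-cookie, authorization"` into separate parts, which then have to be re-joined until the closing quote. A self-contained sketch of that parsing (not the PR's code, just the technique):

```javascript
// Parse a Cache-Control header, re-joining comma-split quoted lists.
function parseCacheControl (value) {
  const parts = value.split(',')
  const out = {}
  for (let i = 0; i < parts.length; i++) {
    const part = parts[i].trim()
    const eq = part.indexOf('=')
    if (eq === -1) {
      // Flag directive, e.g. public or no-store
      out[part.toLowerCase()] = true
      continue
    }
    const name = part.slice(0, eq).toLowerCase()
    let val = part.slice(eq + 1)
    // Re-join the pieces that the comma split broke off the quoted list
    while (val.startsWith('"') && !val.endsWith('"') && i + 1 < parts.length) {
      val += ',' + parts[++i]
    }
    if (val.startsWith('"') && val.endsWith('"')) {
      out[name] = val.slice(1, -1).split(',').map(s => s.trim())
    } else {
      out[name] = val
    }
  }
  return out
}
```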
I cheated by using vscode to make the review of the files in lib by adding
Co-authored-by: Aras Abbasi <[email protected]>
Co-authored-by: Aras Abbasi <[email protected]>
Signed-off-by: flakey5 <[email protected]>
Please revert the fastTimer changes for now.
AHA! In the case of the interceptor, instead of calling this.#handler.onError you could call super.onError?!
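A minimal stand-in illustrating that suggestion (not undici's real DecoratorHandler, just the shape of it): the base class forwards to the wrapped handler, so subclasses call `super.onError(err)` and the optional-callback guard lives in one place.

```javascript
// Sketch of a decorator base class; undici's actual DecoratorHandler
// lives in lib/handler/decorator-handler.js and differs in detail.
class DecoratorHandler {
  #handler
  constructor (handler) {
    this.#handler = handler
  }

  onError (err) {
    // Guarding the possibly-undefined callback happens here, once
    return this.#handler.onError?.(err)
  }
}

class CacheHandler extends DecoratorHandler {
  onError (err) {
    // cache-specific cleanup would go here (e.g. destroy the write stream)
    return super.onError(err)
  }
}
```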
Signed-off-by: flakey5 <[email protected]>
if (typeof this.#handler.onError === 'function') {
  this.#handler.onError(err)
}
this would always be set
The types allow it to be undefined however?
Line 220 in 0ad6007
onError?(err: Error): void;
Yeah, not sure why that is, but I haven't found that scenario at all.
Since it's allowed in the type I think we should check for it still
return true
}

return this.#handler.onHeaders(
The function would always be there, as it marks the kick-off of processing the response; it shouldn't be possible to encounter a missing onHeaders (the whole process would be wrong if it's not provided). Same for handler.onError.
Co-authored-by: Aras Abbasi <[email protected]>
I think we should change the cache store interface to:
interface CacheStore {
  createWriteStream(key, props)
  createReadStream(key)
}
and move the max entry size into the cache store. That way we can store any size of response without necessarily buffering it through memory.
@mcollina wdyt?
Something like:
class CacheHandler extends DecoratorHandler {
  onHeaders (statusCode, rawHeaders, resume, statusMessage, headers) {
    this.#value = this.#store
      .createWriteStream({ statusCode, rawHeaders, statusMessage, key: this.#key, headers })
      .on('drain', resume)
      .on('error', () => {
        // TODO
      })
    return this.#handler.onHeaders(statusCode, rawHeaders, resume, statusMessage, headers)
  }

  onData (chunk) {
    return this.#handler.onData(chunk) && this.#value.write(chunk)
  }

  onComplete (rawTrailers) {
    this.#value.rawTrailers = rawTrailers
    this.#value.end()
    return this.#handler.onComplete(rawTrailers)
  }

  onError (err) {
    this.#value.destroy(err)
    return this.#handler.onError(err)
  }
}
Signed-off-by: flakey5 <[email protected]>
I'd be in favor of this - I think this would also work a lot better than what I was thinking for reusing responses too (https://www.rfc-editor.org/rfc/rfc9111.html#section-4-7). If a cache store was writing to some async non-memory store this should also still work great.
The max entry size is already in the cache store though? Or unless you were talking about where we check the response size in the
Signed-off-by: flakey5 <[email protected]>
 * @param {number} size
 */
_read (size) {
  // TODO what to do if we get a read but we're still waiting on more chunks?
not sure what we could do here
Implements bare-bones opt-in http caching as per rfc9111. Bare-bones in this case means what's required by the spec and a few extra bits, with more coming in future prs.
Opening as a draft since there's still some more work to be done (mostly tests, but a bit more functionality-wise). No request cache directives are supported at this time; this will come later.
Response caching directives supported:
public
private
s-maxage
max-age
Expires header
no-cache
no-store
stale-while-revalidate
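The supported directives above combine into a freshness lifetime roughly as RFC 9111 §4.2.1 describes: s-maxage (for shared caches) takes precedence over max-age, which takes precedence over the Expires header. A hedged sketch of that precedence; parameter names are illustrative:

```javascript
// Compute freshness lifetime in seconds from already-parsed directives.
function freshnessLifetime ({ sMaxage, maxAge, expires, date }) {
  if (typeof sMaxage === 'number') return sMaxage
  if (typeof maxAge === 'number') return maxAge
  if (expires instanceof Date && date instanceof Date) {
    // Expires is relative to the response's Date header
    return Math.max(0, (expires.getTime() - date.getTime()) / 1000)
  }
  return 0 // no explicit expiration; heuristic freshness (not shown) may apply
}
```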
This relates to...
Closes #3231
Closes #2760
Closes #2256
Closes #1146
Changes
Features
Bug Fixes
n/a
Breaking Changes and Deprecations
n/a
Status