Assume I have a buffer of type Bytes.t in which the data I would like to decode starts at some offset o. Is there a way to decode the bytes without paying for an extra allocation (via Bytes.sub, for example)?
The interface only seems to build a decoder from an entire buffer of type Bytes.t or String.t. Did I miss something?
I have the impression that a similar remark applies to encoding values.
Looking quickly at the source code, I have the feeling this would be quite easy to implement, since the internal representation of a buffer already carries this notion of offset.
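For concreteness, here is a minimal sketch of the workaround I use today and the kind of offset-aware entry point I am asking about; `decode` and `decode_at` are hypothetical names standing in for the library's decoding function, not its actual API.

```ocaml
(* Current workaround: copy the relevant slice first, which allocates
   a fresh Bytes.t just to hand it to the decoder.  [decode] stands
   for the library's existing whole-buffer entry point. *)
let decode_slice decode (buf : Bytes.t) ~off ~len =
  decode (Bytes.sub buf off len)

(* What I am asking about instead: a hypothetical offset-aware
   variant that reads directly from [buf], with no intermediate copy:
   val decode_at : Bytes.t -> off:int -> len:int -> 'a *)
```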
On paper we could have an option to represent the value as a `bytes * int * int` (just like there's an option to use, say, int instead of int32). Then encoding is still trivial (a `Bytes.blit`) and decoding offers a slice of the decoded string (so maybe you get a slice of string, not bytes, and it's on the user to use `Bytes.unsafe_to_string` where appropriate).
I think it makes sense! PR welcome if you have the time.
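For illustration, a minimal sketch of what such a `bytes * int * int` slice representation could look like on both sides; the `slice` type and the function names below are hypothetical and not part of this library's API.

```ocaml
(* A decoded value is just a window into an existing buffer. *)
type slice = { buf : string; off : int; len : int }

(* Encoding stays trivial: blit the window into the destination
   buffer and return the new write position. *)
let encode_slice ~dst ~dst_off { buf; off; len } =
  Bytes.blit_string buf off dst dst_off len;
  dst_off + len

(* Decoding hands back a window into the source, with no copy; the
   caller only pays for String.sub (or uses Bytes.unsafe_to_string on
   bytes they own) when an owned copy is actually needed. *)
let decode_slice ~src ~src_off ~len =
  { buf = src; off = src_off; len }
```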