feat: validate uploaded parts before completing upload #13763
Conversation
```diff
@@ -8,16 +8,9 @@
 export const byteLength = (input?: any): number | undefined => {
 	if (input === null || input === undefined) return 0;
 	if (typeof input === 'string') {
-		let len = input.length;
+		const blob = new Blob([input]);
```
Worried this might introduce compatibility issues on RN. What was wrong with the old logic for string inputs?
This is just aligning the length evaluation with `getDataChunker`. I was concerned about the possibility of `byteLength` and `getDataChunker` determining different sizes. There shouldn't be any issue with RN, because it already has to work in `getDataChunker` — unless there is an expected RN limitation specific to single-part uploads.
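For context, a minimal sketch (not from the PR) of the discrepancy being avoided: `String.prototype.length` counts UTF-16 code units, while `Blob` reports the UTF-8 byte size that is actually uploaded, so the two disagree for any non-ASCII string.

```ts
// Illustrative sketch: why a string's .length can disagree with the byte
// size the chunker sees. Byte counts below assume UTF-8 encoding.
const input = 'héllo 👋';

console.log(input.length); // 8  — UTF-16 code units ('👋' is a surrogate pair)
console.log(new Blob([input]).size); // 11 — UTF-8 bytes ('é' is 2, '👋' is 4)
```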
```diff
@@ -193,6 +195,8 @@ export const getMultipartUploadHandlers = (
 		});
 	}
 
+	validateCompletedParts(inProgressUpload.completedParts, size!);
```
Is it safe to coerce the size? Maybe we should only run the validation if size is set. Specifically, from the inline docs on `byteLength`: "The total size is not required for multipart upload, as it's only used in progress report." If we do want to make sure this always runs, can we just update the types to not require the assertion?
`byteLength` is a bit odd, because in any case where it would return `undefined`, the upload is rejected by `getDataChunker`. For any valid multipart upload, `byteLength` must be able to determine the size, which makes sense because we can't do a multipart upload without knowing the size. I considered updating `byteLength` to throw if it can't determine the size, but that would change the behaviour of that failure case and take more refactoring. As for "only used in progress report", it's actually used throughout the upload process.
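To make the dependency on size concrete, here is a hypothetical sketch of what the part validation could look like; the actual `validateCompletedParts` in this PR may differ, and the per-part `size` field is an assumption for illustration. A check like this can only run when the total size is known, which matches the argument above.

```ts
// Hypothetical sketch only; not the actual amplify-js implementation.
// Assumes each completed part records its own byte size.
interface CompletedPartWithSize {
  PartNumber: number;
  eTag: string;
  size: number;
}

const validateCompletedParts = (
  completedParts: CompletedPartWithSize[],
  totalSize: number,
): void => {
  const uploadedSize = completedParts.reduce(
    (sum, part) => sum + part.size,
    0,
  );
  if (uploadedSize !== totalSize) {
    // A mismatch means parts were lost or truncated before completion.
    throw new Error(
      `Uploaded parts total ${uploadedSize} bytes; expected ${totalSize}.`,
    );
  }
};
```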
Got it, thanks for the explanation
Two things here:
- What happens here when the size is undefined for some reason? I'm still unclear on that.
- On "it's actually used throughout the upload process" — can we update the inline docs accordingly then?
If size is undefined, the upload will default to a multipart upload. When it gets to `getDataChunker`, it will throw an error and the upload will fail.
Makes sense. I will make the total length explicitly defined. Right now the contract is implicitly defined but marked as optional.
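As an illustration of that tightening (names here are hypothetical, not the actual amplify-js types), the change amounts to moving the `size!` assertion from the call site into the type itself:

```ts
// Before: size was optional, so call sites asserted it with `size!`.
interface MultipartUploadDataBefore {
  totalLength?: number;
}

// After: the multipart path requires a known total length up front,
// so no non-null assertion is needed where it's consumed.
interface MultipartUploadDataAfter {
  totalLength: number;
}
```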
Force-pushed from d013fbe to 5a9a212
Rebased the PR and fixed the unit test. A more noticeable change I made is that now …
```ts
// If the upload source sticks to the suggested types, the byteLength can be
// determined here.
```
What if it doesn't "stick to the suggested types"?
We will throw the same validation error later, when we are chunking the data for the multipart upload:
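A hedged sketch of the kind of guard being referenced (the snippet originally linked here isn't reproduced, and the real `getDataChunker` implementation may differ):

```ts
// Sketch only: unsupported sources are rejected before any parts are
// uploaded, so an input whose byteLength couldn't be determined earlier
// fails with the same validation error here anyway.
function* getDataChunker(data: unknown, chunkSize: number): Generator<Blob> {
  if (
    typeof data !== 'string' &&
    !(data instanceof Blob) &&
    !(data instanceof ArrayBuffer)
  ) {
    throw new Error('Unsupported upload data type.');
  }
  const blob = new Blob([data as BlobPart]);
  for (let offset = 0; offset < blob.size; offset += chunkSize) {
    yield blob.slice(offset, offset + chunkSize);
  }
}
```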
Maybe we should update the comment here; "suggested types" reads as though the upload doesn't have to be the expected shape 😅
@calebpollman Addressed the comment in 052f11d (#13763)
Resolved review threads:
- packages/storage/src/providers/s3/apis/internal/uploadData/multipart/uploadHandlers.ts (outdated)
- packages/storage/src/providers/s3/apis/internal/uploadData/multipart/uploadHandlers.ts
- packages/storage/__tests__/providers/s3/utils/getCombinedCrc32.test.ts (outdated)
Merged 4ff7b52 into aws-amplify:storage-browser/integrity
Description of changes

Updated `byteLength` to align with the upload part generator. When validating parts, it's required to have the same size calculation. Other data types are already aligned.

Issue #, if available
Description of how you validated changes
Checklist
- `yarn test` passes

Checklist for repo maintainers
By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.