Adding limit to total size of uploaded files from analysis #3169
Thanks for contributing! But first: did you read our community guidelines?
https://cuckoo.sh/docs/introduction/community.html
What I have added/changed is:
If you pass

`max_total_size_of_uploaded_files=[0|int>0]`

as a task option, the analyzer will only upload files from the analysis as long as their total size stays within the limit; any file that would push the running total past the limit is skipped.
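Not the patch itself, but a minimal sketch of the gating behavior in Python; `UploadBudget`, `upload_files`, and `upload_to_host` are illustrative names rather than Cuckoo's actual analyzer API, and I'm assuming a value of `0` disables the limit:

```python
import logging
import os

log = logging.getLogger(__name__)

class UploadBudget:
    """Tracks the cumulative size of uploaded files and refuses any
    upload that would push the running total past the limit."""

    def __init__(self, max_total_size):
        # Assumption: 0 (or a missing option) means "no limit",
        # matching the [0|int>0] semantics described above.
        self.max_total_size = max_total_size
        self.uploaded = 0

    def allows(self, path):
        size = os.path.getsize(path)
        if self.max_total_size and self.uploaded + size > self.max_total_size:
            return False
        self.uploaded += size
        return True

def upload_files(paths, options, upload_to_host):
    # "max_total_size_of_uploaded_files" is the task option proposed
    # in this PR; upload_to_host stands in for the real transfer to
    # the resultserver.
    budget = UploadBudget(int(options.get("max_total_size_of_uploaded_files", 0)))
    for path in paths:
        if budget.allows(path):
            upload_to_host(path)
        else:
            log.info("Skipping %s: total upload size limit reached", path)
```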
The goal of my change is:
There are samples, such as this one, that behave in such a way that the Cuckoo analyzer ends up uploading ~1.3 GB of files and ~2.8 GB of logs to the resultserver, which may cause issues depending on how you use Cuckoo's results after the analysis (e.g. using the REST API to retrieve a tarball that contains all of this! 💀). Therefore, I propose adding an option to limit the total size of the files uploaded from an analysis.
What I have tested about my change is:
Manual testing.
Let me know what you think!