When fuzzing programs like flvmeta, we found that T-Fuzz generated too many transformed binaries within 24 hours (du -sh shows 9.1G), which takes too much disk space and impacts I/O performance.
According to your code, whenever AFL gets stuck, NCCDetector is run to generate a batch of transformed binaries, which are then fuzzed one by one. Could this be improved by setting a maximum number of generated programs? Or by immediately starting a new fuzzing process after each transformed binary is generated?
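To illustrate the second suggestion, here is a minimal sketch (not T-Fuzz's actual API) of capping how many transformed binaries are generated and fuzzing each one right after it is produced, instead of generating the whole batch up front. `generate_transformed_binary`, `fuzz_binary`, and `MAX_TRANSFORMED` are hypothetical stand-ins:

```python
MAX_TRANSFORMED = 50  # hypothetical cap, tuned to available disk space

def generate_transformed_binary(target, ncc):
    # stand-in: in T-Fuzz this would write a patched binary to disk
    return f"{target}.transformed.{ncc}"

def fuzz_binary(binary):
    # stand-in: in T-Fuzz this would launch an AFL instance on the binary
    return f"fuzzed {binary}"

def transform_and_fuzz(target, nccs):
    results = []
    for ncc in nccs[:MAX_TRANSFORMED]:  # stop once the cap is reached
        binary = generate_transformed_binary(target, ncc)
        # fuzz immediately so disk usage stays bounded instead of
        # accumulating the full batch of binaries first
        results.append(fuzz_binary(binary))
    return results
```

With a cap like this, disk usage per stuck state is bounded by `MAX_TRANSFORMED` binaries rather than growing with every NCC the detector finds.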
The text was updated successfully, but these errors were encountered: