Limiting concurrency #282
Duplicate of #242. One of the suggested solutions is to use promise-concurrent or some other library that can handle that.
But that's not a good solution, because it involves spawning multiple PhantomJS processes, each of which has significant startup time and memory overhead. By sharing a single PhantomJS process you can save a huge amount of memory and time on large batches of URLs. I am willing to do a PR for this if you would consider it.
@callumlocke, you'd have to do it in
I'm trying to take 1000 screenshots of different URLs on my localhost, and am running into problems.
The problem with option 1 is that it takes forever (I guess because it spins up a new PhantomJS process for every URL).
The problem with option 2 is that it seems to hit my local web server with all 1000 requests at the same time, which it can't handle.
I want to use the option 2 approach, but throttled so it only makes 10 requests at any one time, something like `const pageres = new Pageres({ concurrency: 10 });`. Would that be possible?
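A library-agnostic way to get the throttling described above is a small promise pool that keeps at most N tasks in flight at once. This is only a sketch: `runPool` is a hypothetical helper, not part of the pageres API, and it assumes each task is an async function (e.g. one that screenshots a single URL).

```javascript
// Minimal promise pool: run async tasks with at most `limit` in flight.
// `runPool` is an illustrative helper, not a pageres API.
// Workers synchronously claim the next task index until the queue is
// exhausted, so no more than `limit` tasks ever run concurrently.
async function runPool(tasks, limit) {
  const results = new Array(tasks.length);
  let next = 0;

  async function worker() {
    while (next < tasks.length) {
      const i = next++; // claim a task index (single-threaded, so no race)
      results[i] = await tasks[i]();
    }
  }

  const workers = Array.from(
    { length: Math.min(limit, tasks.length) },
    () => worker()
  );
  await Promise.all(workers);
  return results; // in the same order as `tasks`
}
```

With something like this, the 1000 URLs could each be wrapped as a task and run with `runPool(tasks, 10)`, so the local web server only ever sees 10 in-flight requests at a time.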