If we want to handle lots of interests, optimizing the process definitely wouldn't hurt.
I imagine our bottleneck right now is calls to Google Location services, which should drop as our cache builds up. It may be worth separating the concerns a bit: build a queue of the raw locations we still need, then resolve that queue separately, in parallel. Just a thought, and it may not even be worth our time for now.
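A minimal sketch of that idea, assuming a simple in-memory cache and a thread pool; `lookup_location` is a hypothetical stand-in for the real Google Location services call, and all names here are assumptions, not our actual code:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical geocoding cache; keys are raw location strings.
cache = {}

def lookup_location(raw):
    # Placeholder for the external Google Location services call.
    return {"query": raw, "resolved": raw.title()}

def resolve(raw):
    # Serve from the cache when possible; only cache misses hit the API.
    if raw not in cache:
        cache[raw] = lookup_location(raw)
    return cache[raw]

def resolve_all(raw_locations, workers=4):
    # Deduplicate the queue first so each unique location is fetched once,
    # then resolve the misses in parallel.
    unique = list(dict.fromkeys(raw_locations))
    with ThreadPoolExecutor(max_workers=workers) as pool:
        list(pool.map(resolve, unique))
    return [cache[raw] for raw in raw_locations]
```

Deduplicating before dispatch keeps parallel workers from racing to fetch the same key, and the cache means repeat runs get cheaper as it fills.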
Very interesting idea, and I completely agree that we should do this; I just don't know when. I think it's important that we start getting baseline results and tracking how long the script takes to run. Building another app to collect the stat objects so we can browse through them may be a good next step. If we can include timestamps for each function (or each section of the process), that would tell us where the bottleneck actually is and give us something to compare any future improvements against.
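One lightweight way to get those per-section timestamps, sketched here with a decorator that appends durations to a dict we could later dump into the stat objects; the names (`timed`, `timings`, `geocode`) are illustrative assumptions, not existing code:

```python
import time
from functools import wraps

# Hypothetical store of per-section durations, keyed by section name.
timings = {}

def timed(name):
    # Decorator that records how long each call to the wrapped
    # function takes, even if it raises.
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            try:
                return fn(*args, **kwargs)
            finally:
                timings.setdefault(name, []).append(
                    time.perf_counter() - start
                )
        return wrapper
    return decorator

@timed("geocode")
def geocode(raw):
    time.sleep(0.01)  # stand-in for the real location lookup
    return raw
```

Attaching the collected `timings` to each run's stat object would give us a baseline to diff against after any optimization.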