
MAX_POOLS refactor #17

Merged
merged 3 commits into from
Sep 11, 2024

Conversation

ethzoomer

Changes the output of the internal _pools() function to be the fixed size MAX_POOLS. The output of _pools() no longer pads the front of the array with placeholder data. MAX_ITERATIONS is a new, higher constant that bounds the number of pools that can be looped through. MAX_POOLS shouldn't be increased going forward.
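The split between the two constants can be sketched in simplified Python (the actual contract is Vyper; the constant values, function name, and pagination shape here are illustrative assumptions, not the contract's real values):

```python
MAX_POOLS = 100        # fixed output size; should not grow (illustrative value)
MAX_ITERATIONS = 1000  # loop bound, may exceed MAX_POOLS (illustrative value)

def pools_page(all_pools, limit, offset):
    """Return up to MAX_POOLS entries without front-padding placeholder data."""
    out = []
    visited = 0
    for pindex in range(MAX_ITERATIONS):
        # stop once the source is exhausted or the requested window is scanned
        if pindex >= len(all_pools) or visited >= limit + offset:
            break
        visited += 1
        # only entries past the offset are appended, up to the fixed cap
        if visited > offset and len(out) < MAX_POOLS:
            out.append(all_pools[pindex])
    return out
```

The key point is that the output array is capped at MAX_POOLS while the scan itself may run for up to MAX_ITERATIONS entries.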

@ethzoomer ethzoomer requested a review from stas September 10, 2024 18:40

@stas stas left a comment


This looks great!

Left some notes as I'm concerned an arbitrary _limit + _offset could lead to an overflow if we don't have checks.

Also, could you please add a test for the call tokens(2000, 1000, 0x0000000000000000000000000000000000000000, [])?

It seems to fail for the same reason we have this PR.

-for pindex in range(0, MAX_POOLS):
-  if pindex >= pools_count or len(pools) >= _limit + _offset:
+for pindex in range(0, MAX_ITERATIONS):
+  if pindex >= pools_count or visited >= _limit + _offset:
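The reason a separate visited counter matters can be shown with a simplified Python sketch (hypothetical helper names, not the contract code): len(pools) only grows for entries appended after the offset, so using it in the termination condition lets the loop scan, and return, more than _limit entries.

```python
def paginate_buggy(items, limit, offset):
    # condition uses len(out): entries skipped by the offset never
    # advance it, so more than `limit` entries end up returned
    out = []
    for i, item in enumerate(items):
        if len(out) >= limit + offset:
            break
        if i >= offset:
            out.append(item)
    return out

def paginate_fixed(items, limit, offset):
    # condition uses a visited counter: the scan stops after
    # limit + offset entries, so at most `limit` are returned
    out = []
    visited = 0
    for i, item in enumerate(items):
        if visited >= limit + offset:
            break
        visited += 1
        if i >= offset:
            out.append(item)
    return out
```

With limit=3 and offset=2 over ten items, the buggy variant returns five entries while the fixed variant returns three.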

We'll also need to add a len(pools) >= MAX_POOLS check here, otherwise we risk overflowing again since MAX_ITERATIONS is larger than MAX_POOLS.
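A minimal Python sketch of the suggested guard (illustrative constant values, not the contract's):

```python
MAX_POOLS = 4        # illustrative output cap
MAX_ITERATIONS = 100  # illustrative loop bound, larger than MAX_POOLS

def collect(items):
    out = []
    for i in range(MAX_ITERATIONS):
        # the len(out) >= MAX_POOLS clause caps the output even though
        # the loop bound itself allows many more iterations
        if i >= len(items) or len(out) >= MAX_POOLS:
            break
        out.append(items[i])
    return out
```

Without the len(out) check, a source longer than MAX_POOLS would push the fixed-size output past its capacity.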

@@ -324,7 +326,7 @@ def forSwaps(_limit: uint256, _offset: uint256) -> DynArray[SwapLp, MAX_POOLS]:
     nfpm: address = self._fetch_nfpm(factory.address)
     pools_count: uint256 = factory.allPoolsLength()

-    for pindex in range(0, MAX_POOLS):
+    for pindex in range(0, MAX_ITERATIONS):

We'll need a check to break when len(pools) >= MAX_POOLS now that there might be more iterations.

@@ -394,7 +396,7 @@ def tokens(_limit: uint256, _offset: uint256, _account: address, \
         col.append(self._token(_addresses[index], _account))
         seen.append(_addresses[index])

-    for index in range(_offset, _offset + MAX_TOKENS):
+    for index in range(0, MAX_TOKENS):

Thinking we'll need to change this to MAX_POOLS instead of MAX_TOKENS, to keep the iterations bounded to the size of the pools.
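The reasoning can be sketched in simplified Python (hypothetical names and values; each pool is assumed to contribute a pair of tokens, which is why a bound tied to the pool count suffices):

```python
MAX_POOLS = 4   # illustrative pool cap
MAX_TOKENS = 8  # illustrative; using it as the loop bound would over-iterate

def pool_tokens(pools):
    # iterate at most MAX_POOLS times, collecting both tokens per pool;
    # the token count is bounded by the pool count, so MAX_POOLS is the
    # right loop bound here
    tokens = []
    for index in range(MAX_POOLS):
        if index >= len(pools):
            break
        tokens.extend(pools[index])  # (token0, token1)
    return tokens
```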

@ethzoomer ethzoomer requested a review from stas September 10, 2024 23:30
@ethzoomer ethzoomer merged commit fe06759 into v3 Sep 11, 2024
2 checks passed