
log warning for batch_size 1 #1503

Open: wants to merge 3 commits into main

Conversation

@horheynm (Member) commented Jan 3, 2024

Show warning when overriding batch_size 0 to 1

https://app.asana.com/0/1201735099598270/1206262288703592

@horheynm horheynm marked this pull request as ready for review January 3, 2024 22:31
@dbarbuzzi (Contributor) commented Jan 3, 2024

Could this be made more in line with the already existing validation check, or perhaps just use that check instead? See https://github.com/neuralmagic/deepsparse/blob/main/src/deepsparse/engine.py#L108

Separately, while not a huge deal, I would be curious to know why the existing check catches, e.g., -3 but not 0 in otherwise identical sample code. The test that catches the missing warning when batch_size is 0 is simply:

Pipeline.create(task="text_classification", batch_size=batch_size)

In that test, the existing warning is raised if batch_size is -3, but not if it is 0, which seems odd; it is almost as if 0 is being handled separately somewhere before the check reaches it (perhaps a truthiness check on batch_size, since 0 is falsy but -3 is truthy).
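The suspected truthiness pitfall can be illustrated with a minimal, self-contained sketch (these helper names are hypothetical, not deepsparse's actual functions):

```python
import warnings

def check_truthy(batch_size):
    # Hypothetical guard illustrating the bug: `if batch_size` silently
    # skips 0 (falsy) while still catching -3 (truthy).
    if batch_size and batch_size < 1:
        warnings.warn("batch_size < 1, overriding to 1")
        return 1
    return batch_size

def check_explicit(batch_size):
    # Explicit comparison catches 0 and negatives alike.
    if batch_size < 1:
        warnings.warn("batch_size < 1, overriding to 1")
        return 1
    return batch_size
```

With `check_truthy`, 0 slips through unwarned and unchanged, while -3 triggers the warning; `check_explicit` treats both values the same way.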

@derekk-nm commented

@dbarbuzzi, good catch. @horheynm, it would be good to understand why pipeline.py overrides batch_size when engine.py already handles it differently. And if the statements on the old lines 195 and 209 are going to stay, they should probably be refactored into a single line in the appropriate place in the init function.

@horheynm (Member, Author) commented Jan 4, 2024

(quoting @dbarbuzzi's comment above in full)

Yeah, I actually wanted to make the change there for overriding the values, but:

  1. I don't understand why the current override happens where it does, near my current change.
  2. That code path is a common highway through the code base, so changing it could cause issues in other pipelines.

I would prefer changing the values in the location you pointed to, since otherwise I also have to initialize a new logger.

If there are no conflicts from changing the code snippet you mentioned, then of course it's better to edit that line.

@bfineran Any issues changing the snippet of code in engine.py?
https://github.com/neuralmagic/deepsparse/blob/main/src/deepsparse/engine.py#L108

@mgoin (Member) commented Jan 4, 2024

@horheynm @dbarbuzzi @derekk-nm batch_size=0 was added as a special case to disable the engine from setting the model's batch size. This is important because some models do not have a batch size at all, so we need a way to express that case to the engine. See PR #1169 for where this was originally introduced.
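As a sketch of how such a sentinel might behave (the helper name and return convention here are hypothetical, not deepsparse's actual API):

```python
import logging

_LOGGER = logging.getLogger(__name__)

def resolve_batch_size(requested):
    # Hypothetical helper: any value below 1 is treated as a sentinel
    # meaning "do not set a batch size on the model", which matters for
    # models that have no batch dimension at all. Positive values are
    # applied to the model as-is.
    if requested < 1:
        _LOGGER.warning(
            "batch_size < 1 disables static batch size; "
            "the model's batch dimension is left as-is"
        )
        return None  # None -> engine does not override the model's batch size
    return requested
```

Under this convention, 0 and -3 both mean "leave the batch dimension alone", which is consistent with the engine treating anything under 1 as the special case.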

@dbarbuzzi (Contributor) commented

Thanks for that info, @mgoin! So it sounds like the changes we need are not to "fix" any logic, but just to ensure that this special case is properly surfaced in a log message?

@dbogunowicz (Contributor) commented

but it is, isn't it: https://github.com/neuralmagic/deepsparse/blob/main/src/deepsparse/engine.py#L110

@dbarbuzzi (Contributor) commented

Yes, I misunderstood: it is not exactly 0 that is the special case, but anything under 1.

That brings us back to the original problem: when using 0 specifically, that log message is not triggered/displayed the way it is with -3. That inconsistency is what this PR is meant to address.

@bfineran (Contributor) left a comment


For context, I believe this setting was added so that users who set the pipeline kwarg batch_size to 0 get the "fake" dynamic batch we used to support; in practice this needs to be set to 1.

There are two options here:

  1. We can drop this convenience and require users of the legacy pipeline to set batch_size to 1 manually for dynamic batch (to get the engine-disable behavior, batch_size would then need to be set to < 0 instead of < 1).
  2. We can drop this override and allow the engine to raise its usual warning even when batch_size is set to 0.
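Option 2 can be sketched as a single, shared check in the engine layer that warns for any value below 1, including 0, with the pipeline no longer rewriting 0 to 1 before the check can see it (names here are hypothetical, not the actual engine.py code):

```python
import logging

_LOGGER = logging.getLogger("engine")

def validate_batch_size(batch_size):
    # Shared engine-side check (hypothetical sketch): warn for ANY value
    # below 1, including 0, instead of the pipeline layer silently
    # rewriting 0 -> 1 before this check runs.
    if batch_size < 1:
        _LOGGER.warning(
            "batch_size %s < 1; treating as dynamic batch, "
            "running with batch size 1 semantics",
            batch_size,
        )
        return 1
    return batch_size
```

Because both 0 and -3 flow through the same branch, the warning behavior the PR is after becomes consistent without a second override in pipeline.py.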

Labels: none yet
Projects: none yet
6 participants