Compressed lifecycle implementation (INT8 only) #33

Merged: 17 commits merged into main from compressed-lifecycle on May 7, 2024

Conversation

bfineran (Contributor)

Implements the COMPRESSED piece of the quantization lifecycle. Currently assumes INT8 format and will clip for any num_bits less than 8.

Test plan:
Extends the apply test to cover the compression phase as well.
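
As a rough illustration of the clipping behavior described above, here is a minimal sketch assuming a symmetric round-then-clip scheme; the function name and signature are hypothetical, not the PR's actual API:

```python
import torch

def compress_weight(weight: torch.Tensor, scale: torch.Tensor,
                    zero_point: torch.Tensor, num_bits: int = 8) -> torch.Tensor:
    # Values are always stored as int8; for num_bits < 8 the quantized
    # values are first clipped to the narrower range implied by num_bits.
    q_min = -(2 ** (num_bits - 1))
    q_max = 2 ** (num_bits - 1) - 1
    quantized = torch.round(weight / scale + zero_point)
    return torch.clamp(quantized, q_min, q_max).to(torch.int8)
```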

@bfineran bfineran self-assigned this Apr 22, 2024
@Satrat Satrat (Contributor) left a comment

This looks good to get runtime unblocked, but it would be good to implement all compression under a QuantizationCompressor parent class, similar to what we do for SparsityCompressor. Also, I think we need to discuss further how we want to handle the quantization status differing between layers (or whether we want to support that at all).
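
A rough sketch of the suggested structure (class and method names here are assumptions mirroring the comment, not the repo's actual API):

```python
from abc import ABC, abstractmethod
from typing import Dict

import torch


class QuantizationCompressor(ABC):
    """Parent class for quantization compressors, analogous to SparsityCompressor."""

    @abstractmethod
    def compress(self, model_state: Dict[str, torch.Tensor]) -> Dict[str, torch.Tensor]:
        """Return a compressed copy of the state dict."""

    @abstractmethod
    def decompress(self, compressed_state: Dict[str, torch.Tensor]) -> Dict[str, torch.Tensor]:
        """Reconstruct a dense state dict from its compressed representation."""


class Int8QuantizationCompressor(QuantizationCompressor):
    """INT8-only format, matching the behavior added in this PR."""

    def compress(self, model_state):
        ...  # quantize, clip to the num_bits range, and store weights as int8

    def decompress(self, compressed_state):
        ...  # dequantize int8 weights back to the original dtype
```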

Satrat previously approved these changes May 3, 2024
@Satrat Satrat self-requested a review May 3, 2024 20:11
@Satrat Satrat self-requested a review May 7, 2024 15:49
Satrat previously approved these changes May 7, 2024
@Satrat Satrat merged commit 964276d into main May 7, 2024
2 checks passed
@Satrat Satrat deleted the compressed-lifecycle branch May 7, 2024 16:17