
Checkpointing flag of Attention Block is not correctly set up #153

Merged · 1 commit merged on Dec 3, 2024

Conversation

@jhliu17 (Contributor) commented Dec 2, 2024

Checkpointing flag of Attention Block is not correctly set up

The AttentionBlock always used gradient checkpointing because the flag was hardcoded to True. This pull request fixes the issue by respecting the use_checkpoint parameter instead.
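The gist of the change is to use the stored flag rather than an unconditional True. A minimal sketch of the idea, assuming a guided-diffusion-style AttentionBlock with a `_forward` helper and PyTorch's `torch.utils.checkpoint` (the actual class layout and helper names in this repo may differ):

```python
import torch.nn as nn
from torch.utils.checkpoint import checkpoint


class AttentionBlock(nn.Module):
    """Illustrative sketch only; not the exact code from this repository."""

    def __init__(self, channels, use_checkpoint=False):
        super().__init__()
        self.channels = channels
        # Store the constructor argument instead of ignoring it.
        self.use_checkpoint = use_checkpoint

    def forward(self, x):
        # Before the fix: checkpointing was effectively always on,
        # e.g. the flag passed to the checkpoint helper was hardcoded to True.
        # After the fix: honor the use_checkpoint flag set at construction time.
        if self.use_checkpoint:
            return checkpoint(self._forward, x, use_reentrant=False)
        return self._forward(x)

    def _forward(self, x):
        # The actual attention computation would go here; omitted in this sketch.
        return x
```

With this change, checkpointing only trades compute for memory when the caller explicitly requests it via `use_checkpoint=True`.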

Before submitting

  • Did you make sure the title is self-explanatory and the description concisely explains the PR?
  • Did you make sure your PR does only one thing, instead of bundling different changes together?
  • Did you list all the breaking changes introduced by this pull request?
  • Did you test your PR locally with the pytest command?
  • Did you run pre-commit hooks with the pre-commit run -a command?

Did you have fun?

Make sure you had fun coding 🙃

@atong01 (Owner) commented Dec 3, 2024

Thanks for catching this! LGTM.

@atong01 merged commit bb20577 into atong01:main on Dec 3, 2024
35 checks passed