
chore: update lora notebook + fix order inputs in llama #967

Draft

jfrery wants to merge 3 commits into main from update_lora_notebook
Conversation

jfrery (Collaborator) commented Dec 20, 2024

No description provided.

@cla-bot cla-bot bot added the cla-signed label Dec 20, 2024
@@ -4,6 +4,6 @@ peft==0.12.0
 Jinja2==3.1.4
 matplotlib==3.7.5
 datasets==3.1.0
-accelerate==1.2.0
+accelerate==1.0.1
Collaborator
Which package did the 1.2.0 version conflict with?

Collaborator Author
1.2.0 isn't compatible with Python 3.8. I will probably revert this and add a comment that we don't support Python 3.8 for this use case.
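One way to record that decision directly in the requirements file (a sketch; the exact wording and whether to pin 1.0.1 or revert to 1.2.0 is still being decided in this thread):

```
# accelerate >= 1.2.0 is not compatible with Python 3.8, which this use case
# still supports; keep the older pin until Python 3.8 support is dropped.
accelerate==1.0.1
```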

Collaborator

Yes, please add a comment.

@@ -134,7 +133,7 @@
{
"data": {
"application/vnd.jupyter.widget-view+json": {
"model_id": "9775e413ec264b2eb14ee53dbc381474",
kcelia (Collaborator) commented Dec 20, 2024
Since the PR is red, I suggest some updates:

# Apply LoRA to the model.
# The 'target_modules' parameter can be set to "all-linear" to apply LoRA to all linear modules.
# By default, only the 'c_attn' projection layers are fine-tuned with LoRA.

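To make the suggested comment concrete, here is a hypothetical pure-Python sketch of the selection behavior `target_modules` describes: `"all-linear"` picks every linear layer, while a list of names picks only modules whose suffix matches. The function name and the module map are illustrative; in the notebook this is configured through peft's `LoraConfig`, not implemented by hand.

```python
def select_lora_targets(modules, target_modules):
    """Return the module names LoRA adapters would be attached to.

    modules: dict mapping module name -> module type name (illustrative).
    target_modules: either the string "all-linear" or a list of name suffixes.
    """
    if target_modules == "all-linear":
        return sorted(name for name, kind in modules.items() if kind == "Linear")
    return sorted(name for name in modules
                  if name.rsplit(".", 1)[-1] in target_modules)


# A toy module map standing in for a transformer's named modules.
model_modules = {
    "h.0.attn.c_attn": "Linear",
    "h.0.attn.c_proj": "Linear",
    "h.0.mlp.c_fc": "Linear",
    "ln_f": "LayerNorm",
}

print(select_lora_targets(model_modules, ["c_attn"]))    # only the c_attn projection
print(select_lora_targets(model_modules, "all-linear"))  # every Linear module
```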
Collaborator

You can remove the extra blank lines between the comment and the function:

# Since we've already handled padding and labels, we can use a custom data collator


def data_collator(features):

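For context, a minimal custom collator of the kind the comment describes might look like the sketch below. This is an assumption about the notebook's intent (the real collator likely returns tensors); it only illustrates stacking per-example fields into a batch once padding and labels are already handled.

```python
# Since padding and labels are already handled upstream, a custom data
# collator only needs to gather each field across the batch.
def data_collator(features):
    batch = {}
    for key in features[0]:
        batch[key] = [example[key] for example in features]
    return batch


batch = data_collator([
    {"input_ids": [1, 2, 3], "labels": [1, 2, 3]},
    {"input_ids": [4, 5, 6], "labels": [4, 5, 6]},
])
print(batch["input_ids"])  # [[1, 2, 3], [4, 5, 6]]
```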
@jfrery jfrery force-pushed the update_lora_notebook branch from 56d002c to c0094ae on December 20, 2024 at 13:45