
tokenizer_config.json is displayed abnormally after full-parameter fine-tuning of llama3.1 #6447

Closed
1 task done
haidixipan1 opened this issue Dec 25, 2024 · 1 comment

Comments


Reminder

  • I have read the README and searched the existing issues.

System Info

(Screenshot of the generated tokenizer_config.json)
As shown in the screenshot, model_max_length is clearly wrong, and the file does not load correctly with an older transformers version.
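For anyone hitting the same thing, a quick way to confirm the bad value is to read the JSON directly. A minimal sketch, assuming a hypothetical output directory saves/llama3.1-8b/full/sft (not taken from this issue):

```python
import json

# Hypothetical output directory written by the fine-tuning run (adjust to your setup).
output_dir = "saves/llama3.1-8b/full/sft"

with open(f"{output_dir}/tokenizer_config.json", encoding="utf-8") as f:
    cfg = json.load(f)

# The stock Llama 3.1 tokenizer_config.json sets model_max_length to 131072;
# a huge placeholder value here is what older transformers versions trip over.
print("model_max_length:", cfg.get("model_max_length"))
```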

Reproduction

Full-parameter fine-tuning of llama3.1 with llama-factory.

Expected behavior

The fine-tuned model's configuration fails to load under an older transformers version, and tokenizer_config is displayed abnormally.

Others

No response

github-actions bot added the pending label on Dec 25, 2024
hiyouga (Owner) commented Dec 25, 2024

Copy the original model's tokenizer files and overwrite those in the new directory.
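A minimal sketch of that workaround in Python, assuming hypothetical local paths for the base model and the fine-tuned output (the file names cover the usual Llama 3.1 tokenizer artifacts):

```python
import shutil
from pathlib import Path

# Hypothetical paths, not taken from this issue: a local copy of the base model
# and the directory produced by full-parameter fine-tuning.
base_model_dir = Path("/models/Meta-Llama-3.1-8B-Instruct")
finetuned_dir = Path("saves/llama3.1-8b/full/sft")

# Overwrite the tokenizer files written during fine-tuning with the originals.
for name in ("tokenizer_config.json", "tokenizer.json", "special_tokens_map.json"):
    src = base_model_dir / name
    if src.exists():
        shutil.copy2(src, finetuned_dir / name)
        print(f"restored {name}")
```

Reloading the directory with AutoTokenizer afterwards should then report the original model_max_length.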

hiyouga closed this as completed on Dec 25, 2024
hiyouga added the solved label and removed the pending label on Dec 25, 2024