Really appreciate the YouTube video on fine-tuning with the Mac M1, and I can run the fine-tuning successfully on my Mac M1.
Now I want to fine-tune with mlx-community/Meta-Llama-3.1-8B-Instruct-4bit, and it fails with the errors below.
Not sure you have had a chance to try it yet.
(mlx-env) ➜ qlora-mlx git:(main) ✗ python scripts/lora.py --model mlx-community/Meta-Llama-3.1-8B-Instruct-4bit --iters 100 --steps-per-eval 10 --val-batches -1 --learning-rate 1e-5 --lora-layers 16 --test
None of PyTorch, TensorFlow >= 2.0, or Flax have been found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
Loading pretrained model
Traceback (most recent call last):
  File "/workspace/YouTube-Blog/LLMs/qlora-mlx/scripts/lora.py", line 336, in <module>
    model, tokenizer, _ = lora_utils.load(args.model, tokenizer_config)
  File "/workspace/YouTube-Blog/LLMs/qlora-mlx/scripts/utils.py", line 149, in load
    model_args = models.ModelArgs.from_dict(config)
  File "/workspace/YouTube-Blog/LLMs/qlora-mlx/scripts/models.py", line 40, in from_dict
    return cls(
  File "<string>", line 14, in __init__
  File "/workspace/YouTube-Blog/LLMs/qlora-mlx/scripts/models.py", line 33, in __post_init__
    raise ValueError(f"rope_scaling must contain keys {required_keys}")
ValueError: rope_scaling must contain keys {'factor', 'type'}
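
For context, the failure appears to come from a mismatch in the rope_scaling block of the model's config.json: scripts/models.py expects the older two-key format, while the Llama 3.1 config ships an extended block with different key names. A rough illustration (key names taken from the Hugging Face config; exact values are approximate and may vary by revision):

# rope_scaling format the script expects (older Llama/Mistral style, illustrative values):
rope_scaling_legacy = {"type": "linear", "factor": 2.0}

# rope_scaling shipped with Meta-Llama-3.1-8B-Instruct (approximate):
rope_scaling_llama31 = {
    "factor": 8.0,
    "low_freq_factor": 1.0,
    "high_freq_factor": 4.0,
    "original_max_position_embeddings": 8192,
    "rope_type": "llama3",  # note "rope_type" rather than "type"
}

Because the Llama 3.1 block has no "type" key, the required_keys check in ModelArgs.__post_init__ raises the ValueError above.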
@pjq @straussbastian Thanks for raising this! It might be tricky since the original code was made for Mistral and Llama 3, but I'll spend some time on it and share my results here.
If anyone makes any headway, any insights would be appreciated :)
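
One possible direction, as a minimal untested sketch: make the validation in scripts/models.py accept the Llama 3.1 naming by treating "rope_type" as an alias for "type". Note this only gets past the config check; if the script also restricts the scaling type to "linear" (as the upstream MLX LoRA example did), the llama3-style frequency scaling would still need to be implemented in the RoPE layer itself.

# Sketch of a more permissive ModelArgs for scripts/models.py (untested; other fields elided)
from dataclasses import dataclass
from typing import Optional

@dataclass
class ModelArgs:
    # ... other config fields elided ...
    rope_scaling: Optional[dict] = None

    def __post_init__(self):
        if self.rope_scaling:
            # Llama 3.1 configs name the field "rope_type"; map it onto the
            # legacy "type" key so the required-keys check can pass.
            if "rope_type" in self.rope_scaling and "type" not in self.rope_scaling:
                self.rope_scaling["type"] = self.rope_scaling["rope_type"]
            required_keys = {"factor", "type"}
            if not all(key in self.rope_scaling for key in required_keys):
                raise ValueError(f"rope_scaling must contain keys {required_keys}")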