
Load model from local #1553

Open
thangnv02 opened this issue Jan 18, 2025 · 1 comment

Comments


thangnv02 commented Jan 18, 2025

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name = local_model_path,
    max_seq_length = max_seq_length,
    dtype = dtype,
    load_in_4bit = load_in_4bit,
    cache_dir = my_cache_dir,
)

ERROR: OSError: Error no file named pytorch_model.bin, model.safetensors, tf_model.h5, model.ckpt.index or flax_model.msgpack found in directory /content/drive/MyDrive/unsloth/models--unsloth--llama-3.2-1b-instruct-bnb-4bit.

Something is wrong; how can I load the model from my downloaded copy?
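A likely cause (an assumption, not confirmed in this thread): the path points at the Hugging Face cache folder itself (`models--unsloth--llama-3.2-1b-instruct-bnb-4bit`), but the weight files actually live inside a `snapshots/<commit-hash>/` subdirectory of that folder. A minimal sketch of resolving the snapshot directory, using a hypothetical helper name:

```python
from pathlib import Path

def resolve_snapshot_dir(cache_model_dir: str) -> str:
    """Given a Hugging Face cache folder like .../models--org--name,
    return the most recent snapshot subdirectory, which is where the
    actual weight files (e.g. model.safetensors) are stored.
    Hypothetical helper for illustration only."""
    snapshots = Path(cache_model_dir) / "snapshots"
    candidates = sorted(snapshots.iterdir(), key=lambda p: p.stat().st_mtime)
    if not candidates:
        raise FileNotFoundError(f"no snapshots under {snapshots}")
    return str(candidates[-1])
```

The resolved path could then be passed as `model_name` to `FastLanguageModel.from_pretrained`. Alternatively, passing the original repo id (`unsloth/llama-3.2-1b-instruct-bnb-4bit`) as `model_name` together with `cache_dir` pointing at the parent cache folder should let the library resolve the snapshot itself.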

Contributor

danielhanchen commented Jan 19, 2025

@thangnv02 Can you show what's in the local directory? (A screenshot or ls output.)
