Any existing Alpaca models out there? #491
Replies: 2 comments 3 replies
-
The only version I know of that was made with native training is alpaca-native, trained by dep on the SAIL discord. There are some others trained with LoRA going up to 30B, but in my experience they don't follow instructions as well. You can definitely run it with 10GB, but you may need to convert the weights to 4-bit or offload to RAM.
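For the 10GB case, here's a minimal launch sketch. The model folder names are hypothetical, and the flags are from the early-2023 text-generation-webui releases, so check `python server.py --help` against your version:

```shell
# Option 1: load a GPTQ 4-bit quantized checkpoint (fits a 13B model in well under 10 GB)
python server.py --model llama-13b-4bit --wbits 4 --groupsize 128

# Option 2: keep the original weights but cap VRAM use and spill the rest to system RAM
python server.py --model llama-13b --auto-devices --gpu-memory 9
```

Offloading to RAM works but is much slower than keeping everything on the GPU, so 4-bit is usually the better trade on a 10GB card.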
-
Does anyone know of a way to run the Alpaca LoRA 30B models on GPU with text-generation-webui? I'm thinking of this model specifically: https://huggingface.co/elinas/alpaca-30b-lora-int4 (or any other Alpaca 30B int4 models that may come along). Since it's basically just a fine-tuned LLaMA 30B model, does that mean I can load it with the same procedure in text-generation-webui as I would the LLaMA models? I would love to try it myself, but I don't have the hardware to run it locally, and I want to hear whether anybody knows if it's possible before I look into running it in the cloud.
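As far as I know, yes: a GPTQ-quantized Alpaca checkpoint loads the same way as a GPTQ-quantized LLaMA one. A sketch, assuming the early-2023 flag names; the `--groupsize` value may need adjusting to match how that particular checkpoint was quantized:

```shell
# download the model repo into text-generation-webui's models/ directory
git clone https://huggingface.co/elinas/alpaca-30b-lora-int4 models/alpaca-30b-lora-int4

# launch with GPTQ 4-bit loading, exactly as for a LLaMA int4 checkpoint
python server.py --model alpaca-30b-lora-int4 --wbits 4 --groupsize 128
```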
-
I'm struggling to find the latest Alpaca model, the one given additional training on text-davinci-003 output.
I have an RTX 3080 with 10 GB of VRAM and have no idea whether it's even strong enough to run it.
Any help is appreciated.
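A back-of-envelope way to answer the 10 GB question yourself: the weights of an n-billion-parameter model at b bits per parameter take roughly n × 10⁹ × b / 8 bytes, before runtime overhead (activations, KV cache) adds a couple more GB. A quick sketch:

```python
# Rough VRAM estimate for LLaMA-class models: weights only, not counting
# activations or the KV cache, which add a few GB on top.
def weight_gib(params_billion: float, bits: int) -> float:
    """Approximate size of the model weights in GiB at the given precision."""
    return params_billion * 1e9 * bits / 8 / 2**30

for n in (7, 13, 30):
    print(f"{n}B @ 4-bit: {weight_gib(n, 4):.1f} GiB of weights")
```

By this estimate, 7B and 13B at 4-bit fit comfortably on a 10 GB card, while 30B at 4-bit needs roughly 14 GiB for the weights alone, so it won't fit without offloading part of the model to system RAM.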