- MiniCPM has been incorporated into the latest version of llama.cpp (though I'm not sure exactly how to use it with the non-OpenBMB llama.cpp). Any thoughts on supporting the model for use with text-generation-webui?
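  For what it's worth, once you have a GGUF conversion of the model, it should load like any other llama.cpp model through llama-cpp-python (the backend text-generation-webui uses for GGUF files). A minimal sketch is below; the model filename is hypothetical, and this covers text generation only, not the vision side of MiniCPM-V:

  ```python
  # Minimal sketch: loading a MiniCPM GGUF with llama-cpp-python.
  # The model_path is a hypothetical filename -- point it at your own GGUF conversion.
  from llama_cpp import Llama

  llm = Llama(
      model_path="models/minicpm-2b-q4_k_m.gguf",  # hypothetical GGUF file
      n_ctx=4096,        # context window
      n_gpu_layers=-1,   # offload all layers to GPU if available
  )

  out = llm("Write a haiku about llamas.", max_tokens=64)
  print(out["choices"][0]["text"])
  ```

  If that works standalone, pointing text-generation-webui's llama.cpp loader at the same GGUF file is the natural next step.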
- I have loaded this model, https://huggingface.co/openbmb/MiniCPM-V, with transformers (trust_remote_code enabled) into the webui. However, when I try to generate a response, it gives me this error:

  Any help would be appreciated!
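  Without the traceback it's hard to say what is failing, but a useful first check is to load the model outside the webui and confirm it responds on its own. A rough sketch with transformers is below; the `chat()` call follows the pattern used by the model's remote code, but its exact signature can vary between revisions, and the test image path is hypothetical:

  ```python
  # Sketch: verify openbmb/MiniCPM-V works standalone before debugging the webui.
  # Assumptions: the chat() signature matches the model's remote code; adjust per the model card.
  import torch
  from PIL import Image
  from transformers import AutoModel, AutoTokenizer

  model_id = "openbmb/MiniCPM-V"
  model = AutoModel.from_pretrained(model_id, trust_remote_code=True,
                                    torch_dtype=torch.bfloat16).eval().cuda()
  tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)

  image = Image.open("example.jpg").convert("RGB")  # hypothetical test image
  msgs = [{"role": "user", "content": "Describe this image."}]

  answer, context, _ = model.chat(image=image, msgs=msgs, context=None,
                                  tokenizer=tokenizer, sampling=True, temperature=0.7)
  print(answer)
  ```

  If this runs cleanly, the problem is more likely in how the webui drives the model (it calls the standard `generate()` path rather than the model's custom multimodal entry point) than in the weights themselves.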