Copilot Support? #908
-
This documentation suggests LCP can be used with GitHub Copilot if you have an OpenAI-compatible endpoint: https://github.com/abetlen/llama-cpp-python/blob/main/docs/server.md#code-completion. I have a working server endpoint, but I get prompted to sign in, and later it says I don't have a subscription. Does it actually work with Copilot? Do you need to have a subscription for it to work? Does anyone have this working with LCP?
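For context, this is roughly how the linked docs suggest exposing an OpenAI-compatible endpoint with llama-cpp-python; the model path and port here are placeholders, not values from this thread:

```shell
# Install llama-cpp-python with the server extras
pip install 'llama-cpp-python[server]'

# Launch the OpenAI-compatible server on a local model
# (./models/model.gguf is a placeholder path)
python -m llama_cpp.server --model ./models/model.gguf --host 0.0.0.0 --port 8000
```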
Replies: 3 comments 4 replies
-
And you updated your .vscode/settings.json as suggested? This might be an annoying limitation of the plugin I haven't run into, because I also have a Copilot subscription.
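For reference, the settings change the docs describe is along these lines (a sketch; the host and port are placeholders, and the `github.copilot.advanced` debug overrides are the mechanism the llama-cpp-python docs point at, not something verified in this thread):

```json
{
    "github.copilot.advanced": {
        "debug.testOverrideProxyUrl": "http://localhost:8000",
        "debug.overrideProxyUrl": "http://localhost:8000"
    }
}
```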
-
Ah! Ok, so I do indeed need the sub to get past the "you need a sub" wall. I'll close this as answered, ty
-
I suppose this still doesn't work directly as a drop-in replacement for GitHub Copilot, right? I don't see the endpoint getting called at all after following all the steps, and I do not have a Copilot subscription. Continue works, but not quite as well as Copilot. If this is the case, the documentation probably needs an update. Another problem is that llama.cpp has trouble loading replit-code Q4 & Q8 GGUF models; only f16 works for me.