This allows me to quickly test my local models and see how I want to configure them. Once I've confirmed (through the GUI) that everything is how I want it, it would be great if we could expose this as an API. So it's not only a client, but can also operate as a LlamaServer.
Is this something that you'd be willing to consider?
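To illustrate the idea, here's a minimal sketch of what a "server mode" could look like. Everything in it is hypothetical — the `/completion` route, the JSON request shape, and the `generate()` stub (which would forward to the locally configured model) are not part of any existing project:

```python
# Hypothetical sketch: exposing a local-model client as a small HTTP API.
# generate() is a stand-in for a call into the model configured via the GUI.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def generate(prompt: str) -> str:
    # Placeholder: a real implementation would run local inference here.
    return f"echo: {prompt}"

class CompletionHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/completion":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        body = json.loads(self.rfile.read(length))
        payload = json.dumps({"completion": generate(body["prompt"])}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

    def log_message(self, *args):
        pass  # keep the sketch quiet

def run(port: int = 8080):
    # Serve on localhost only, matching the "local server" use case.
    HTTPServer(("127.0.0.1", port), CompletionHandler).serve_forever()
```

A client would then POST `{"prompt": "..."}` to `http://127.0.0.1:8080/completion` and get a JSON completion back, so the app keeps its GUI for configuration but can also be queried programmatically.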
@mkellerman interesting use-case, nice! you mean a local server or one running remotely that you can query? i'm actually working on some new Swift bindings to power LlamaChat v2 that you can find here: https://github.com/CameLLM
@mkellerman sorry for the late reply on this. not something I'm considering actively atm but will track it for the future! might be something that could wrap CameLLM and be released as a separate project.
I love this! Nice work!