[ENH] Add support for Ollama assistants #376
Conversation
Force-pushed from `1c7ffa6` to `85030be`
Other models will be added in a later step
Force-pushed from `85030be` to `fd5c34b`
Also fix typo in `ragna.assistants.__init__`
ragna/assistants/_ollama.py
Outdated
```python
if "error" in json_data:
    raise RagnaException(json_data["error"])
if not json_data["done"]:
    yield cast(str, json_data["message"]["content"])
```
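The snippet under review parses Ollama's streaming chat responses, where each line is a JSON object carrying a `message.content` fragment and a `done` flag. A minimal, self-contained sketch of that parsing loop is below; the helper name `stream_ollama_content` and the plain `RuntimeError` (standing in for `RagnaException`) are illustrative, not part of the Ragna API.

```python
import json
from typing import Iterator


def stream_ollama_content(lines: Iterator[bytes]) -> Iterator[str]:
    """Yield text fragments from Ollama's newline-delimited JSON stream.

    Assumes each line is a JSON object with an optional "error" key,
    a "done" flag, and a "message" object holding a "content" string.
    """
    for line in lines:
        data = json.loads(line)
        if "error" in data:
            # Surface server-side errors instead of silently dropping them.
            raise RuntimeError(data["error"])
        if not data.get("done", False):
            yield data["message"]["content"]
```

Joining the yielded fragments reconstructs the full assistant reply, which is how a caller would typically consume this generator.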
@nenb This violates "same response schema" part of #425 (comment) 😖
@nenb As follow-up to #376 (comment), I've refactored the logic we merged in #425 a little to make it even more flexible.
This looks fine to me.
I've pulled the branch and confirmed that it works both for OpenAI models and for a Llamafile that I had locally.
Co-authored-by: Philip Meier <[email protected]>
This PR adds support for Ollama assistants.