Update documentation to explain the use of OpenAI API compatible models using OpenRouter #1190

Open
srdas opened this issue Jan 6, 2025 · 0 comments
Labels
documentation Improvements or additions to documentation

srdas (Collaborator) commented Jan 6, 2025

For models that are compatible with the OpenAI API, Jupyter AI provides configuration via OpenRouter.
Because parameters such as `api_key`, `base_url`, and `model` are configurable, any large-model service that exposes an OpenAI-compatible endpoint can be used.

While the functionality exists, it is not covered in the documentation, which should be updated.
Suggested location in the docs: https://jupyter-ai.readthedocs.io/en/latest/users/index.html#model-providers

For example, DeepSeek can be used via OpenRouter as shown below. Specify the base URL and model, and enter the OpenRouter API key in the API key field:
[screenshot: Jupyter AI settings panel with the OpenRouter base URL, model, and API key filled in]
On testing, we see:
[screenshot: successful chat response from the configured model]
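
For reference, the same three parameters drive a plain call with the official `openai` Python client; a minimal sketch is below (the DeepSeek model id and the prompt are assumptions for illustration, and the key placeholder stands in for a real OpenRouter API key):

```python
# Minimal sketch: calling an OpenAI-compatible model through OpenRouter.
# The model id and prompt are illustrative; any model served by OpenRouter works.
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",  # OpenRouter's OpenAI-compatible endpoint
    api_key="<OPENROUTER_API_KEY>",           # same key entered in the Jupyter AI settings
)

response = client.chat.completions.create(
    model="deepseek/deepseek-chat",           # assumed model id for DeepSeek on OpenRouter
    messages=[{"role": "user", "content": "Say hello"}],
)
print(response.choices[0].message.content)
```

Jupyter AI exposes these same fields (base URL, model, and API key) in its chat settings, so the model-providers section only needs to document where each value goes.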

srdas added the documentation label on Jan 6, 2025