feat: enable OpenAI models #307
Conversation
- Added OPENAI_API_KEY to the Env interface
- Added OPENAI_API_KEY to package.json dependencies
- Added @ai-sdk/openai to package.json dependencies
- Added Prompts interface to prompts-interface.ts
- Added Prompts implementation to anthropic-llm.ts and openai-llm.ts
- Added LLMType enum to llm-selector.ts
- Added selectLLM and getCurrentLLMType functions to llm-selector.ts
- Added AnthropicLLM and OpenAILLM classes to anthropic-llm.ts and openai-llm.ts
- Added getModel function to model.ts
- Added streamText function to stream-text.ts
- Updated chatAction function in api.chat.ts to use selectLLM and getCurrentLLMType
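The selector described above might look something like the following sketch. The `Env` fields and function bodies are assumptions inferred from the summary, not the PR's exact code:

```typescript
// Hypothetical sketch of llm-selector.ts; names follow the PR summary,
// bodies are illustrative.
export enum LLMType {
  Anthropic = "anthropic",
  OpenAI = "openai",
}

// Assumed shape of the Env interface mentioned above.
export interface Env {
  LLM_TYPE?: string;
  OPENAI_API_KEY?: string;
  OPENAI_MODEL?: string;
}

// Returns the configured LLM type, defaulting to Anthropic.
export function getCurrentLLMType(env: Env): LLMType {
  return env.LLM_TYPE === "openai" ? LLMType.OpenAI : LLMType.Anthropic;
}

// Picks which implementation module to use based on the configured type.
export function selectLLM(env: Env): string {
  return getCurrentLLMType(env) === LLMType.OpenAI
    ? "openai-llm"
    : "anthropic-llm";
}
```

With this shape, `chatAction` in `api.chat.ts` only needs to call `selectLLM(env)` rather than hard-coding a provider.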
Refactor the OpenAI LLM implementation in the `openai-llm.ts` file.
- Update the model selection logic to support both 'gpt-4o' and 'o1-mini' models.
- Add conditional logic to handle different models and their respective prompts and options.
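The per-model conditional might be sketched as below. This is illustrative rather than the PR's actual code, and assumes the o1-series restriction that applied at the time (o1 models rejected a separate system prompt):

```typescript
// Hypothetical sketch of the branching in openai-llm.ts; the field names
// and token limit are assumptions, not values from the PR.
interface ModelOptions {
  system?: string;   // system prompt, when the model supports one
  maxTokens: number;
}

export function buildOptions(model: string, systemPrompt: string): ModelOptions {
  // o1-preview and o1-mini did not accept system messages at launch,
  // so the system prompt must be omitted (or folded into the user message).
  if (model === "o1-preview" || model === "o1-mini") {
    return { maxTokens: 8192 };
  }
  // gpt-4o / gpt-4o-mini take the system prompt as usual.
  return { system: systemPrompt, maxTokens: 8192 };
}
```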
Could you add support for changing the base URL, so you can use something like DeepSeek?
Sure, one certainly could do that. For my purposes, this is a first stab at getting the initial support functional. I believe the next refactor should include a more robust model selector (possibly via UI), and a way of handling more of the providers that Vercel's AI SDK supports: https://sdk.vercel.ai/providers/ai-sdk-providers For instance, I'd also like to be able to use Anthropic's models via GCP, or OpenAI's models via Azure, or Groq with one of the Llama3 models and a different system prompt...
OpenRouter is much more favorable - https://openrouter.ai/docs/requests
Hey! Thanks for the PR 🙏 I think for this repo we'd like to stick with Anthropic only. But if you want to fork this repo and add more providers then you can absolutely do that!
This PR enables the use of gpt-4o, gpt-4o-mini, o1-preview, and o1-mini.
There are a few new environment variables:

```
LLM_TYPE=(anthropic|openai) // defaults to anthropic
OPENAI_MODEL=gpt-4o         // one of the models above; must be present, no default
OPENAI_API_KEY=sk...        // OpenAI key here
```
`.toAIStream()` has been deprecated in newer versions of the Vercel AI SDK, and did not work with the OpenAI LLM implementation. This has been updated to use `.toDataStream(...)`.
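The migration amounts to swapping the method on the `streamText` result. The sketch below uses a mock in place of a real model call, since the actual handler would need an API key; the response-construction shape is an assumption based on the description above:

```typescript
// Hedged sketch of the stream-text.ts change. In the real code the result
// comes from the AI SDK, roughly:
//   const result = await streamText({ model: getModel(env), messages });
// Before (deprecated): result.toAIStream()
// After:               result.toDataStream()

type StreamResult = {
  toDataStream: () => ReadableStream<Uint8Array>;
};

// Wraps the data stream in a Response, as a Remix/Workers-style handler might.
function respondWithStream(result: StreamResult): Response {
  return new Response(result.toDataStream(), {
    headers: { "Content-Type": "text/plain; charset=utf-8" },
  });
}
```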
Notes: