
feat: enable OpenAI models #307

Closed · wants to merge 8 commits
Conversation

beejaysea

This PR enables the use of gpt-4o, gpt-4o-mini, o1-preview, and o1-mini.

There are a few new environment variables:

```
LLM_TYPE=(anthropic|openai)   # defaults to anthropic
OPENAI_MODEL=gpt-4o           # one of the models above; required, no default
OPENAI_API_KEY=sk...          # your OpenAI API key
```
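A minimal sketch of the env-driven selection described above. The names (`LLMType`, `getCurrentLLMType`) mirror the functions listed in this PR's commit summary, but the body here is illustrative, not the PR's actual code:

```typescript
// Illustrative sketch; mirrors names from this PR's commit summary,
// not its exact implementation.
export enum LLMType {
  Anthropic = 'anthropic',
  OpenAI = 'openai',
}

export function getCurrentLLMType(
  env: Record<string, string | undefined>,
): LLMType {
  // LLM_TYPE defaults to anthropic when unset.
  const type = (env.LLM_TYPE ?? 'anthropic').toLowerCase();
  if (type === 'openai') {
    // OPENAI_MODEL is required and has no default.
    if (!env.OPENAI_MODEL) {
      throw new Error('OPENAI_MODEL must be set when LLM_TYPE=openai');
    }
    return LLMType.OpenAI;
  }
  return LLMType.Anthropic;
}
```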

.toAIStream() has been deprecated in newer versions of the Vercel AI SDK and did not work with the OpenAI LLM implementation; the code has been updated to use .toDataStream(...) instead.
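Assuming the usual shape of the Vercel AI SDK's `streamText` result, the change in `stream-text.ts` likely amounts to something like (illustrative; the surrounding code may differ):

```diff
 const result = await streamText({ model, messages });
-return result.toAIStream();
+return result.toDataStream();
```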

Notes:

  • I did not test the cloudflare deploy.
  • The OpenAI prompt needs a little work for gpt-4o, but functions well with the o1-* models.

- Added OPENAI_API_KEY to the Env interface
- Added OPENAI_API_KEY to package.json dependencies
- Added @ai-sdk/openai to package.json dependencies
- Added Prompts interface to prompts-interface.ts
- Added Prompts implementation to anthropic-llm.ts and openai-llm.ts
- Added LLMType enum to llm-selector.ts
- Added selectLLM and getCurrentLLMType functions to llm-selector.ts
- Added AnthropicLLM and OpenAILLM classes to anthropic-llm.ts and openai-llm.ts
- Added getModel function to model.ts
- Added streamText function to stream-text.ts
- Updated chatAction function in api.chat.ts to use selectLLM and getCurrentLLMType
Refactor the OpenAI LLM implementation in the `openai-llm.ts` file.
- Update the model selection logic to support both 'gpt-4o' and 'o1-mini' models.
- Add conditional logic to handle different models and their respective prompts and options.
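The per-model conditional handling mentioned above can be sketched as follows. This is a hypothetical helper, not the PR's code; the underlying constraint (at launch, the o1-* models rejected system messages and non-default sampling parameters) is why gpt-4o and o1-* need different prompt/option handling:

```typescript
// Hypothetical helper illustrating per-model option branching.
interface ModelOptions {
  useSystemPrompt: boolean;
  temperature?: number;
}

export function optionsFor(model: string): ModelOptions {
  if (model.startsWith('o1-')) {
    // o1-preview / o1-mini: no system role, fixed sampling settings,
    // so the system prompt must be folded into the user message.
    return { useSystemPrompt: false };
  }
  // gpt-4o / gpt-4o-mini: standard system prompt and sampling options.
  return { useSystemPrompt: true, temperature: 0 };
}
```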
@beejaysea beejaysea changed the title Enable OpenAI models enhancement: Enable OpenAI models Oct 9, 2024
@beejaysea beejaysea changed the title enhancement: Enable OpenAI models feat: Enable OpenAI models Oct 9, 2024
@beejaysea beejaysea changed the title feat: Enable OpenAI models feat: enable OpenAI models Oct 9, 2024
@swcrazyfan

Could you add changing the base url so you can use something like Deepseek?

@beejaysea
Author

> Could you add changing the base url so you can use something like Deepseek?

Sure, one certainly could do that. For my purposes, this is a first pass at getting the initial support functional. I believe the next refactor should include a more robust model selector (possibly via UI) and a way to handle more of the providers that Vercel's AI SDK supports: https://sdk.vercel.ai/providers/ai-sdk-providers. For instance, I'd also like to be able to use Anthropic's models via GCP, OpenAI's models via Azure, or Groq with one of the Llama 3 models and a different system prompt.

@Justmalhar

OpenRouter is much more favorable - https://openrouter.ai/docs/requests

@d3lm
Contributor

d3lm commented Oct 10, 2024

Hey! Thanks for the PR 🙏 I think for this repo we'd like to stick with Anthropic only. But if you want to fork this repo and add more providers then you can absolutely do that!

@d3lm d3lm closed this Oct 10, 2024