
[Bug]: Pythagora cannot start because the LLM API is not reachable. #1081

Open
blap opened this issue Aug 8, 2024 · 1 comment
Labels
bug Something isn't working

Comments


blap commented Aug 8, 2024

Version

Command-line (Python) version

Operating System

Windows 10

What happened?

I edited config.json to use OpenRouter:


// Configuration for the LLM providers that can be used. Pythagora supports
  // OpenAI, Azure, Anthropic and Groq. OpenRouter and local LLMs (such as LM-Studio)
  // also work, you can use "openai" provider to define these.
  "llm": {
    "openai": {
      // Base url to the provider/server, omitting the trailing "chat/completions" part.
      "base_url": "https://openrouter.ai/api/v1",
      "api_key": "key",
      "connect_timeout": 60.0,
      "read_timeout": 10.0
    },



  // Each agent can use a different model or configuration. The default, as before, is GPT4 Turbo
  // for most tasks and GPT3.5 Turbo to generate file descriptions. The agent name here should match
  // the Python class name.
  "agent": {
    "default": {
      "provider": "openai",
      "model": "recursal/eagle-7b"


Running python main.py prints:

[Pythagora] Pythagora cannot start because the LLM API is not reachable.

What did I do wrong?
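For context on the base_url setting: since the config deliberately omits the trailing "chat/completions" part, the client is expected to append it when it makes requests. A minimal sketch of that joining (an illustrative helper, not Pythagora's actual code):

```python
# Illustrative sketch (not Pythagora's actual code): how an OpenAI-style
# client typically derives the full endpoint from the configured base_url.

def chat_completions_url(base_url: str) -> str:
    # The config's base_url omits the trailing "chat/completions" part,
    # so the client appends it, normalizing any trailing slash first.
    return base_url.rstrip("/") + "/chat/completions"

# With the OpenRouter base_url from the config above, requests should
# target https://openrouter.ai/api/v1/chat/completions
print(chat_completions_url("https://openrouter.ai/api/v1"))
```

If the URL printed here matches what the server actually receives, the "not reachable" error is more likely a connectivity, key, or timeout problem than a routing one.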

@blap blap added the bug Something isn't working label Aug 8, 2024

nsrane commented Aug 29, 2024

I'm facing the same problem. When I checked the LM Studio logs, they show requests arriving at the endpoint "openai/v1/chat/completions" instead of "v1/chat/completions". It looks like Pythagora is appending "openai" to the endpoint; that prefix somehow needs to be removed for local LLMs to work.
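The symptom described above can be illustrated with a small sketch (the build_endpoint helper is hypothetical; the actual cause inside Pythagora's client may differ): if the provider name leaks into the request path, a local server such as LM Studio sees a route it does not serve.

```python
# Hypothetical reproduction of the misrouted endpoint described above;
# the helper name and logic are illustrative, not Pythagora's code.

def build_endpoint(base_url: str, provider: str = "") -> str:
    """Join base_url with the chat-completions path, optionally
    (and incorrectly) prefixing the provider name."""
    parts = [base_url.rstrip("/")]
    if provider:  # the bug: the provider name should never appear in the path
        parts.append(provider)
    parts.append("v1/chat/completions")
    return "/".join(parts)

# What LM Studio reportedly receives vs. what it expects:
buggy = build_endpoint("http://localhost:1234", provider="openai")
expected = build_endpoint("http://localhost:1234")
print(buggy)     # http://localhost:1234/openai/v1/chat/completions
print(expected)  # http://localhost:1234/v1/chat/completions
```

A local server that only serves /v1/chat/completions would answer the first URL with a 404, which the client could surface as "the LLM API is not reachable".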
