
Local LLM pandasai.json #1488

Open
ahadda5 opened this issue Dec 19, 2024 · 7 comments
Labels
bug Something isn't working

Comments


ahadda5 commented Dec 19, 2024

System Info

"name": "pandasai-all",
"version": "1.0.0",
MacOS (15.1.1)

The code is run directly via `poetry run`, not in Docker.

🐛 Describe the bug

I'm trying to use a local LLM, but it keeps defaulting to BambooLLM.
Here is how the pandasai.json at the root directory looks.

Regardless of what "llm" value I use, it defaults to BambooLLM!

```json
{
  "llm": "LLM",
  "llm_options": {
    "model": "Llama-3.3-70B-Instruct",
    "api_url": "http://localhost:9000/v1"
  }
}
```

This is the supported list

```python
__all__ = [
    "LLM",
    "BambooLLM",
    "AzureOpenAI",
    "OpenAI",
    "GooglePalm",
    "GoogleVertexAI",
    "GoogleGemini",
    "HuggingFaceTextGen",
    "LangchainLLM",
    "BedrockClaude",
    "IBMwatsonx",
]
```
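For reference, the name-to-class hookup can be sketched with stubs. This is a minimal sketch, not the real pandasai code: the `llm` namespace and the `LLM` class below are stand-ins, mimicking how config.py looks the "llm" string up as an attribute of the llm module and instantiates it with `llm_options`.

```python
import types

# Stand-in for the pandasai.llm module; the real one exports the names
# in __all__ above. This stub LLM class is illustrative only.
llm = types.SimpleNamespace()

class LLM:
    def __init__(self, **options):
        self.options = options

llm.LLM = LLM

config = {
    "llm": "LLM",
    "llm_options": {
        "model": "Llama-3.3-70B-Instruct",
        "api_url": "http://localhost:9000/v1",
    },
}

# Mirrors the resolution step: look the name up on the module, then
# instantiate it with the options from pandasai.json
options = config.get("llm_options") or {}
instance = getattr(llm, config["llm"])(**options)
assert type(instance).__name__ == "LLM"
assert instance.options["api_url"] == "http://localhost:9000/v1"
```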
dosubot bot commented Dec 19, 2024

Hey there, @ahadda5! 👋 I'm here to help you with any bugs, questions, and even tips to become a contributor. I'm Dosu, a bot designed for just that!

The issue you're experiencing, where the local LLM defaults to BambooLLM regardless of the "llm" variable specified in pandasai.json, is due to the configuration loading logic in pandasai/config.py. This logic sets BambooLLM as the default when no specific LLM is provided in the configuration file or through an override. If the pandasai.json file does not specify an "llm" or if the file is not found, the code explicitly assigns BambooLLM as the default LLM [1].

To resolve this, ensure that the "llm" key in your pandasai.json file matches exactly with one of the supported LLM names listed in the code. In your case, you should verify that "LLM" is correctly implemented and recognized in the codebase. If "LLM" is not a valid or recognized class, it will default to BambooLLM. You might need to use one of the other supported LLM names, such as "AzureOpenAI" or "OpenAI", if "LLM" is not properly defined.


@dosubot dosubot bot added the bug Something isn't working label Dec 19, 2024

ahadda5 commented Dec 19, 2024

@dosu Yes, "LLM" is in the pandasai.json and in the supported list.
Talk to me about the override_config in the file you mentioned, pandasai/config.py. It seems the bug is that it's defaulting to llm.BambooLLM regardless of what the "llm" variable is.

dosubot bot commented Dec 19, 2024

The override_config in pandasai/config.py is used to provide a mechanism to override the default configuration loaded from the pandasai.json file. If the "llm" variable is not specified in either the pandasai.json file or the override_config, the code defaults to using BambooLLM. This is evident from the following part of the code:

```python
if config.get("llm") and not override_config.get("llm"):
    options = config.get("llm_options") or {}
    config["llm"] = getattr(llm, config["llm"])(**options)
elif not config.get("llm") and not override_config.get("llm"):
    config["llm"] = llm.BambooLLM()
```

This logic ensures that if neither the configuration file nor the override configuration specifies an "llm", the system defaults to BambooLLM [1].

If your pandasai.json correctly specifies "LLM" and it's still defaulting to BambooLLM, you might want to check if there's an issue with how the configuration is being loaded or if the override_config is unintentionally set to an empty or incorrect value.
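A minimal, self-contained reproduction of that branching (stub classes and a dict registry stand in for the real pandasai classes and `getattr(llm, name)`) shows when the BambooLLM fallback fires:

```python
# Stub classes standing in for the real pandasai LLM classes
class LLM:
    def __init__(self, **options):
        self.options = options

class BambooLLM(LLM):
    pass

REGISTRY = {"LLM": LLM, "BambooLLM": BambooLLM}  # stands in for getattr(llm, name)

def resolve_llm(config, override_config):
    if config.get("llm") and not override_config.get("llm"):
        options = config.get("llm_options") or {}
        return REGISTRY[config["llm"]](**options)
    elif not config.get("llm") and not override_config.get("llm"):
        return BambooLLM()
    return override_config["llm"]

# A config that names "LLM" resolves to LLM, not BambooLLM ...
assert type(resolve_llm({"llm": "LLM"}, {})).__name__ == "LLM"
# ... but an empty config (e.g. pandasai.json not found) falls back to BambooLLM
assert isinstance(resolve_llm({}, {}), BambooLLM)
```

This is consistent with the later finding in the thread that the file was never found: an unreadable pandasai.json yields an empty config, which takes the second branch.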



ahadda5 commented Dec 19, 2024

It's a bug. Here are the highlights:

  • pandasai/llm/__init__.py needs to export LocalLLM
  • pandasai/schemas/df_config.py needs to check not isinstance(llm, (LLM, LocalLLM, LangchainLLM)) to avoid replacing it with BambooLLM
  • Also, I'm not sure, but llm_options doesn't seem to be passed downstream
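A hedged sketch of the second bullet, using illustrative stand-in classes (in particular, whether LocalLLM subclasses the base LLM is an assumption here, not confirmed from the pandasai source):

```python
# Stub classes; in pandasai these would come from pandasai.llm
class LLM: ...
class LocalLLM(LLM): ...      # assumption: LocalLLM derives from the base LLM
class LangchainLLM: ...       # wrapper type that may not subclass LLM
class BambooLLM(LLM): ...

def coerce_llm(llm_obj):
    # Broadened check: fall back to BambooLLM only when the object is
    # none of the accepted types
    if llm_obj is None or not isinstance(llm_obj, (LLM, LocalLLM, LangchainLLM)):
        return BambooLLM()
    return llm_obj

local = LocalLLM()
assert coerce_llm(local) is local               # kept, not silently replaced
assert isinstance(coerce_llm(None), BambooLLM)  # missing llm still defaults
```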

@gdcsinaptik
Collaborator

Hey @ahadda5, are you using the latest version of PandasAI (2.4.1)?


ahadda5 commented Dec 19, 2024

I pulled the main branch yesterday; not sure what version.


ahadda5 commented Dec 20, 2024

So my pandasai.json was not found, which definitely didn't help. The working directory was ./server, so the pandasai.json at the root level couldn't be found.
Also, I changed the config.py file to return LocalLLM, and I can now connect.
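The working-directory failure mode is easy to reproduce with a temp-dir stand-in for the repo layout (the paths below are illustrative, not pandasai's actual lookup code); anchoring the config path to the project root instead of the current working directory avoids it:

```python
import json
import os
import tempfile

# Recreate the layout from the comment above: pandasai.json at the repo
# root, process running from ./server (temp dirs stand in for the repo)
root = tempfile.mkdtemp()
with open(os.path.join(root, "pandasai.json"), "w") as f:
    json.dump({"llm": "LLM"}, f)

server_dir = os.path.join(root, "server")
os.makedirs(server_dir)
old_cwd = os.getcwd()
os.chdir(server_dir)

# A relative lookup from ./server misses the root-level config ...
assert not os.path.exists("pandasai.json")

# ... while a path anchored to the project root finds it regardless of CWD
with open(os.path.join(root, "pandasai.json")) as f:
    config = json.load(f)
assert config["llm"] == "LLM"

os.chdir(old_cwd)  # restore the working directory
```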
