Local LLM pandasai.json #1488
Hey there, @ahadda5! 👋 I'm here to help you with any bugs, questions, and even tips to become a contributor. I'm Dosu, a bot designed for just that!

The issue you're experiencing is that the local LLM defaults to BambooLLM. To resolve this, ensure that the "llm" key in your pandasai.json is set correctly.
@dosu yes, the LLM is found in the pandasai.json.
The relevant resolution logic is:

```python
if config.get("llm") and not override_config.get("llm"):
    options = config.get("llm_options") or {}
    config["llm"] = getattr(llm, config["llm"])(**options)
elif not config.get("llm") and not override_config.get("llm"):
    config["llm"] = llm.BambooLLM()
```

This logic ensures that if neither the configuration file nor the override configuration specifies an "llm", the system defaults to BambooLLM. If your pandasai.json does specify an "llm", the named class should be instantiated instead.
It's a bug. Here are the highlights:
Hey @ahadda5, are you using the latest version of PandasAI (2.4.1)?
I pulled the main branch yesterday; not sure what version.
So my pandasai.json was not found, which definitely didn't help. The working directory was ./server, so it couldn't find the pandasai.json at the root level.
System Info
```json
"name": "pandasai-all",
"version": "1.0.0",
```

macOS 15.1.1

The code is run directly via `poetry run`, not in Docker.
🐛 Describe the bug
I'm trying to use a local LLM, but it keeps defaulting to BambooLLM.
Here is how the pandasai.json at the root directory looks:
Regardless of what "llm" value I use, it defaults to BambooLLM!
This is the supported list:
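For reference, a hedged sketch of what a working pandasai.json might look like. The class name `LocalLLM` and the option keys below are assumptions based on pandasai 2.x, not values taken from this thread:

```json
{
  "llm": "LocalLLM",
  "llm_options": {
    "api_base": "http://localhost:11434/v1",
    "model": "llama3"
  }
}
```

Whatever string is used, "llm" must match a class name exported by the `pandasai.llm` module, since the resolution code calls `getattr(llm, config["llm"])` on it.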