Using llama3 via Ollama within FunctionCallingAgentWorker #13431
-
Hello, I want to use a local LLM within an agent worker without using any APIs. Here is the code I'm trying to get working:

```python
from llama_index.core.agent import FunctionCallingAgentWorker

llm = Ollama(model="llama2")
agent_worker = FunctionCallingAgentWorker.from_tools(...)
agent = AgentRunner(agent_worker)
```

It fails with:

```
Cell In[107], line 4
File ~/miniconda3/envs/DL/lib/python3.10/site-packages/llama_index/core/agent/function_calling/step.py:125, in FunctionCallingAgentWorker.from_tools(cls, tools, tool_retriever, llm, verbose, max_function_calls, callback_manager, system_prompt, prefix_messages, **kwargs)
...
ValueError: Model name llama2 does not support function calling API.
```
-
The error you're encountering, "Model name llama2 does not support function calling API," indicates that the LLM you passed in is not flagged as supporting a native function calling API. To resolve this issue, you need a model that is explicitly marked to support function calling. This typically involves ensuring the model's metadata has the `is_function_calling_model` attribute set to `True`. If you're working directly with the Ollama LLM class, you can inspect its metadata to see how the model is flagged, or switch to an agent that does not depend on a native function calling API.
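As a quick sanity check, you can inspect the flag that the agent worker looks at before it builds the agent. This is only a minimal sketch: the model name and `request_timeout` value are placeholders, and it assumes the `llama-index-llms-ollama` integration is installed.

```python
from llama_index.llms.ollama import Ollama

# Placeholder model name and timeout; adjust to whatever you have pulled locally.
llm = Ollama(model="llama3", request_timeout=120.0)

# FunctionCallingAgentWorker.from_tools refuses to build an agent when this
# metadata flag is False, which is what produces the ValueError above.
print(llm.metadata.is_function_calling_model)
```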
-
@AashiDutt you should use ReActAgentWorker -- the function calling agent worker is only for LLMs with a specific API built in for tool calling (this includes OpenAI, Anthropic, and Mistral).
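For example, here is a minimal sketch of the ReAct path with a local Ollama model. The `multiply` tool, model name, and timeout are placeholders, and it assumes the `llama-index-llms-ollama` package is installed:

```python
from llama_index.core.agent import AgentRunner, ReActAgentWorker
from llama_index.core.tools import FunctionTool
from llama_index.llms.ollama import Ollama


def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the result."""
    return a * b


multiply_tool = FunctionTool.from_defaults(fn=multiply)

llm = Ollama(model="llama3", request_timeout=120.0)

# The ReAct worker drives tool use through prompting, so it does not require
# a model with a native function calling API.
agent_worker = ReActAgentWorker.from_tools([multiply_tool], llm=llm, verbose=True)
agent = AgentRunner(agent_worker)

response = agent.chat("What is 7 multiplied by 6?")
print(response)
```

If you prefer a single object, `ReActAgent.from_tools(...)` wires up the same worker and runner in one call.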