Disable code_interpreter in tool-calling agent #407
Comments
I've been able to reproduce: https://github.com/aidando73/llama-stack-apps/pull/3/files#diff-1ebfaf6cb3592166b73835fa82333cb7109e7c624865c0039a7b22ff34aa27fa

Traceback (most recent call last):
  File "/Users/aidand/dev/llama-stack/llama_stack/distribution/server/server.py", line 158, in sse_generator
    async for item in event_gen:
  File "/Users/aidand/dev/llama-stack/llama_stack/providers/inline/agents/meta_reference/agents.py", line 153, in _create_agent_turn_streaming
    async for event in agent.create_and_execute_turn(request):
  File "/Users/aidand/dev/llama-stack/llama_stack/providers/inline/agents/meta_reference/agent_instance.py", line 179, in create_and_execute_turn
    async for chunk in self.run(
  File "/Users/aidand/dev/llama-stack/llama_stack/providers/inline/agents/meta_reference/agent_instance.py", line 250, in run
    async for res in self._run(
  File "/Users/aidand/dev/llama-stack/llama_stack/providers/inline/agents/meta_reference/agent_instance.py", line 568, in _run
    result_messages = await execute_tool_call_maybe(
  File "/Users/aidand/dev/llama-stack/llama_stack/providers/inline/agents/meta_reference/agent_instance.py", line 833, in execute_tool_call_maybe
    assert name in tools_dict, f"Tool {name} not found"
AssertionError: Tool code_interpreter not found
[INFO] role='assistant' content='' stop_reason=<StopReason.end_of_turn: 'end_of_turn'> tool_calls=[ToolCall(call_id='effb0bb7-ebb2-4baf-8a3a-941c99dc0cca', tool_name=<BuiltinTool.code_interpreter: 'code_interpreter'>, arguments={'code': 'import os\nfrom llama_index import LLaMAIndex\nfrom llama_index.finer_tuning import fine_tune_orma\n\n\n# Initialize the LLaMA index\nllama_index = LLaMAIndex()\n\n\ndef main():\n # Load the pre-trained Llama model\n model_name = "large"\n model_path = os.path.join(llama_index.model_dir, f"{model_name}.pth")\n\n # Fine-tune the loaded model on a custom dataset\n fine_tune_orma(\n model_path=model_path,\n train_data_path="path/to/custom/train/data",\n eval_data_path="path/to/custom/eval/data",\n batch_size=32,\n num_epochs=5,\n )\n\n\nif __name__ == "__main__":\n main()'})]

Let me look into it.
tl;dr: The model is hallucinating and we don't check whether the client passed in the `code_interpreter` tool.

The inference request the model gets is correct:

tools=[ToolDefinition(tool_name=<BuiltinTool.brave_search: 'brave_search'>, description=None, parameters=None)]

It doesn't include `code_interpreter`. But the raw message I get back is:

<|python_tag|>import os
from llama_index import llama_recipes
from llama_index.finetuning import main

def fine_tune_lamaguard():
    # Define the path to the finetuning configuration file
    ...

The relevant code path is llama_stack/providers/utils/inference/openai_compat.py, lines 230 to 240 (at commit cde9bc1).
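To illustrate the failure mode, here is a simplified sketch (not the actual llama-stack source; the function and structure below are illustrative assumptions): a completion that begins with the `<|python_tag|>` special token gets treated as a builtin `code_interpreter` call, even though the client never enabled that tool.

```python
# Illustrative sketch only -- NOT the actual llama-stack decoding code.
# Shows how a raw completion prefixed with <|python_tag|> could be turned
# into a code_interpreter tool call regardless of which tools the client enabled.
PYTHON_TAG = "<|python_tag|>"


def decode_assistant_message(raw: str) -> dict:
    """Decode a raw model completion into either plain text or a tool call."""
    if raw.startswith(PYTHON_TAG):
        # Everything after the tag is treated as code for the builtin
        # code_interpreter tool -- nothing here checks whether the client
        # actually registered that tool.
        code = raw[len(PYTHON_TAG):]
        return {"tool_name": "code_interpreter", "arguments": {"code": code}}
    return {"content": raw}


raw_message = "<|python_tag|>import os\nfrom llama_index import llama_recipes\n..."
print(decode_assistant_message(raw_message))
# -> {'tool_name': 'code_interpreter', 'arguments': {'code': 'import os\n...'}}
```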
I've submitted a PR so the agent doesn't call the tool if we haven't enabled it: #637

As for your example @subramen - out of curiosity, your user prompt looks like it contains additional material (some assistant responses as well?). What's stopping you from doing something like this?
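A minimal sketch of that kind of simplified call, assuming the `Agent` / `EventLogger` client API used in the linked llama-stack-apps example (the session name and question below are placeholders and may differ by client version):

```python
# Hypothetical sketch: assumes an `agent` built as in the linked client.py example.
# Send only the user's question as a single user message, without embedding
# earlier assistant responses in the prompt.
from llama_stack_client.lib.agents.event_logger import EventLogger

session_id = agent.create_session("search-only-session")

response = agent.create_turn(
    messages=[{"role": "user", "content": "What are the latest Llama model releases?"}],
    session_id=session_id,
)

for log in EventLogger().log(response):
    log.print()
```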
Which doesn't cause the `code_interpreter` tool call.

wdyt?
System Info
..
Information
🐛 Describe the bug
I am setting up a search agent exactly as shown here: https://github.com/meta-llama/llama-stack-apps/blob/7c92eb274924b38b110ca1759dd487817980e5af/examples/agents/client.py#L38
Despite no instructions to write or execute code, the agent automatically invokes code_interpreter and errors out with:

AssertionError: Tool code_interpreter not found

This appears to happen when the assistant response contains any code. How do I explicitly disable code_interpreter?
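For reference, a minimal sketch of that kind of search-only setup, loosely modeled on the linked client.py (the base URL, model name, and tool fields below are placeholders and may differ by client version):

```python
# Hypothetical sketch of the search-only agent setup (placeholders throughout;
# loosely modeled on the linked examples/agents/client.py).
from llama_stack_client import LlamaStackClient
from llama_stack_client.lib.agents.agent import Agent
from llama_stack_client.types.agent_create_params import AgentConfig

client = LlamaStackClient(base_url="http://localhost:5000")

agent_config = AgentConfig(
    model="Llama3.1-8B-Instruct",
    instructions="You are a helpful assistant. Use search when needed.",
    # Only the search tool is configured -- code_interpreter is never enabled,
    # yet the model can still emit a code_interpreter call (the bug above).
    tools=[{"type": "brave_search", "engine": "brave", "api_key": "<BRAVE_API_KEY>"}],
    tool_choice="auto",
    enable_session_persistence=False,
)

agent = Agent(client, agent_config)
```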
Error logs
Expected behavior
don't call code_interpreter, just use search.