
[Bug]: API error: Error code: 400 - {'error': {'message': "Unknown parameter: 'stream_options' #1032

Open
demaxiya567 opened this issue Jun 23, 2024 · 4 comments
Labels
bug Something isn't working

Comments

@demaxiya567

Version

VisualStudio Code extension

Operating System

Windows 11

What happened?

After updating the extension on 2024-06-22, whenever I use my GPT-4o API key I get the error: "API check for openai failed with: Error connecting to the LLM: API error: Error code: 400 - {'error': {'message': "Unknown parameter: 'stream_options'. (request id: 2024062310172985662975484060090) (request id: 202406231017298283504288geyfPln)", 'type': 'invalid_request_error', 'param': 'stream_options', 'code': 'unknown_parameter'}}". I keep getting this error whether I'm importing a project or creating a new one.
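For context, `stream_options` is a newer parameter of the OpenAI Chat Completions API (used to request token usage on streamed responses); OpenAI-compatible proxy endpoints that have not caught up with the API surface reject it with exactly this 400. One hedged workaround sketch, if you control the client code, is to strip parameters the target endpoint is known to reject before sending the request. The helper and parameter list below are illustrative assumptions, not gpt-pilot's actual code:

```python
# Hypothetical workaround sketch: some OpenAI-compatible proxies reject the
# newer 'stream_options' parameter with a 400 error. Strip such parameters
# from the request kwargs before calling the endpoint. The set below is an
# assumption for illustration, not gpt-pilot's real configuration.

UNSUPPORTED_PARAMS = {"stream_options"}

def strip_unsupported(request_kwargs: dict) -> dict:
    """Return a copy of the request kwargs without parameters the
    target endpoint is known to reject."""
    return {k: v for k, v in request_kwargs.items() if k not in UNSUPPORTED_PARAMS}

kwargs = {
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "hello"}],
    "stream": True,
    "stream_options": {"include_usage": True},
}
cleaned = strip_unsupported(kwargs)
# 'stream_options' is removed; all other parameters are untouched.
```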

@demaxiya567 demaxiya567 added the bug Something isn't working label Jun 23, 2024
@JianYingJia

Hello! Could you tell me how to solve this?

@LeonOstrez
Member

how did you set gpt-4o model? Can you share config.json but remember to remove your API key when sending it here

@demaxiya567
Author

> how did you set gpt-4o model? Can you share config.json but remember to remove your API key when sending it here

Here is my setting:

```json
{
  "llm": {
    "openai": {
      "base_url": "<removed>",
      "api_key": "<removed>",
      "connect_timeout": 60.0,
      "read_timeout": 10.0,
      "extra": null
    }
  },
  "agent": {
    "default": {
      "provider": "openai",
      "model": "gpt-4o",
      "temperature": 0.5
    },
    "CodeMonkey.describe_files": {
      "provider": "openai",
      "model": "deepseek-v2",
      "temperature": 0.0
    },
    "Troubleshooter.get_route_files": {
      "provider": "openai",
      "model": "gpt-4o",
      "temperature": 0.0
    }
  },
  "prompt": {
    "paths": [
      "c:\\software\\gpt-pilot\\core\\prompts"
    ]
  },
  "log": {
    "level": "DEBUG",
    "format": "%(asctime)s %(levelname)s [%(name)s] %(message)s",
    "output": "pythagora.log"
  },
  "db": {
    "url": "sqlite+aiosqlite:///pythagora.db",
    "debug_sql": false
  },
  "ui": {
    "type": "plain"
  },
  "fs": {
    "type": "local",
    "workspace_root": "c:\\software\\gpt-pilot\\workspace",
    "ignore_paths": [
      ".git",
      ".gpt-pilot",
      ".idea",
      ".vscode",
      ".next",
      ".DS_Store",
      "__pycache__",
      "site-packages",
      "node_modules",
      "package-lock.json",
      "venv",
      ".venv",
      "dist",
      "build",
      "target",
      "*.min.js",
      "*.min.css",
      "*.svg",
      "*.csv",
      "*.log",
      "go.sum"
    ],
    "ignore_size_threshold": 50000
  }
}
```
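One thing worth checking in a config like this: Windows paths in JSON must have their backslashes escaped, or the file fails to parse. A quick stdlib check demonstrates the difference (the path below just mirrors the one in the posted config):

```python
import json

# "\s" and "\g" are not legal JSON escape sequences, so a path written
# with single backslashes fails to parse.
bad = r'{"workspace_root": "c:\software\gpt-pilot\workspace"}'
try:
    json.loads(bad)
    parsed = "valid"
except json.JSONDecodeError:
    parsed = "invalid"

# Doubling the backslashes in the JSON text decodes to the intended
# single-backslash Windows path.
good = r'{"workspace_root": "c:\\software\\gpt-pilot\\workspace"}'
path = json.loads(good)["workspace_root"]
# path is now the string c:\software\gpt-pilot\workspace
```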

@quloos

quloos commented Jul 2, 2024

I came across a similar issue with Claude 3.5 Sonnet. Every .html file looks fine to me.

Error connecting to the LLM: API error: Error code: 400 - {'error': {'message': "Unable to render prompt template:
Unexpected end of template. Jinja was looking for the following tags: 'endblock'. The innermost block that needs to be closed is 'block'. (request id: 2024070209463069627827436227569)", 'type': 'invalid_request_error', 'param': '', 'code': 400}}
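That Jinja error means a `{% block %}` tag in one of the prompt templates was opened but never closed with `{% endblock %}`. A rough way to find the offending template is to count open and close tags; the sketch below is an illustrative stdlib check, not how Jinja itself parses templates:

```python
import re

# Hedged sketch: count {% block %} vs {% endblock %} tags. A non-zero
# difference means the template is unbalanced and will trigger Jinja's
# "Unexpected end of template ... looking for 'endblock'" error.
BLOCK_RE = re.compile(r"{%-?\s*block\b")
ENDBLOCK_RE = re.compile(r"{%-?\s*endblock\b")

def unbalanced_blocks(template_source: str) -> int:
    """Return (opens - closes); non-zero means unmatched block tags."""
    return len(BLOCK_RE.findall(template_source)) - len(ENDBLOCK_RE.findall(template_source))

ok = "{% block body %}Hello{% endblock %}"
broken = "{% block body %}Hello"  # missing {% endblock %}
```

Running `unbalanced_blocks` over each file in the prompt paths would narrow down which template is broken.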
