"Incompatible Model Error" or "JSON Error" #981

Open
sackosindou opened this issue Nov 11, 2024 · 3 comments

Comments

@sackosindou

Hello, I'm encountering an issue with installing and running the GPT Researcher application. After following the installation steps, the application returns several errors when I try to initiate a search.

Steps followed to install the application:
(screenshot: FollowedInstallationSteps)

Empty results after running research in the application:
(screenshot: ResearchWithoutOuput)

Summary of the Errors Encountered:

Incompatible Model Error: I receive a message indicating that my API key does not have access to the gpt-4o-2024-08-06 model. It seems that the required model is not activated for my account, even though I have an active subscription on https://chatgpt.com/.

JSON Format Error: The message "Error in reading JSON, attempting to repair JSON" appears, suggesting an issue with the JSON configuration file. It seems the application is unable to read or load the default.json file correctly.

NoneType Error in JSON Parsing: A TypeError: expected string or bytes-like object, got 'NoneType' error occurs, indicating that the expected response is empty. This could be related to a missing API response or error handling issue in the code.
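These three errors appear to be one chain: the 403 means the chat completion call returns nothing, so the JSON repair and regex steps downstream receive None. A minimal sketch of the failing step (based on the function named in the traceback below, not the repo's exact code):

```python
import re

def extract_json_with_regex(response):
    # Sketch of the function named in the traceback (agent_creator.py).
    # re.search() only accepts str or bytes, so when the upstream API call
    # fails with a 403 and `response` is None, this raises:
    #   TypeError: expected string or bytes-like object, got 'NoneType'
    json_match = re.search(r"{.*?}", response, re.DOTALL)
    return json_match.group(0) if json_match else None

extract_json_with_regex(None)  # reproduces the TypeError
```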

Here is the console output with the errors:

```
INFO: connection open
Warning: Configuration not found at 'default'. Using default configuration. Do you mean 'default.json'?
⚠️ Error in reading JSON, attempting to repair JSON
Error using json_repair: the JSON object must be str, bytes or bytearray, not NoneType
ERROR: Exception in ASGI application
Traceback (most recent call last):
  File "C:\Users\Admin\gpt-researcher\gpt_researcher\actions\agent_creator.py", line 27, in choose_agent
    response = await create_chat_completion(
    ...<9 lines>...
    )
  File "C:\Users\Admin\gpt-researcher\gpt_researcher\utils\llm.py", line 60, in create_chat_completion
    response = await provider.get_chat_response(
        messages, stream, websocket
    )
  File "C:\Users\Admin\gpt-researcher\gpt_researcher\llm_provider\generic\base.py", line 116, in get_chat_response
    output = await self.llm.ainvoke(messages)
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\langchain_core\language_models\chat_models.py", line 307, in ainvoke
    llm_result = await self.agenerate_prompt(
    ...<8 lines>...
    )
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\langchain_core\language_models\chat_models.py", line 796, in agenerate_prompt
    return await self.agenerate(
        prompt_messages, stop=stop, callbacks=callbacks, **kwargs
    )
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\langchain_core\language_models\chat_models.py", line 756, in agenerate
    raise exceptions[0]
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\langchain_core\language_models\chat_models.py", line 924, in _agenerate_with_cache
    result = await self._agenerate(
        messages, stop=stop, run_manager=run_manager, **kwargs
    )
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\langchain_openai\chat_models\base.py", line 860, in _agenerate
    response = await self.async_client.create(**payload)
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\openai\resources\chat\completions.py", line 1661, in create
    return await self._post(
    ...<41 lines>...
    )
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\openai\_base_client.py", line 1839, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\openai\_base_client.py", line 1533, in request
    return await self._request(
    ...<5 lines>...
    )
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\openai\_base_client.py", line 1634, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.PermissionDeniedError: Error code: 403 - {'error': {'message': 'Project `proj_2VxSRsTQaqjx5PDs2D0LEpin` does not have access to model `gpt-4o-2024-08-06`', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\uvicorn\protocols\websockets\websockets_impl.py", line 242, in run_asgi
    result = await self.app(self.scope, self.asgi_receive, self.asgi_send)  # type: ignore[func-returns-value]
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\uvicorn\middleware\proxy_headers.py", line 60, in __call__
    return await self.app(scope, receive, send)
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\fastapi\applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\starlette\applications.py", line 113, in __call__
    await self.middleware_stack(scope, receive, send)
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\starlette\middleware\errors.py", line 152, in __call__
    await self.app(scope, receive, send)
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\starlette\middleware\cors.py", line 77, in __call__
    await self.app(scope, receive, send)
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\starlette\middleware\exceptions.py", line 62, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\starlette\_exception_handler.py", line 53, in wrapped_app
    raise exc
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\starlette\_exception_handler.py", line 42, in wrapped_app
    await app(scope, receive, sender)
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\starlette\routing.py", line 715, in __call__
    await self.middleware_stack(scope, receive, send)
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\starlette\routing.py", line 735, in app
    await route.handle(scope, receive, send)
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\starlette\routing.py", line 362, in handle
    await self.app(scope, receive, send)
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\starlette\routing.py", line 95, in app
    await wrap_app_handling_exceptions(app, session)(scope, receive, send)
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\starlette\_exception_handler.py", line 53, in wrapped_app
    raise exc
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\starlette\_exception_handler.py", line 42, in wrapped_app
    await app(scope, receive, sender)
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\starlette\routing.py", line 93, in app
    await func(session)
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\fastapi\routing.py", line 383, in app
    await dependant.call(**solved_result.values)
  File "C:\Users\Admin\gpt-researcher\backend\server\server.py", line 110, in websocket_endpoint
    await handle_websocket_communication(websocket, manager)
  File "C:\Users\Admin\gpt-researcher\backend\server\server_utils.py", line 121, in handle_websocket_communication
    await handle_start_command(websocket, data, manager)
  File "C:\Users\Admin\gpt-researcher\backend\server\server_utils.py", line 28, in handle_start_command
    report = await manager.start_streaming(
        task, report_type, report_source, source_urls, tone, websocket, headers
    )
  File "C:\Users\Admin\gpt-researcher\backend\server\websocket_manager.py", line 66, in start_streaming
    report = await run_agent(task, report_type, report_source, source_urls, tone, websocket, headers=headers, config_path=config_path)
  File "C:\Users\Admin\gpt-researcher\backend\server\websocket_manager.py", line 108, in run_agent
    report = await researcher.run()
  File "C:\Users\Admin\gpt-researcher\backend\report_type\basic_report\basic_report.py", line 41, in run
    await researcher.conduct_research()
  File "C:\Users\Admin\gpt-researcher\gpt_researcher\agent.py", line 88, in conduct_research
    self.agent, self.role = await choose_agent(
    ...<5 lines>...
    )
  File "C:\Users\Admin\gpt-researcher\gpt_researcher\actions\agent_creator.py", line 44, in choose_agent
    return await handle_json_error(response)
  File "C:\Users\Admin\gpt-researcher\gpt_researcher\actions\agent_creator.py", line 55, in handle_json_error
    json_string = extract_json_with_regex(response)
  File "C:\Users\Admin\gpt-researcher\gpt_researcher\actions\agent_creator.py", line 71, in extract_json_with_regex
    json_match = re.search(r"{.*?}", response, re.DOTALL)
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\re\__init__.py", line 177, in search
    return _compile(pattern, flags).search(string)
TypeError: expected string or bytes-like object, got 'NoneType'
INFO: connection closed
INFO: 127.0.0.1:64127 - "GET / HTTP/1.1" 200 OK
```

@assafelovic
Owner

@sackosindou it seems like your OpenAI account does not have permission to use the proposed model, or it is not configured correctly. Did you export the OpenAI API key? Also, try changing the model to gpt-4o.
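As a quick sanity check (a sketch, assuming the project reads the key from a .env file via python-dotenv), you can confirm the key is actually visible to the Python process:

```python
import os

from dotenv import load_dotenv  # pip install python-dotenv

load_dotenv()  # reads .env from the current working directory
print("OPENAI_API_KEY set:", bool(os.getenv("OPENAI_API_KEY")))
```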

@sackosindou
Author

1 - Exporting the OpenAI API key

I had already exported the OpenAI API key in the .env file.

Location of the .env file:
(screenshot: FileEnvLocation)

Content of the .env file:
(screenshot: FileEnvContent)

2 - Changing the model to gpt-4o

I have just updated the model here:
(screenshot: UpdatedModel)

3 - Errors after changing the model

```
INFO: connection open
Warning: Configuration not found at 'default'. Using default configuration. Do you mean 'default.json'?
⚠️ Error in reading JSON, attempting to repair JSON
Error using json_repair: the JSON object must be str, bytes or bytearray, not NoneType
ERROR: Exception in ASGI application
Traceback (most recent call last):
  File "C:\Users\Admin\gpt-researcher\gpt_researcher\actions\agent_creator.py", line 27, in choose_agent
    response = await create_chat_completion(
    ...<9 lines>...
    )
  File "C:\Users\Admin\gpt-researcher\gpt_researcher\utils\llm.py", line 60, in create_chat_completion
    response = await provider.get_chat_response(
        messages, stream, websocket
    )
  File "C:\Users\Admin\gpt-researcher\gpt_researcher\llm_provider\generic\base.py", line 116, in get_chat_response
    output = await self.llm.ainvoke(messages)
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\langchain_core\language_models\chat_models.py", line 307, in ainvoke
    llm_result = await self.agenerate_prompt(
    ...<8 lines>...
    )
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\langchain_core\language_models\chat_models.py", line 796, in agenerate_prompt
    return await self.agenerate(
        prompt_messages, stop=stop, callbacks=callbacks, **kwargs
    )
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\langchain_core\language_models\chat_models.py", line 756, in agenerate
    raise exceptions[0]
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\langchain_core\language_models\chat_models.py", line 924, in _agenerate_with_cache
    result = await self._agenerate(
        messages, stop=stop, run_manager=run_manager, **kwargs
    )
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\langchain_openai\chat_models\base.py", line 860, in _agenerate
    response = await self.async_client.create(**payload)
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\openai\resources\chat\completions.py", line 1661, in create
    return await self._post(
    ...<41 lines>...
    )
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\openai\_base_client.py", line 1839, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\openai\_base_client.py", line 1533, in request
    return await self._request(
    ...<5 lines>...
    )
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\openai\_base_client.py", line 1634, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.PermissionDeniedError: Error code: 403 - {'error': {'message': 'Project `proj_2VxSRsTQaqjx5PDs2D0LEpin` does not have access to model `gpt-4o`', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\uvicorn\protocols\websockets\websockets_impl.py", line 242, in run_asgi
    result = await self.app(self.scope, self.asgi_receive, self.asgi_send)  # type: ignore[func-returns-value]
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\uvicorn\middleware\proxy_headers.py", line 60, in __call__
    return await self.app(scope, receive, send)
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\fastapi\applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\starlette\applications.py", line 113, in __call__
    await self.middleware_stack(scope, receive, send)
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\starlette\middleware\errors.py", line 152, in __call__
    await self.app(scope, receive, send)
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\starlette\middleware\cors.py", line 77, in __call__
    await self.app(scope, receive, send)
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\starlette\middleware\exceptions.py", line 62, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\starlette\_exception_handler.py", line 53, in wrapped_app
    raise exc
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\starlette\_exception_handler.py", line 42, in wrapped_app
    await app(scope, receive, sender)
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\starlette\routing.py", line 715, in __call__
    await self.middleware_stack(scope, receive, send)
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\starlette\routing.py", line 735, in app
    await route.handle(scope, receive, send)
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\starlette\routing.py", line 362, in handle
    await self.app(scope, receive, send)
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\starlette\routing.py", line 95, in app
    await wrap_app_handling_exceptions(app, session)(scope, receive, send)
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\starlette\_exception_handler.py", line 53, in wrapped_app
    raise exc
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\starlette\_exception_handler.py", line 42, in wrapped_app
    await app(scope, receive, sender)
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\starlette\routing.py", line 93, in app
    await func(session)
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\site-packages\fastapi\routing.py", line 383, in app
    await dependant.call(**solved_result.values)
  File "C:\Users\Admin\gpt-researcher\backend\server\server.py", line 110, in websocket_endpoint
    await handle_websocket_communication(websocket, manager)
  File "C:\Users\Admin\gpt-researcher\backend\server\server_utils.py", line 121, in handle_websocket_communication
    await handle_start_command(websocket, data, manager)
  File "C:\Users\Admin\gpt-researcher\backend\server\server_utils.py", line 28, in handle_start_command
    report = await manager.start_streaming(
        task, report_type, report_source, source_urls, tone, websocket, headers
    )
  File "C:\Users\Admin\gpt-researcher\backend\server\websocket_manager.py", line 66, in start_streaming
    report = await run_agent(task, report_type, report_source, source_urls, tone, websocket, headers=headers, config_path=config_path)
  File "C:\Users\Admin\gpt-researcher\backend\server\websocket_manager.py", line 108, in run_agent
    report = await researcher.run()
  File "C:\Users\Admin\gpt-researcher\backend\report_type\basic_report\basic_report.py", line 41, in run
    await researcher.conduct_research()
  File "C:\Users\Admin\gpt-researcher\gpt_researcher\agent.py", line 88, in conduct_research
    self.agent, self.role = await choose_agent(
    ...<5 lines>...
    )
  File "C:\Users\Admin\gpt-researcher\gpt_researcher\actions\agent_creator.py", line 44, in choose_agent
    return await handle_json_error(response)
  File "C:\Users\Admin\gpt-researcher\gpt_researcher\actions\agent_creator.py", line 55, in handle_json_error
    json_string = extract_json_with_regex(response)
  File "C:\Users\Admin\gpt-researcher\gpt_researcher\actions\agent_creator.py", line 71, in extract_json_with_regex
    json_match = re.search(r"{.*?}", response, re.DOTALL)
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python313\Lib\re\__init__.py", line 177, in search
    return _compile(pattern, flags).search(string)
TypeError: expected string or bytes-like object, got 'NoneType'
INFO: connection closed
```

@ElishaKay
Collaborator

ElishaKay commented Nov 13, 2024

Welcome @sackosindou

Looks like this is the meaningful part of the error message:

```
Error code: 403 - {'error': {'message': 'Project `proj_2VxSRsTQaqjx5PDs2D0LEpin` does not have access to model `gpt-4o`', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}
```

It looks like you'll want to test your LLM independently with this script
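As a rough stand-in for that script, a minimal check like the one below can verify key and model access outside GPT Researcher (a sketch, not the linked script; it assumes the openai package is installed and OPENAI_API_KEY is exported):

```python
# Minimal standalone sanity check for key and model access.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment

try:
    resp = client.chat.completions.create(
        model="gpt-4o",  # substitute a model your project has access to
        messages=[{"role": "user", "content": "ping"}],
    )
    print(resp.choices[0].message.content)
except Exception as e:
    # A 403 / model_not_found here confirms the problem is the key or
    # project permissions, not GPT Researcher itself.
    print("LLM call failed:", e)
```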

If you don't have access to gpt-4o, you can try one of these other configurations.
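For example, switching the models via environment variables in the .env file looks roughly like this (a sketch; the exact variable names and provider:model format depend on your GPT Researcher version, so check the linked docs):

```
OPENAI_API_KEY=sk-...          # placeholder - use your real key
FAST_LLM=openai:gpt-4o-mini    # lightweight calls (e.g. agent selection)
SMART_LLM=openai:gpt-4o        # report writing - pick a model your project can access
```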
