-
To resolve the issue where LlamaIndex does not recognize the 'gpt-4o' model in your Azure configuration, you need to ensure the model name is accepted as valid. One way to do this is to subclass the Azure multi-modal LLM and include 'gpt-4o' in its list of valid models:

```python
from typing import Optional

from llama_index.multi_modal_llms.azure_openai import AzureOpenAIMultiModal


class CustomAzureOpenAIMultiModal(AzureOpenAIMultiModal):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        # Extend the list of model names that pass validation.
        self.valid_models = [
            "gpt-4-vision-preview",
            "gpt-4-turbo-2024-04-09",
            "gpt-4-turbo",
            "gpt-4o",
            "gpt-4o-2024-05-13",
            "gpt-4",  # Add this line
        ]

    def _is_valid_engine(self, engine: Optional[str]) -> bool:
        return engine in self.valid_models


# Use the custom class
mm_llm = CustomAzureOpenAIMultiModal(
    model="gpt-4o",
    deployment_name="gpt-4o",
    azure_endpoint=azure_endpoint,
    api_key=api_key,
    openai_api_type="azure",
    openai_api_version="2023-05-15",
    temperature=0,
)

response = mm_llm.complete(
    prompt="Describe the image",
    image_documents=image_path,  # complete() expects ImageDocument objects; see the loading sketch below
)

llm = CustomAzureOpenAIMultiModal(
    model="gpt-4o",
    deployment_name="gpt-4o",
    api_key=api_key,
    azure_endpoint=azure_endpoint,
    api_version=api_version,
    temperature=0,
)
```

This should resolve the issue.
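A note on the `image_documents` argument above: `complete()` expects a sequence of image documents rather than a raw file path. A minimal loading sketch, assuming `image_path` points to a local image file:

```python
from llama_index.core import SimpleDirectoryReader

# Load the image file into ImageDocument objects (assumes image_path is a local file path).
image_documents = SimpleDirectoryReader(input_files=[image_path]).load_data()

response = mm_llm.complete(
    prompt="Describe the image",
    image_documents=image_documents,
)
```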
-
@dosu It's not working. I am getting the error below.
-
@dosu Error again :-(
-
This was an issue with utils.py: the version of the file in the repository is up to date with all the models, but the copy on my PC was not updated even after I upgraded the package. I replaced my local copy with the up-to-date file and the error was resolved. Closing the ticket.
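For context, the failing check lives in the OpenAI utils module that maps model names to context windows. A rough, illustrative sketch of the kind of entries the up-to-date file contains (the exact contents and values here are assumptions; the authoritative list ships with a current llama-index-llms-openai release):

```python
# Illustrative only: the shape of the model map in the up-to-date utils.py.
GPT4_MODELS = {
    "gpt-4": 8192,
    "gpt-4-32k": 32768,
    "gpt-4-turbo": 128000,
    "gpt-4o": 128000,
    "gpt-4o-2024-05-13": 128000,
}
```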
-
We configured the gpt-4o model in our environment and it works fine with other packages. However, it is not working with LlamaIndex. Any thoughts?
My config:
```
File ~\AppData\Roaming\Python\Python311\site-packages\llama_index\core\response_synthesizers\factory.py:73, in get_response_synthesizer(llm, prompt_helper, service_context, text_qa_template, refine_template, summary_template, simple_template, response_mode, callback_manager, use_async, streaming, structured_answer_filtering, output_cls, program_factory, verbose)
     67     prompt_helper = service_context.prompt_helper
     68 else:
     69     prompt_helper = (
     70         prompt_helper
     71         or Settings._prompt_helper
     72         or PromptHelper.from_llm_metadata(
---> 73             llm.metadata,
     74         )
     75     )
...
    207     f" {', '.join(ALL_AVAILABLE_MODELS.keys())}"
    208 )
    209 return ALL_AVAILABLE_MODELS[modelname]

ValueError: Unknown model 'gpt-4o'. Please provide a valid OpenAI model name in: gpt-4, gpt-4-32k, gpt-4-1106-preview, gpt-4-0125-preview
```
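A quick way to confirm whether the installed package knows about gpt-4o (a minimal check, assuming the modular llama-index-llms-openai layout):

```python
# Minimal diagnostic (assumes the llama-index-llms-openai package layout).
from llama_index.llms.openai.utils import ALL_AVAILABLE_MODELS

print("gpt-4o" in ALL_AVAILABLE_MODELS)  # False means the local utils.py predates gpt-4o
```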