community[patch]: additional check for prompt caching support (langchain-ai#29008)

Prompt caching explicitly excludes `gpt-4o-2024-05-13`:
https://platform.openai.com/docs/guides/prompt-caching

Resolves langchain-ai#28997
ccurme authored Jan 3, 2025
1 parent 4de52e7 commit 0185010
Showing 1 changed file with 4 additions and 2 deletions.
libs/community/langchain_community/callbacks/openai_info.py
@@ -204,8 +204,10 @@ def standardize_model_name(
         or ("finetuned" in model_name and "legacy" not in model_name)
     ):
         return model_name + "-completion"
-    if token_type == TokenType.PROMPT_CACHED and (
-        model_name.startswith("gpt-4o") or model_name.startswith("o1")
+    if (
+        token_type == TokenType.PROMPT_CACHED
+        and (model_name.startswith("gpt-4o") or model_name.startswith("o1"))
+        and not (model_name.startswith("gpt-4o-2024-05-13"))
     ):
         return model_name + "-cached"
     else:
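
For reference, a minimal standalone sketch of the updated behavior (not the library code itself; the TokenType enum and cached_suffix helper below are illustrative stand-ins): only gpt-4o and o1 model names other than gpt-4o-2024-05-13 receive the "-cached" pricing suffix.

from enum import Enum, auto


class TokenType(Enum):
    # Illustrative stand-in for the token-type enum used in openai_info.py
    PROMPT = auto()
    PROMPT_CACHED = auto()
    COMPLETION = auto()


def cached_suffix(model_name: str, token_type: TokenType) -> str:
    # Hypothetical helper mirroring the new condition: append "-cached" only for
    # gpt-4o / o1 models, excluding gpt-4o-2024-05-13, which has no prompt
    # caching support per the OpenAI docs linked in the commit message.
    if (
        token_type == TokenType.PROMPT_CACHED
        and (model_name.startswith("gpt-4o") or model_name.startswith("o1"))
        and not model_name.startswith("gpt-4o-2024-05-13")
    ):
        return model_name + "-cached"
    return model_name


assert cached_suffix("gpt-4o-2024-08-06", TokenType.PROMPT_CACHED) == "gpt-4o-2024-08-06-cached"
assert cached_suffix("gpt-4o-2024-05-13", TokenType.PROMPT_CACHED) == "gpt-4o-2024-05-13"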
