How to use Azure OpenAI with Kaizen? #261
Replies: 7 comments
-
I updated my .env:

```json
{
  "language_model": {
    "provider": "litellm",
    "enable_observability_logging": true,
    "model_config": {
      "model": "azure/gpt-4o",
      "input_cost_per_token": 0.0000005,
      "output_cost_per_token": 0.0000015
    },
    "models": {
      "best": {
        "model": "azure/gpt-4o",
        "input_cost_per_token": 0.000005,
        "output_cost_per_token": 0.000015,
        "type": "best"
      },
      "default": {
        "model": "azure/gpt-4o",
        "input_cost_per_token": 0.0000005,
        "output_cost_per_token": 0.0000015,
        "type": "default"
      }
    }
  },
  "github_app": {
    "check_signature": false,
    "auto_pr_review": true,
    "edit_pr_desc": true
  }
}
```

But I get the following error when running `python generate.py`:
```
Couldnt find config at config.json loading default vals
Generating UI tests for `https://cloudcode.ai`, please wait...
Installing dependencies...
Switching to root user to install dependencies...
Hit:2 https://download.docker.com/linux/ubuntu jammy InRelease
Hit:1 https://apt.llvm.org/jammy llvm-toolchain-jammy-17 InRelease
Hit:4 http://archive.ubuntu.com/ubuntu jammy InRelease
Hit:5 http://security.ubuntu.com/ubuntu jammy-security InRelease
Hit:3 https://packagecloud.io/github/git-lfs/ubuntu jammy InRelease
Hit:6 http://archive.ubuntu.com/ubuntu jammy-updates InRelease
Hit:7 http://archive.ubuntu.com/ubuntu jammy-backports InRelease
Hit:8 https://ppa.launchpadcontent.net/git-core/ppa/ubuntu jammy InRelease
Hit:9 https://ppa.launchpadcontent.net/ondrej/apache2/ubuntu jammy InRelease
Hit:10 https://ppa.launchpadcontent.net/ondrej/nginx-mainline/ubuntu jammy InRelease
Hit:11 https://ppa.launchpadcontent.net/ondrej/php/ubuntu jammy InRelease
Reading package lists... Done
Reading package lists... Done
Building dependency tree... Done
Reading state information... Done
fonts-freefont-ttf is already the newest version (20120503-10build1).
fonts-liberation is already the newest version (1:1.07.4-11).
....
xvfb is already the newest version (2:21.1.4-2ubuntu1.7~22.04.10).
libxml2 is already the newest version (2.9.14+dfsg-0.1+ubuntu22.04.1+deb.sury.org+1).
0 upgraded, 0 newly installed, 0 to remove and 39 not upgraded.
Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.
Error: litellm.AuthenticationError: AuthenticationError: OpenAIException - Traceback (most recent call last):
  File "/workspace/.pyenv_mirror/user/current/lib/python3.12/site-packages/litellm/llms/openai.py", line 825, in completion
    raise e
  File "/workspace/.pyenv_mirror/user/current/lib/python3.12/site-packages/litellm/llms/openai.py", line 762, in completion
    openai_client = self._get_openai_client(
                    ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/workspace/.pyenv_mirror/user/current/lib/python3.12/site-packages/litellm/llms/openai.py", line 639, in _get_openai_client
    _new_client = OpenAI(
                  ^^^^^^^
  File "/workspace/.pyenv_mirror/user/current/lib/python3.12/site-packages/openai/_client.py", line 104, in __init__
    raise OpenAIError(
openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable
```
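The traceback shows the OpenAI client being constructed with no key at all. For `azure/...` model names, LiteLLM reads `AZURE_API_KEY`, `AZURE_API_BASE`, and `AZURE_API_VERSION` from the environment; if they are missing it can end up falling back to the plain OpenAI client, which then demands `OPENAI_API_KEY`. A quick way to see which variables are actually visible to the process is a small check like this (a sketch, not part of Kaizen):

```python
import os

# Environment variables LiteLLM expects for "azure/..." model names.
REQUIRED = ["AZURE_API_KEY", "AZURE_API_BASE", "AZURE_API_VERSION"]

def missing_azure_vars(env=None):
    """Return the Azure-related variables that are unset or empty."""
    if env is None:
        env = os.environ
    return [name for name in REQUIRED if not env.get(name)]

if __name__ == "__main__":
    missing = missing_azure_vars()
    if missing:
        print("Missing:", ", ".join(missing))
    else:
        print("All Azure variables are set.")
```

Running this in the same shell just before `python generate.py` tells you whether the `.env` values actually made it into the environment.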
-
Hey, I think your filename may be wrong. We use `config.json`. Otherwise, share the output of this command from your folder:
-
Hi, I am using `config.json`. Here is the output:

```
├── cli
│   ├── kaizen_cli
│   ├── poetry.lock
│   ├── pyproject.toml
│   ├── README.md
│   └── tests
├── CODE_OF_CONDUCT.md
├── config.json
├── docker-compose-dev.yml
├── docker-compose.yml
├── Dockerfile
├── docs
│   ├── next.config.js
│   ├── package.json
│   ├── package-lock.json
│   ├── pages
│   └── theme.config.jsx
├── examples
│   ├── basic
│   ├── ui_review
│   └── work_summarizer
├── github_app
│   ├── github_helper
│   └── main.py
├── kaizen
│   ├── actors
│   ├── generator
│   ├── helpers
│   ├── __init__.py
│   ├── integrations
│   ├── llms
│   ├── __pycache__
│   ├── reviewer
│   └── utils
├── LICENSE
├── poetry.lock
├── pyproject.toml
├── README.md
└── tests
    ├── actions
    ├── data
    ├── helpers
    ├── __init__.py
    └── llms

25 directories, 19 files
```
-
I think there is a possibility that your environment variables are not set properly. Can you try running `set -a; source .env; set +a` and then running the above code again?
-
I tried the command and got the same result (i.e. the error). Running `set -a; source .env; set +a` gives:

```
bash: os.environ[AZURE_API_KEY]=bbbxxxxxxxxxxxxxxxxxxxxxxx9b: command not found
bash: os.environ[AZURE_API_BASE]=https://xxx-xxxxxxxxx-noxxxxxxxx.openai.azure.com/: No such file or directory
bash: os.environ[AZURE_API_VERSION]=2023-05-15: command not found
```
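Those `bash: os.environ[...]: command not found` errors mean the `.env` file contains Python-style assignments. When the file is loaded with `source .env`, each line must be a plain shell `KEY=value` assignment instead. A minimal sketch of what the `.env` should look like (the key, endpoint, and version here are placeholders; keep your real values):

```shell
# .env — plain shell assignments: no "os.environ", no spaces around "="
AZURE_API_KEY=your-azure-key
AZURE_API_BASE=https://your-resource.openai.azure.com/
AZURE_API_VERSION=2023-05-15
```

With this format, `set -a; source .env; set +a` exports all three variables so child processes such as `python generate.py` can see them.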
-
Do you want to connect on a call? Join this Discord, and maybe we can hop on a quick call to solve this.
-
Head to the Using Custom LLMs section at https://cloudcode.ai/kaizen/docs/
-
This is a great question that Edwin asked.
We use LiteLLM providers internally, which allows us to support any LLM provider.
For Azure, here are the steps:
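In short: set `AZURE_API_KEY`, `AZURE_API_BASE`, and `AZURE_API_VERSION` in the environment, and point the model name in `config.json` at `azure/<your-deployment>`. A minimal sketch of the naming convention and the equivalent direct LiteLLM call (`gpt-4o` is an assumed deployment name; replace it with yours):

```python
def azure_model_name(deployment):
    """Build the model string for an Azure deployment.

    LiteLLM routes any model prefixed with "azure/" to the Azure OpenAI
    client, so the value in config.json should be "azure/<deployment-name>".
    """
    return "azure/" + deployment

# With the three AZURE_* variables exported, the equivalent direct
# LiteLLM call would look like:
#
#   from litellm import completion
#   resp = completion(
#       model=azure_model_name("gpt-4o"),
#       messages=[{"role": "user", "content": "Hello"}],
#   )

print(azure_model_name("gpt-4o"))  # azure/gpt-4o
```

Kaizen builds this call for you from `config.json`, so normally you only need the env vars plus the `azure/...` model names shown in the config at the top of this thread.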