
[FR]: Add VertexAI Gemini Support to Opik #312

Open
pleomax0730 opened this issue Sep 24, 2024 · 3 comments
Labels
enhancement New feature or request

Comments

@pleomax0730

pleomax0730 commented Sep 24, 2024

Willingness to contribute

No. I can't contribute this feature at this time.

Proposal summary

I would like to request adding support for VertexAI’s Gemini models in Opik. This would allow users like me to use Gemini models for testing, monitoring, and evaluating our LLM applications within Opik. Adding support in LangChain would also be much appreciated.

Motivation

Why This is Important:

  • More Model Options: Adding VertexAI Gemini means we can work with more powerful models alongside the ones already available in Opik.
  • Easy Experimentation: We can easily test our LLM applications using Gemini models and compare results with other models using Opik’s tools.
  • Improved Monitoring: It would help track and monitor Gemini model performance in production, using Opik’s tracing and error logging.
  • CI/CD Integration: We could include Gemini model evaluations in our CI/CD workflows to ensure everything works well before deploying.
@pleomax0730 added the enhancement (New feature or request) label Sep 24, 2024
@jverre
Collaborator

jverre commented Sep 25, 2024

Hi @pleomax0730

Thanks for the suggestion, we will take a look.

Can you share a little bit more about how you use Vertex, and maybe a code snippet or two? This will help make sure we create the right integration.

@tarrade

tarrade commented Oct 10, 2024

Hi @jverre,

I am also interested in support for Google models (LLMs like Gemini 1.5 Pro or Flash) via the Vertex AI Python SDK. Here are links to the Vertex AI Python SDK, which covers the whole ML ecosystem on Google Cloud Platform [1][2].

Here is a simple example using Google's LLMs on GCP [3]. Of course, this requires a GCP account.

Thanks

[1] https://github.com/googleapis/python-aiplatform
[2] https://cloud.google.com/python/docs/reference/aiplatform/latest
[3] https://github.com/GoogleCloudPlatform/generative-ai/blob/main/gemini/getting-started/intro_gemini_1_5_flash.ipynb

@jverre
Collaborator

jverre commented Oct 11, 2024

Thanks @tarrade, we are looking at releasing a first-party integration with Gemini.

In the meantime, are you familiar with LiteLLM? It's a really nice package that lets you query many different model providers through a uniform interface, and we have an integration with it. So you could do:

import json

import litellm
# Configure the Opik logger callback for LiteLLM
from litellm.integrations.opik.opik import OpikLogger

opik_logger = OpikLogger()
litellm.callbacks = [opik_logger]

# Get credentials: either run
#   gcloud auth application-default login
# to add Vertex credentials to your environment, or load a
# service-account key file:
file_path = 'path/to/vertex_ai_service_account.json'

# Load the JSON key file
with open(file_path, 'r') as file:
    vertex_credentials = json.load(file)

# LiteLLM expects the credentials as a JSON string
vertex_credentials_json = json.dumps(vertex_credentials)

# Completion call against Gemini Pro on Vertex AI
response = litellm.completion(
    model="vertex_ai/gemini-pro",
    messages=[{"role": "user", "content": "Hello, how are you?"}],
    vertex_credentials=vertex_credentials_json,
)

You can learn more about it in the LiteLLM docs and the Opik docs.
