Replies: 1 comment
---
Hi there, you should save the model into the BentoML model store with `bentoml.models.create`:

```python
import bentoml

with bentoml.models.create("my-mlflow-model") as bentomodel:
    # <save logic with mlflow code here>
    ...
```
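For example, the save logic could download the artifacts from Databricks straight into the model store entry. Here is a minimal sketch, assuming MLflow's Databricks integration is configured via environment variables; the registered-model URI is a placeholder:

```python
import bentoml
import mlflow

# Authenticates against the workspace via DATABRICKS_HOST / DATABRICKS_TOKEN.
mlflow.set_tracking_uri("databricks")

with bentoml.models.create("my-mlflow-model") as bentomodel:
    # Download the remote artifacts directly into the model store entry.
    # "models:/my-registered-model/1" is a hypothetical registered-model URI.
    mlflow.artifacts.download_artifacts(
        artifact_uri="models:/my-registered-model/1",
        dst_path=bentomodel.path,
    )
```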
resources={"cpu": "2"},
)
class IrisModel:
mlflow_model = bentoml.models.get('my-mlflow-model')
def __init__(self) -> None:
self.model = mlflow.load(self.mlflow_model.path)
@bentoml.api
def predict(self, input_data: list):
rv : np.ndarray = self.model.predict(np.array(input_data))
return rv.tolist() For bentofile I think you can just do this envs:
- name: DATABRICKS_HOST
- name: DATABRICKS_TOKEN No need to put the value there, you can inject the value during runtime or deployment config later. |
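Once the service is running (for example via `bentoml serve`), a quick smoke test of the `predict` endpoint could look like this; the host and port assume a default local setup:

```python
import bentoml

# Call the running IrisModel service over HTTP (default local port assumed).
with bentoml.SyncHTTPClient("http://localhost:3000") as client:
    result = client.predict(input_data=[[5.1, 3.5, 1.4, 0.2]])
    print(result)
```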
---
I have set up a simple Bento project and I need to download model artifacts from Azure Databricks. A simple `download_model.py` script does the downloading, the Bento service lives in `service.py`, and a bentofile describes the build.
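Roughly, the download step does something like the following (a simplified sketch; the tracking URI is MLflow's Databricks shorthand, and the run ID and artifact path are placeholders):

```python
import mlflow

# Authenticates via the DATABRICKS_HOST / DATABRICKS_TOKEN environment variables.
mlflow.set_tracking_uri("databricks")

# Download the model artifacts from a tracked run to a local directory.
# The run ID and artifact path below are placeholders.
local_path = mlflow.artifacts.download_artifacts(
    run_id="0123456789abcdef",
    artifact_path="model",
    dst_path="./downloaded_model",
)
print(f"Model artifacts downloaded to {local_path}")
```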
The Bento is built and containerized via a GitHub Actions workflow. This works, but I have two concerns about it. So I tried to separate the download and service logic, with an adjusted bentofile and workflow... but then, how do I get the downloaded model into the local BentoML model store of the created container? Can you provide some best practice here? Thanks in advance!