refactor(ai): apply some small improvements (#612)
This commit fixes a few things that were unclear and caused confusion.
rickstaa authored Jul 31, 2024
1 parent 6ebde1c commit dca9e00
Showing 2 changed files with 6 additions and 5 deletions.
ai/orchestrators/models-config.mdx (4 changes: 2 additions & 2 deletions)

@@ -19,12 +19,12 @@ currently **recommended** models and their respective prices.
 [
   {
     "pipeline": "text-to-image",
-    "model_id": "ByteDance/SDXL-Lightning",
+    "model_id": "SG161222/RealVisXL_V4.0_Lightning",
     "price_per_unit": 4768371
   },
   {
     "pipeline": "image-to-image",
-    "model_id": "ByteDance/SDXL-Lightning",
+    "model_id": "timbrooks/instruct-pix2pix",
     "price_per_unit": 4768371
   },
   {
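For context, a minimal `aiModels.json` containing just the two updated recommendations might look like the sketch below. The target path is a placeholder and the full recommended file lists additional pipelines; this only shows the entries changed in this commit.

  # Sketch only: the target path is a placeholder; point it wherever your
  # -aiModels flag expects the file. The full recommended config lists more
  # pipelines than the two entries updated in this commit.
  mkdir -p ~/.lpData
  cat > ~/.lpData/aiModels.json <<'EOF'
  [
    {
      "pipeline": "text-to-image",
      "model_id": "SG161222/RealVisXL_V4.0_Lightning",
      "price_per_unit": 4768371
    },
    {
      "pipeline": "image-to-image",
      "model_id": "timbrooks/instruct-pix2pix",
      "price_per_unit": 4768371
    }
  ]
  EOF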
ai/orchestrators/start-orchestrator.mdx (7 changes: 4 additions & 3 deletions)

@@ -86,7 +86,7 @@ Please follow the steps below to start your AI Subnet Orchestrator node:
- `-aiModels`: This flag sets the path to the JSON file that contains the AI models.
- `-aiModelsDir`: This flag indicates the directory where the AI models are stored on the host machine.
- `-aiRunnerImage`: This optional flag specifies which version of the ai-runner image is used. Example: `livepeer/ai-runner:0.0.2`

Moreover, the `--network host` flag facilitates communication between the AI Orchestrator and the AI Runner container.

<Warning>Please note that since we use [docker-out-of-docker](https://tdongsi.github.io/blog/2017/04/23/docker-out-of-docker/), the `aiModelsDir` path should be defined as being on the host machine.</Warning>
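For orientation, a rough sketch of how the flags above could fit into the orchestrator's docker command. The image tag, the mounted paths, and the remaining orchestrator flags (omitted here) are assumptions; use the values given in this guide. Note that `-aiModelsDir` is passed a host-machine path, per the warning above.

  # Rough sketch only. The image tag and data paths are assumptions, and the
  # rest of the orchestrator flags from this guide are omitted for brevity.
  docker run --network host \
    -v ~/.lpData:/root/.lpData \
    livepeer/go-livepeer:ai-video \
    -aiModels /root/.lpData/aiModels.json \
    -aiModelsDir ~/.lpData/models \
    -aiRunnerImage livepeer/ai-runner:0.0.2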
@@ -203,8 +203,9 @@ Once the AI Subnet Orchestrator node is up and running, validate its operation
 by sending an AI inference request directly to the
 [ai-runner](https://hub.docker.com/r/livepeer/ai-runner) container. The most
 straightforward way to do this is through the
-[swagger UI](https://fastapi.tiangolo.com/features/) interface, accessible at
-`http://localhost:8000/docs`.
+[Swagger UI](https://fastapi.tiangolo.com/features/) interface, accessible at
+`http://localhost:8000/docs` if you have loaded the `text-to-image` pipeline. Note
+that other pipelines will have different endpoints.
 ![Swagger UI interface](/images/ai/swagger_ui.png)
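Outside the Swagger UI, the same check could be scripted. The sketch below assumes the loaded `text-to-image` pipeline is exposed at `/text-to-image` and accepts a JSON body with a `prompt` field; the exact routes and schema are listed at `http://localhost:8000/docs`.

  # Sketch: send a test request to the ai-runner container, assuming the
  # text-to-image pipeline is served at /text-to-image with a "prompt" field.
  # Check http://localhost:8000/docs for the exact endpoint and schema.
  curl -s -X POST http://localhost:8000/text-to-image \
    -H "Content-Type: application/json" \
    -d '{"prompt": "a photorealistic mountain lake at sunrise"}'

A successful JSON response suggests the runner container is serving the loaded pipeline.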
