What endpoint to use for embeddings? #1089
Answered by jamesbraza · asked by jamesbraza in Q&A
I am using LocalAI. The feature page at https://localai.io/features/embeddings/ talks about embeddings, but it does not say which API endpoint to call. What endpoint should be used for embeddings? Can you provide an example?
Answered by jamesbraza on Sep 21, 2023 · 1 comment
Okay, I figured this out:

> curl http://localhost:8080/embeddings -X POST -H "Content-Type: application/json" -d '{
"input": "Your text string goes here",
"model": "bert-embeddings"
}' | jq "."
...
{
"object": "list",
"model": "bert-embeddings",
"data": [
{
"embedding": [
0.051830754,
...,
-0.023937061
],
"index": 0,
"object": "embedding"
}
],
"usage": {
"prompt_tokens": 0,
"completion_tokens": 0,
"total_tokens": 0
}
}

What do you think of me adding this to the docs?
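For reference, the same request can be made from Python. The following is a minimal sketch, not part of LocalAI itself: it assumes the same local server (http://localhost:8080) and the same `bert-embeddings` model as the curl example above, and the `get_embedding` helper name is purely illustrative.

```python
# Minimal sketch: call a local LocalAI /embeddings endpoint from Python.
# Assumes the server and model from the curl example above; adjust as needed.
import requests

def get_embedding(text: str, model: str = "bert-embeddings") -> list[float]:
    """Return the embedding vector for `text` from a local /embeddings endpoint."""
    resp = requests.post(
        "http://localhost:8080/embeddings",
        json={"input": text, "model": model},
        timeout=30,
    )
    resp.raise_for_status()
    # The response follows the schema shown above:
    # {"object": "list", "data": [{"embedding": [...], "index": 0, ...}], ...}
    return resp.json()["data"][0]["embedding"]

if __name__ == "__main__":
    vector = get_embedding("Your text string goes here")
    print(len(vector), vector[:3])
```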
Answer selected by jamesbraza