Replies: 2 comments
Well, actually @localai-bot is correct here. The
Is your feature request related to a problem? Please describe.
I am using the following configuration and a downloaded model, as I want to deploy on an instance with no internet access.
However, I get the following error at inference:
I did specify a .bin file, and my other models work fine, so in theory it should be looking in the correct folder.
Describe the solution you'd like
If the model name includes a file extension, load it as a local model, or add a parameter to force loading from a local file.
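For context, the kind of setup being described might look like the following (a sketch assuming LocalAI's YAML model-config format; the model name, backend, and file name here are hypothetical placeholders, not taken from the original report):

```yaml
# Hypothetical LocalAI model definition pointing at a locally downloaded file.
# The intent of the request: if "model" ends in a known extension such as .bin,
# resolve it inside the local models directory instead of attempting a download.
name: my-local-model
backend: llama
parameters:
  model: my-model.q4_0.bin   # already present on disk; no internet access available
```

The requested behavior would make LocalAI treat a value with a file extension as a local path (or honor an explicit "local only" parameter) rather than trying to fetch the model.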