Help. Building llama.cpp Library #2
What platform? Generally add
Platform: Linux, Flutter project.
The compilation was successful and I received two files: libggml_shared.so and libllama.so.
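For reference, those two shared libraries are typically produced by building llama.cpp with CMake and shared libraries enabled. This is a sketch of the usual steps; the exact flags and output directory vary between llama.cpp versions, so treat the paths below as assumptions:

```shell
# Clone llama.cpp and build it as shared libraries (requires git, cmake, and a C/C++ toolchain).
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp

# BUILD_SHARED_LIBS=ON produces .so files instead of static archives.
cmake -B build -DBUILD_SHARED_LIBS=ON
cmake --build build --config Release

# The resulting libllama.so (and, on older versions, libggml_shared.so)
# end up under build/ — the exact subdirectory depends on the version.
```

After the build finishes, copy the resulting `.so` files to wherever your application expects them.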
Place libllama.so in the root folder of your project.
I get an error when I click "load model".
@pavelprosto94, instead of spending time figuring out where Dart searches for shared libraries, I've introduced a libraryPath property to the Llama class. This allows for setting the full path of the library directly.
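A minimal sketch of how that property might be used from Dart. Both the static setter form of `libraryPath` and the constructor taking a model path are assumptions based on the description above; check the package's README for the exact API of your version:

```dart
import 'package:llama_cpp_dart/llama_cpp_dart.dart';

void main() {
  // Assumption: libraryPath is set before constructing Llama, so the
  // package knows where to find the compiled shared library.
  Llama.libraryPath = "/home/user/llama_cpp_app/libllama.so";

  // Assumption: the constructor takes the path to a model file.
  final llama = Llama("/home/user/models/model.gguf");
}
```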
@pavelprosto94, please feel free to reopen the issue if the proposed solution doesn't resolve your problem.
Hi @netdur, see https://github.com/Mobile-Artificial-Intelligence/maid. This works for Linux and Windows but not macOS and iOS. I can help you implement this logic if you'd like. Your package seems like it may be able to replace my own implementation.
@pavelprosto94 v0.0.6 may resolve your issue.
@pavelprosto94 did you successfully build this project? Could you overcome #33?
@netdur, yes, I placed it in the root folder and also set libraryPath to "/home/user/llama_cpp_app/libllama.so", but I still get the "Failed to load" error.
Please tell us in more detail how to compile llama.cpp. Which files should I copy, and where should I copy them? (I didn't find any *.so files.)
I tried doing this: