
Help. Building llama.cpp Library #2

Closed
pavelprosto94 opened this issue Jan 21, 2024 · 11 comments
Labels: bug (Something isn't working)

Comments

pavelprosto94 commented Jan 21, 2024

Could you explain in more detail how to compile llama.cpp?
Which files should I copy, and where should I copy them? (I didn't find any *.so files.)

I tried this:

cd ~/.pub-cache/hosted/pub.dev/llama_cpp_dart-0.0.4
git clone https://github.com/ggerganov/llama.cpp.git
mkdir build
cd build
cmake .. -DLLAMA_BLAS=ON -DLLAMA_BLAS_VENDOR=OpenBLAS
cmake --build . --config Release
netdur (Owner) commented Jan 21, 2024

What platform? Generally, add -DBUILD_SHARED_LIBS=ON to the cmake .. invocation.
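Putting that flag together with the commands above, a full Linux build might look like the sketch below (the BLAS flags are optional acceleration switches; exact output paths can vary between llama.cpp versions):

```shell
git clone https://github.com/ggerganov/llama.cpp.git
cd llama.cpp
mkdir -p build && cd build
# BUILD_SHARED_LIBS makes CMake produce .so files instead of static archives
cmake .. -DBUILD_SHARED_LIBS=ON -DLLAMA_BLAS=ON -DLLAMA_BLAS_VENDOR=OpenBLAS
cmake --build . --config Release
# the shared objects (e.g. libllama.so) end up under build/
```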

pavelprosto94 (Author) commented Jan 21, 2024

The platform is Linux; it's a Flutter project.

pavelprosto94 (Author) commented:

The compilation was successful and I got two files: libggml_shared.so and libllama.so.
Which directory should they be copied to?

netdur (Owner) commented Jan 21, 2024

Place libllama.so in the root folder of your project.
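Following that advice, a minimal sketch from the Flutter project root (the llama.cpp checkout location is illustrative; copy both shared objects, since libllama.so may depend on libggml_shared.so at load time):

```shell
# copy the freshly built libraries from the llama.cpp build tree
# into the root of the Flutter project
cp llama.cpp/build/libllama.so .
cp llama.cpp/build/libggml_shared.so .
```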

@netdur netdur self-assigned this Jan 21, 2024
@netdur netdur added the enhancement New feature or request label Jan 21, 2024
pavelprosto94 (Author) commented Jan 21, 2024

I get an error when clicking "load model":

Launching lib/main.dart on Linux in debug mode...

(boris:88419): Gdk-CRITICAL **: 22:21:13.306: gdk_window_get_state: assertion 'GDK_IS_WINDOW (window)' failed
Connecting to VM Service at ws://127.0.0.1:42281/GRcPMlo5nE8=/ws
[ERROR:flutter/runtime/dart_isolate.cc(1097)] Unhandled exception:
Invalid argument(s): Failed to lookup symbol 'llama_backend_init': /home/prosto/Sync/Projects/Flutter/boris/build/linux/x64/debug/bundle/lib/libflutter_linux_gtk.so: undefined symbol: llama_backend_init
#0      DynamicLibrary.lookup (dart:ffi-patch/ffi_dynamic_library_patch.dart:33:70)
#1      llama_cpp._llama_backend_initPtr (package:llama_cpp_dart/src/llama_cpp.dart:10187:63)
#2      llama_cpp._llama_backend_initPtr (package:llama_cpp_dart/src/llama_cpp.dart)
#3      llama_cpp._llama_backend_init (package:llama_cpp_dart/src/llama_cpp.dart:10190:7)
#4      llama_cpp._llama_backend_init (package:llama_cpp_dart/src/llama_cpp.dart)
#5      llama_cpp.llama_backend_init (package:llama_cpp_dart/src/llama_cpp.dart:10181:12)
#6      new Llama (package:llama_cpp_dart/src/llama.dart:63:9)
#7      LlamaProcessor._modelIsolateEntryPoint.<anonymous closure> (package:llama_cpp_dart/src/llama_processor.dart:89:21)
#8      _RootZone.runUnaryGuarded (dart:async/zone.dart:1594:10)
#9      _BufferingStreamSubscription._sendData (dart:async/stream_impl.dart:339:11)
#10     _BufferingStreamSubscription._add (dart:async/stream_impl.dart:271:7)
#11     _SyncStreamControllerDispatch._sendData (dart:async/stream_controller.dart:784:19)
#12     _StreamController._add (dart:async/stream_controller.dart:658:7)
#13     _StreamController.add (dart:async/stream_controller.dart:606:5)
#14     _RawReceivePort._handleMessage (dart:isolate-patch/isolate_patch.dart:184:12)

@netdur netdur added bug Something isn't working and removed enhancement New feature or request labels Jan 21, 2024
netdur added a commit that referenced this issue Jan 21, 2024
* References #2 to support loading shared library on linux
netdur (Owner) commented Jan 21, 2024

@pavelprosto94, instead of spending time figuring out where Dart searches for shared libraries, I've introduced a libraryPath property to the Llama class. This allows for setting the full path of the library directly.
Llama.libraryPath = "...";

netdur (Owner) commented Jan 21, 2024

@pavelprosto94, please feel free to reopen the issue if the proposed solution doesn't resolve your problem.

@netdur netdur closed this as completed Jan 21, 2024
danemadsen (Contributor) commented:
Hi @netdur
You can build llama.cpp as a shared library as part of the Dart build process, as I have done in maid:

https://github.com/Mobile-Artificial-Intelligence/maid

This works for Linux, Windows, and Android, but not macOS and iOS. I can help you implement this logic if you'd like. Your package looks like it could replace my own implementation.

netdur (Owner) commented Jan 23, 2024

@pavelprosto94 potentially v0.0.6 may resolve your issue.

NavodPeiris commented:
@pavelprosto94 did you successfully build this project? Were you able to get past #33?

Vinayak006 commented:

> Place libllama.so in the root folder of your project.

@netdur, yes, I placed it in the root folder and also set the libraryPath to "/home/user/llama_cpp_app/libllama.so", but I still get a failed-to-load error:

Failed to load dynamic library 'libllama.so': dlopen failed: library "libggml.so" not found: needed by /data/app/~~vsx70-PCu9HEgN69Rg13NQ==/com.example.llama_cpp_app-pngGCil9UMpOP0g4TnVHSA==/base.apk!/lib/arm64-v8a/libllama.so in namespace classloader-namespace
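The message above means the dynamic linker found libllama.so inside the APK but could not resolve its own dependency, libggml.so, which was never packaged. A sketch of a fix, assuming a standard Flutter/NDK project layout (the jniLibs path is illustrative):

```shell
# list the NEEDED entries baked into the library; every one of them
# must ship next to libllama.so (on a desktop Linux host,
# `ldd libllama.so` shows the same information with resolved paths)
llvm-readelf -d libllama.so | grep NEEDED

# bundle the dependency in the same per-ABI directory as the main library
cp build/libggml.so   android/app/src/main/jniLibs/arm64-v8a/
cp build/libllama.so  android/app/src/main/jniLibs/arm64-v8a/
```

On desktop Linux, an alternative is `patchelf --set-rpath '$ORIGIN' libllama.so`, so the loader searches for dependencies next to the library itself; on Android the APK's per-ABI lib directory is already on the search path, so shipping both .so files there is enough.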

5 participants