Commit
Merge pull request #278 from janhq/update-submodule-2024-11-01-17-07
Update llama.cpp submodule to latest release b4007
Showing 1 changed file with 1 addition and 1 deletion.
Submodule llama.cpp updated: 12 files
+13 −24   examples/server/server.cpp
+47 −5    examples/server/utils.hpp
+3 −11    ggml/include/ggml.h
+3 −1     ggml/src/CMakeLists.txt
+1 −1     ggml/src/ggml-backend.cpp
+4 −2     ggml/src/ggml-cuda.cu
+42 −0    ggml/src/ggml-kompute.cpp
+54 −118  ggml/src/ggml.c
+9 −0     ggml/src/kompute-shaders/common.comp
+133 −0   ggml/src/kompute-shaders/op_mul_mat_q4_k.comp
+1 −1     scripts/sync-ggml.last
+122 −93  src/llama.cpp
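A submodule bump like this commit changes exactly one file in the superproject: the gitlink recording which llama.cpp commit is pinned. The sketch below reproduces that workflow end to end with throwaway local repositories; the tag name `b4007` comes from the commit above, while the stand-in repo layout, an assumed older tag `b3999`, and the user identity are illustrative assumptions, not part of the original repository.

```shell
set -e
tmp=$(mktemp -d)
cd "$tmp"

# Stand-in "upstream" repo playing the role of llama.cpp, with an old
# and a new release tag (b3999 is an assumed prior tag for illustration).
git init -q upstream
git -C upstream -c user.email=ci@example.com -c user.name=ci commit -q --allow-empty -m "old release"
git -C upstream tag b3999
git -C upstream -c user.email=ci@example.com -c user.name=ci commit -q --allow-empty -m "new release"
git -C upstream tag b4007

# Superproject with the submodule initially pinned at the old tag.
# protocol.file.allow=always is needed for file:// submodules in recent git.
git init -q super
cd super
git -c protocol.file.allow=always submodule add -q "$tmp/upstream" llama.cpp
git -C llama.cpp checkout -q b3999
git -c user.email=ci@example.com -c user.name=ci commit -qam "Pin llama.cpp at b3999"

# The actual bump: move the submodule checkout to the new release tag,
# then commit the updated gitlink in the superproject.
git -C llama.cpp checkout -q b4007
git add llama.cpp
git -c user.email=ci@example.com -c user.name=ci commit -qm "Update llama.cpp submodule to latest release b4007"

# The superproject diff shows a single changed file: the gitlink.
git diff HEAD~1 --stat
```

The `git diff HEAD~1 --stat` at the end reports one changed file (`llama.cpp`), matching the "1 changed file with 1 addition and 1 deletion" summary above: the addition and deletion are the new and old pinned commit hashes.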