feat(tracing): add tracing to llm and llm-base crates #367
Conversation
See #331. I would love to have some better tracing/performance stats ❤️
No objections from me! Would you like to do more or should I merge this PR as-is?
We may also want to switch over llm-cli and other applications to use tracing, so that we don't have duplicate loggers.
I can update the CLI to use tracing as part of this PR as well, yeah.
Cool, go ahead and switch things over!
Signed-off-by: Radu Matei <[email protected]>
Force-pushed from d4d9b22 to c344592
Besides consuming the tracing data in an OTEL environment, as demonstrated above, you can now get basic logging with timestamps.
Excellent, thank you :)
After this PR, running the following command produces no output: `cargo run --release -- info -a bloom --model-path models/ggml-model-f16.bin`. I had to add
Hi, everyone!
First, thanks for the project!
This commit begins adding tracing and some instrumentation to this project.
As folks are beginning to look at using and optimising this project, I suspect this will be very useful.
Below is an example of sending the data to an OpenTelemetry collector and visualising it:
This is a draft PR for now, while we work out whether this is something we would like to add to the project.
If it is, this can be expanded to include more robust logging and instrumentation.