Add trace-id as a field for other tracing layers #625

Closed
bjchambers opened this issue Aug 31, 2021 · 18 comments
Labels
A-trace (Area: issues related to tracing), enhancement (New feature or request), release:after-stable (Not required before Stable release, and not going to work on before Stable Release)

Comments

@bjchambers

I'm trying to set logging and tracing up for use with Grafana Loki and Grafana Tempo. I'd like to be able to get the otel Trace ID and include it in the logs. I don't see any way to do this.

I'm using:

  • tracing - to create spans / events within my Rust code
  • tracing-subscriber to add two subscribers, including one to format logs as JSON.
  • tracing-opentelemetry to use opentelemetry as a subscriber.
  • opentelemetry and opentelemetry-otlp to report traces to an OTEL collector.

I've looked in a variety of places:

  1. Configuring the JSON formatter from tracing-subscriber to add the trace ID. I was thinking I could then get the opentelemetry Context, which should have the trace ID, and just add that as a field. But I couldn't find a way to hook into the JSON formatting and add this code.
  2. Configuring the opentelemetry layer to add these fields, thinking that maybe they would just show up. I couldn't find a way to configure this.
  3. Adding another layer that added the fields based on retrieving them from opentelemetry. Also couldn't find this.

Is this something that is currently possible? If so, are there any examples how to have the logs include the trace ID? If it's not possible, any thoughts on where it could be added? I'm happy to take a stab at it if it's not already possible.
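
For reference, here is a minimal sketch of the stack described above (assuming roughly the opentelemetry-otlp and tracing-opentelemetry builder APIs of that era; the exact pipeline methods and feature flags vary by version, so treat this as illustrative):

    use tracing_subscriber::prelude::*;

    fn init_subscribers() {
        // Export spans to an OTel collector over OTLP; the exact builder
        // calls depend on the opentelemetry-otlp version in use.
        let tracer = opentelemetry_otlp::new_pipeline()
            .tracing()
            .with_exporter(opentelemetry_otlp::new_exporter().tonic())
            .install_batch(opentelemetry::runtime::Tokio)
            .expect("failed to install OTLP trace pipeline");

        tracing_subscriber::registry()
            // JSON-formatted logs (destined for Loki). This is the layer that
            // would need access to the OTel trace ID, but there is no obvious
            // hook to inject it.
            .with(tracing_subscriber::fmt::layer().json())
            // Bridge tracing spans into OpenTelemetry (destined for Tempo).
            .with(tracing_opentelemetry::layer().with_tracer(tracer))
            .init();
    }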

@TommyCpp
Contributor

Generally, it's not possible directly. Without tracing, you can get the current span via the get_active_span method. With tracing, things may be a little more complicated. The end goal is to build the log part of opentelemetry here, but it may be worth it to explore the option to include the trace id and span id in tracing's JSON subscribers.
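
For example, without the tracing bridge in the picture, something like this reads the trace id straight from the OTel context (a minimal sketch; the function name is illustrative):

    use opentelemetry::trace::get_active_span;

    fn active_trace_id() -> opentelemetry::trace::TraceId {
        // The active span lives on the OpenTelemetry Context, so the trace id
        // can be read directly from its SpanContext.
        get_active_span(|span| span.span_context().trace_id())
    }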

Personally, I think option 1 would be the most reasonable, but it may also need some input from the tracing side.

@jtescher any thoughts?

@jtescher
Member

Not sure that there is a great way of doing this currently. You can manually get an otel context from tracing via Span::current().context() and then get the id via context.span().span_context().trace_id(), which would then let you add it in any log locations, but it's not very ergonomic.
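
A rough sketch of that manual approach (assuming tracing-opentelemetry's OpenTelemetrySpanExt and a recent opentelemetry API; helper names are illustrative, and the trace id formatting may differ by version):

    use opentelemetry::trace::TraceContextExt;        // for Context::span()
    use tracing_opentelemetry::OpenTelemetrySpanExt;  // for Span::context()

    fn current_trace_id() -> String {
        // Resolve the OTel context attached to the current tracing span,
        // then pull the trace id out of its span context.
        let cx = tracing::Span::current().context();
        cx.span().span_context().trace_id().to_string()
    }

    fn log_with_trace_id(message: &str) {
        // The id has to be attached manually at every call site, which is
        // what makes this approach unergonomic.
        tracing::info!(trace_id = %current_trace_id(), "{message}");
    }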

@bjchambers
Author

Would it be possible to add some kind of hook to the log formatter or the log layer that retrieved additional fields? Then I could use that to add a hook that retrieved the otel trace and added that when formatting.

@TommyCpp
Contributor

TommyCpp commented Sep 1, 2021

Would it be possible to add some kind of hook to the log formatter or the log layer that retrieved additional fields?

That sounds like an interesting topic to explore. We could try something on the tracing side.

@bjchambers
Author

Filed an issue in tracing to see if this would be possible to add over there.

@TommyCpp TommyCpp added the A-trace Area: issues related to tracing label Jan 9, 2022
@bnjjj
Contributor

bnjjj commented Nov 17, 2022

I recently did it in a text formatter here, if it helps: https://github.com/apollographql/router/blob/dev/apollo-router/src/plugins/telemetry/formatters/text.rs#L120
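
The gist of that formatter is a custom FormatEvent implementation that reads the trace id out of the span extensions populated by the tracing-opentelemetry layer. A rough, simplified sketch (it ignores the parent-context fallback that the linked formatter also handles):

    use std::fmt;
    use tracing::{Event, Subscriber};
    use tracing_subscriber::fmt::{format::Writer, FmtContext, FormatEvent, FormatFields};
    use tracing_subscriber::registry::LookupSpan;

    struct TraceIdFormat;

    impl<S, N> FormatEvent<S, N> for TraceIdFormat
    where
        S: Subscriber + for<'a> LookupSpan<'a>,
        N: for<'a> FormatFields<'a> + 'static,
    {
        fn format_event(
            &self,
            ctx: &FmtContext<'_, S, N>,
            mut writer: Writer<'_>,
            event: &Event<'_>,
        ) -> fmt::Result {
            // The tracing-opentelemetry layer stores OtelData in the span's
            // extensions; the trace id (when recorded) lives on the span builder.
            if let Some(span) = ctx.lookup_current() {
                let extensions = span.extensions();
                if let Some(otel) = extensions.get::<tracing_opentelemetry::OtelData>() {
                    if let Some(trace_id) = otel.builder.trace_id {
                        write!(writer, "trace_id={} ", trace_id)?;
                    }
                }
            }
            // Delegate the rest of the line to the configured field formatter.
            ctx.field_format().format_fields(writer.by_ref(), event)?;
            writeln!(writer)
        }
    }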

@hdost hdost added this to the Tracing API And SDK Stable milestone Mar 2, 2023
@hdost hdost added the release:after-stable Not required before Stable release, and not going to work on before Stable Release label Aug 1, 2023
@cijothomas
Member

@bjchambers Pinging on an old issue in light of new developments!
https://github.com/open-telemetry/opentelemetry-rust/tree/main/opentelemetry-appender-tracing is available now, which allows logs from tracing to be exported as LogRecords. You can use the OTLP log exporter to send them to OTLP endpoints, including Loki (possibly via the Loki exporter in the collector).
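
A rough sketch of the wiring (the LoggerProvider construction with an OTLP exporter is elided because that builder API has shifted between releases; the type path and function name here are illustrative, so see the linked crate for version-specific details):

    use opentelemetry_appender_tracing::layer::OpenTelemetryTracingBridge;
    use tracing_subscriber::prelude::*;

    fn init_log_bridge(logger_provider: &opentelemetry_sdk::logs::LoggerProvider) {
        // Convert tracing events into OTel LogRecords...
        let otel_log_layer = OpenTelemetryTracingBridge::new(logger_provider);

        // ...and register the bridge alongside any other layers (fmt, filters, ...).
        tracing_subscriber::registry().with(otel_log_layer).init();

        // From here, tracing::info!/warn!/error! events go to whatever exporter
        // the LoggerProvider was configured with (OTLP, stdout, in-memory, ...).
    }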

@jonkight

I also need trace_ids to exist across traces and logs, and agree pulling from a span context is not ergonomic.

Pinging on an old issue in light of new developments! https://github.com/open-telemetry/opentelemetry-rust/tree/main/opentelemetry-appender-tracing is available now, which would allow logs from tracing to be exported as logrecords. You can use OTLP log Exporter to send to OTLP endpoints, including Loki (possibly with Loki exporter from collector)

@cijothomas I actually tried to get this to work earlier this week. I was unsuccessful because I couldn't get an OTLP exporter (like tonic in this example) to work without using code on the main branch, which won't work with the latest tracing_opentelemetry because of the breaking changes on main (e.g. the Tracer trait was moved between crates).

Technically, I believe the basic opentelemetry-appender-tracing example would work if I were simply scraping from the logs created by stdout via a container daemon, but I'd prefer to send them to the opentelemetry collector over tonic, similar to traces.

Did I miss something with my exploration or do I need to wait until new releases?

@cijothomas
Member

basic opentelemetry-appender-tracing example would work if I were simply scraping from the logs created by stdout via a container daemon

No, that's not how it works. It has nothing to do with stdout. opentelemetry-appender-tracing leverages opentelemetry's logging bridge API and converts tracing events to OTel's LogRecord. Then it can be exported by whatever exporter is configured: OTLP, stdout, in-memory, or something else.

https://github.com/open-telemetry/opentelemetry-rust/blob/main/opentelemetry-otlp/examples/basic-otlp/src/main.rs#L65 This should work as-is, but if it doesn't, can you list the exact steps you followed so we can fix that example?

tracing_opentelemetry being broken could be temporary, as it has to adjust to breaking changes from this repo. Do you have a tracking issue in that repo for this?

@jonkight

jonkight commented Oct 20, 2023

No, thats not how it works.

@cijothomas Apologies, let me try restating my problem in a different way.

I tried to use this example, but with an exporter like tonic and not stdout. I could not figure out how to do this without using changes on otel main, due to the lack of any examples. Because I was now referencing main directly via a cargo git dependency, I could not use tracing_opentelemetry because the compatibility was broken due to refactors. Things being incompatible is expected given that I was not using published crates, and I'm sure tracing_opentelemetry will address the breaking changes otel has made once new crates are published on the otel side.

The code I fiddled with looked something like this:

    let test = opentelemetry_sdk::logs::LoggerProvider::builder()
        .with_batch_exporter(
            opentelemetry_otlp::new_exporter().tonic().with_env(),
            opentelemetry::runtime::Tokio,
        )
        .build();

The error was:

the trait opentelemetry_sdk::export::logs::LogExporter is not implemented for TonicExporterBuilder

I found this on main which I thought was relevant for future releases, which is why I decided to play with it, but ended up hitting the mentioned incompatibility blocker.

With that said, let me take an even further step back since I could be misunderstanding the purpose of the appender, as you noted. I am using the tracing crate and would like to take tracing events and send them to the opentelemetry collector as logs using a tonic exporter, similar to how I have successfully done with traces. I assumed that's what the appender could help me with.

Are there any working examples for the use case I'm after using currently released crates? Is it not possible until a future release, or am I misunderstanding the purpose of the appender?

@cijothomas
Member

I am using the tracing crate and would like to take tracing events and send them to the opentelemetry collector as logs using a tonic exporter, similar to how I have successfully done with traces. I assumed that's what the appender could help me with.

Yes this is exactly the purpose of opentelemetry-appender-tracing!

The examples at https://github.com/open-telemetry/opentelemetry-rust/tree/main/opentelemetry-otlp/examples are supposed to help with that! These should be working examples. Apart from the incompatibility issue with tracing-opentelemetry, are there other issues you faced?

Separately, tracing-opentelemetry could duplicate the telemetry, since it can send the events as SpanEvents while the appender sends them as LogRecords. This is something that needs a fix, tracked by #1111.

@jonkight

@cijothomas I could be mistaken, but those examples seem to never set the GlobalLoggingProvider and rely on the default global one, which is a NoopLoggingProvider (here). I haven't tested the examples, but my guess would be that logging in these examples would do nothing, given the naming choice of Noop.

Comparing this example to the one in opentelemetry-appender-tracing, I still feel there's a problem of tying things together. In the opentelemetry-appender-tracing example, there's a LoggerProvider::builder() to which you can provide an exporter, but currently there is no published tonic / batch exporter that meets the trait bounds (I could be wrong here). In the example you linked, there's no provider or bridge that can interact with the tracing crate.

If you know of or can provide any working examples for opentelemetry-appender-tracing crate that export to an OTEL endpoint over tonic, I'd love to see them.

@cijothomas
Member

cijothomas commented Oct 23, 2023

https://github.com/open-telemetry/opentelemetry-rust/blob/main/opentelemetry-otlp/examples/basic-otlp-http/src/main.rs

These examples should work! i.e., the logs emitted using info! and other macros would land in the OTEL Collector as LogRecords.
@lalitb added it a week ago.

@jonkight

@cijothomas I appreciate the quick reply. I'll try and find time to play with it in the near future.

@mayakerostasia

mayakerostasia commented Dec 5, 2023

I still haven't been able to get this example to work =/

Here's kinda what's gotten close, but I still don't have any connection between the logs and the traces:

https://gist.github.com/nwisemanII/05f1e0e879133d51f6edfcf04a7f5a8d


edit: ( addition )

The only other thing I can possibly think of is doing something really strange like exporting to an InMemory log and then replaying that into the tracer? Something along those lines. I figured there was probably an easier way to do it.

@germankuber

germankuber commented Mar 5, 2024

Hello guys. I have the same issue. I am using this example (https://github.com/open-telemetry/opentelemetry-rust/blob/main/opentelemetry-otlp/examples/basic-otlp/src/main.rs#L65) and it works perfectly.
When I use my logs (info!, warn!, etc.) inside of tracer.in_span("Sub operation...", |cx| { ... }), the trace_id and span_id are added to the log without any problem.
The problem is when I want to use #[tracing::instrument]; in this case the context is not added to the log.
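
To make the contrast concrete, a minimal sketch of the two patterns (function names are illustrative):

    use opentelemetry::global;
    use opentelemetry::trace::Tracer;

    fn manual_otel_span() {
        let tracer = global::tracer("example");
        tracer.in_span("Sub operation...", |_cx| {
            // Works: trace_id and span_id show up on the exported log record.
            tracing::info!("inside an explicit OTel span");
        });
    }

    #[tracing::instrument]
    fn instrumented() {
        // Not correlated today: the span created by #[tracing::instrument]
        // is not visible to the OTel log appender (see the reply below).
        tracing::info!("inside a tracing-instrumented function");
    }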

Any idea how I can add the context automatically to my logs?

@lalitb
Member

lalitb commented Mar 5, 2024

@germankuber We have an open issue #1378 for the problem. There is no solution as of now, though it is one of the important issues to be resolved. There is some work done in #1394 if someone wants to continue on that. Otherwise, please circle back in a couple of months.

@cijothomas
Member

I'll close this issue, as we have #1378 to track this specific issue and also #1571 for the overall interop story with tokio-tracing.
