
Possible memory leak #191

Open
Risers opened this issue Dec 2, 2024 · 2 comments


Risers commented Dec 2, 2024

Recently I was trying to stop my app from allocating memory without bound. After cutting the code down to only the problematic parts, I was left with something like this:

use std::str::FromStr;
use axum::routing::get;
use axum::Router;
use http::StatusCode;
use tracing::info;
use tracing_subscriber::layer::SubscriberExt;
use tracing_subscriber::{EnvFilter, Layer};
use axum_tracing_opentelemetry::middleware::{OtelAxumLayer, OtelInResponseLayer};
use dotenvy::dotenv;
use init_tracing_opentelemetry::tracing_subscriber_ext::build_otel_layer;

#[tokio::main]
async fn main() {
    dotenv().ok();
    info!("starting server");
    serve().await.expect("server error");
}

async fn serve() -> anyhow::Result<()> {
    let otel_log_level = EnvFilter::from_str("debug")
        .expect("error parsing otel log level from config file")
        .add_directive("otel::tracing=trace".parse()?);
    // Keep the returned guard alive for the lifetime of the server;
    // dropping it shuts down the OTLP export pipeline.
    let (layer, guard) = build_otel_layer()?;
    let subscriber = tracing_subscriber::registry()
        .with(layer.with_filter(otel_log_level));
    tracing::subscriber::set_global_default(subscriber)?;


    let app = Router::new()
        .route("/healthz", get(healthz))
        .layer(OtelInResponseLayer::default())
        .layer(OtelAxumLayer::default());

    let listener = tokio::net::TcpListener::bind("0.0.0.0:8080").await?;
    info!("server listening on {}", listener.local_addr()?);
    axum::serve(listener, app).await?;
    Ok(())
}

async fn healthz() -> Result<String, StatusCode> {
    Ok("ok".to_string())
}

My env looks like this:
OTEL_SERVICE_NAME=test
OTEL_EXPORTER_OTLP_TRACES_ENDPOINT=http://localhost:4317
OTEL_EXPORTER_OTLP_TRACES_PROTOCOL="grpc"

I'm using the Jaeger all-in-one container. The trace data is reaching it correctly.
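
For reference, a Jaeger all-in-one container with the OTLP ports exposed can be started roughly like this (image name, ports, and the COLLECTOR_OTLP_ENABLED flag are the standard Jaeger defaults, assumed here rather than taken from the report):

docker run --rm \
  -e COLLECTOR_OTLP_ENABLED=true \
  -p 16686:16686 \
  -p 4317:4317 \
  -p 4318:4318 \
  jaegertracing/all-in-one:latest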

When running this Python script:

import aiohttp
import asyncio
async def fetch(session, url):
    """Perform a single GET request."""
    async with session.get(url) as response:
        return await response.text(), response.status

async def perform_requests(session, url, num_requests):
    """Perform GET requests asynchronously."""
    tasks = [fetch(session, url) for _ in range(num_requests)]
    return await asyncio.gather(*tasks)

async def main():
    url = "http://localhost:8080/healthz"
    num_requests = 1000
    async with aiohttp.ClientSession() as session:
        print(f"Performing {num_requests} GET requests to {url}...")
        for _ in range(1000000):
            results = await perform_requests(session, url, num_requests)
if __name__ == "__main__":
    asyncio.run(main())

I see that the memory used by my Rust application increases by about a megabyte every few seconds, seemingly without bound, until it consumes all of my computer's memory.

Library versions:
axum-tracing-opentelemetry = "0.24.1"
init-tracing-opentelemetry = { version = "0.24.1", features = ["tracing_subscriber_ext"] }
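
For completeness, a [dependencies] section that builds the reproducer would look roughly like this; only the two crates above are the versions actually reported, the rest are plausible companions and may differ from what was really used:

[dependencies]
axum = "0.7"
axum-tracing-opentelemetry = "0.24.1"
init-tracing-opentelemetry = { version = "0.24.1", features = ["tracing_subscriber_ext"] }
tokio = { version = "1", features = ["full"] }
tracing = "0.1"
tracing-subscriber = { version = "0.3", features = ["env-filter"] }
http = "1"
anyhow = "1"
dotenvy = "0.15"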

@davidB davidB self-assigned this Dec 4, 2024
@davidB davidB added the bug Something isn't working label Dec 4, 2024
davidB (Owner) commented Dec 4, 2024

Thank you for the report and the analysis. I'll take a look ASAP.

marcin-ptaszynski commented

@davidB the issue seems to be related to the gRPC transport. After changing the settings to:

OTEL_EXPORTER_OTLP_TRACES_PROTOCOL=http/protobuf
OTEL_EXPORTER_OTLP_TRACES_ENDPOINT=http://localhost:4318/v1/traces

The memory leak is no longer present.
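
For the reproducer above, the full set of variables with this workaround applied would be roughly as follows (combining the original OTEL_SERVICE_NAME with the http/protobuf settings; nothing else changed):

OTEL_SERVICE_NAME=test
OTEL_EXPORTER_OTLP_TRACES_PROTOCOL=http/protobuf
OTEL_EXPORTER_OTLP_TRACES_ENDPOINT=http://localhost:4318/v1/traces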
