
High memory usage #3269

Closed
schurmann opened this issue Aug 14, 2023 · 4 comments
Labels: awaiting user (waiting for user to respond)

@schurmann

Brief summary

I'm evaluating k6 as a load-testing framework and it looks really promising. However, high memory usage is one of the things holding me back when running a soak test: memory usage increases almost monotonically during the test and will eventually be too much. I'm following the recommendations by running with --no-summary --no-thresholds and using Prometheus remote write as the --out sink.

k6 version

0.45

OS

Amazon Linux (EKS)

Docker version and image (if applicable)

grafana/k6:0.45.0

Steps to reproduce the problem

k6 parameters (Kubernetes container spec):

spec:
  containers:
  - args:
    - run
    - --no-summary
    - --no-thresholds
    - --config
    - /k6-scripts/config.json
    - --out
    - experimental-prometheus-rw
    - /k6-scripts/test.js
    env:
    - name: K6_PROMETHEUS_RW_SERVER_URL
      value: <URL>
    - name: K6_PROMETHEUS_RW_TREND_STATS
      value: p(50),p(95),p(99)
    - name: K6_PROMETHEUS_RW_PUSH_INTERVAL
      value: 1m
test.js (abridged; the setup code that defines target and data.authorization is not shown):

import http from 'k6/http';
import { check } from 'k6';

export default function (data) {
    const options = {
        headers: {
            Authorization: data.authorization
        },
    };
    const res = http.get(target, options);
    check(res, {
        'protocol is HTTP/2': (r) => r.proto === 'HTTP/2.0',
        'status is 200': (r) => r.status === 200,
    });
}
config.json (mounted at /k6-scripts/config.json):

{
    "discardResponseBodies": true,
    "scenarios": {
        "constant_request_rate": {
            "executor": "constant-arrival-rate",
            "rate": 4000,
            "timeUnit": "1s",
            "duration": "48h",
            "preAllocatedVUs": 150
        }
    },
    "thresholds": {
        "http_req_duration{scenario:constant_request_rate}": [
            {
                "threshold": "p(99)<1000",
                "abortOnFail": true
            }
        ],
        "checks{check:status is 200}": [
            {
                "threshold": "rate>=0.99",
                "abortOnFail": false
            }
        ]
    }
}

Expected behaviour

I expect memory usage not to keep increasing, since the metrics should be flushed every minute.

Actual behaviour

[Screenshot: memory usage graph climbing steadily for the duration of the test]

@codebien
Contributor

Hi @schurmann,
can you please check that you're not hitting the high-cardinality issue reported in #2762 (comment)?

Does your request return a big response body?

@schurmann
Author

The target is a single static URL and there is no response body. In this example I don't use any tags either so the cardinality is minimal.

@codebien
Contributor

Thanks for sharing. The high memory usage should then be related to using Trend stats with the Prometheus remote write output.

Under the hood, the Trend stats use the same machinery as the end-of-test Summary, so they are affected by the issue described in the comment I linked. The Prometheus remote write output doesn't take the --no-summary --no-thresholds options into account, so it will still allocate a lot of memory for long-running tests.

Do you have a specific requirement for using Trend stats instead of native histograms? Native histograms would reduce memory usage a lot for long-running tests with the Prometheus remote write output. Of course, you still have to keep using the --no-summary --no-thresholds options.
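
For reference, a minimal sketch of what that switch could look like in the container spec above, assuming the experimental Prometheus remote write output's K6_PROMETHEUS_RW_TREND_AS_NATIVE_HISTOGRAM option and a Prometheus server started with --enable-feature=native-histograms (K6_PROMETHEUS_RW_TREND_STATS is dropped since it no longer applies):

spec:
  containers:
  - args:
    - run
    - --no-summary
    - --no-thresholds
    - --config
    - /k6-scripts/config.json
    - --out
    - experimental-prometheus-rw
    - /k6-scripts/test.js
    env:
    - name: K6_PROMETHEUS_RW_SERVER_URL
      value: <URL>
    # emit k6 Trend metrics as Prometheus native histograms instead of p(50)/p(95)/p(99) series
    - name: K6_PROMETHEUS_RW_TREND_AS_NATIVE_HISTOGRAM
      value: "true"
    - name: K6_PROMETHEUS_RW_PUSH_INTERVAL
      value: 1m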

@codebien added the awaiting user (waiting for user to respond) label and removed the bug label on Aug 15, 2023
@schurmann
Author

Thanks, I tried using native histograms and memory usage was indeed much better. I will close this issue. 🙂

[Screenshot: memory usage graph after switching to native histograms]
