
Direct Buffers may not be released for certain HTTP/2 clients #12661

Open
jrauschenbusch opened this issue Dec 20, 2024 · 0 comments
Labels
Bug For general bugs on Jetty side

Comments


jrauschenbusch commented Dec 20, 2024

Jetty version(s)

Jetty v12.0.16

Jetty Environment

  • Jetty embedded in Scala application
  • Libraries in use:
    • org.eclipse.jetty:jetty-server:12.0.16
    • org.eclipse.jetty.ee10:jetty-ee10-servlet:12.0.16
    • org.eclipse.jetty.http2:jetty-http2-server:12.0.16

Java version/vendor (use: java -version)

openjdk version "25-ea" 2025-09-16
OpenJDK Runtime Environment (build 25-ea+2-135)
OpenJDK 64-Bit Server VM (build 25-ea+2-135, mixed mode, sharing)

OS type/version

  • Docker Image: openjdk:25-slim-bullseye
  • Debian GNU/Linux 11 (bullseye)
  • Kubernetes Container w/ QoS Guaranteed (4 CPU, 10 GiB Memory)

Description

When using Mittens (https://github.com/ExpediaGroup/mittens) as a warmup tool running inside a Kubernetes Pod (configured as a sidecar), which performs a high volume of concurrent HTTP/2 requests, it seems that direct buffers are not released properly by Jetty (or the underlying OS).

After a short time the Kubernetes container exits with status 137 (OOMKilled). Analyzing heap and non-heap memory did not bring any insights. A deeper analysis with several tools (JFR, Eclipse MAT, Native Memory Tracking, ...) indicated that native memory allocations of type=Other were the reason for the OOMKill. These allocations increased continuously but never decreased. After some time I stumbled over the direct buffer settings of Jetty. I also tried to find out more via the ByteBufferPool.Tracking#dump() output, but from that point of view nothing indicated a bigger issue.
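The growth can also be watched from inside the JVM: the JDK exposes the direct buffer pool through a platform MXBean. A minimal Java sketch (the class name DirectBufferStats is mine) that reads the same counters JFR and NMT were hinting at:

```java
import java.lang.management.BufferPoolMXBean;
import java.lang.management.ManagementFactory;

public class DirectBufferStats {
    /** Returns bytes currently used by the "direct" buffer pool, or -1 if absent. */
    public static long directMemoryUsed() {
        for (BufferPoolMXBean pool :
                ManagementFactory.getPlatformMXBeans(BufferPoolMXBean.class)) {
            if ("direct".equals(pool.getName())) {
                return pool.getMemoryUsed();
            }
        }
        return -1;
    }

    public static void main(String[] args) {
        // On a leaking server this number only ever climbs.
        System.out.println("direct buffers in use: " + directMemoryUsed() + " bytes");
    }
}
```

Polling this periodically (or exposing it as a metric) makes it easy to see whether direct memory ever shrinks after the warmup load stops.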

Then I disabled input direct buffers and the problem was gone with the same Mittens warmup configuration.
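For reference, the workaround is a one-liner on the HttpConfiguration (shown here as a Java sketch; the Scala repro below has the equivalent commented out):

```java
// Workaround: read request payloads into heap buffers instead of direct buffers.
HttpConfiguration httpConfig = new HttpConfiguration();
httpConfig.setUseInputDirectByteBuffers(false);
// Output direct buffers were left enabled; only input buffers were implicated here.
```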

A load test with 7k req/s and direct buffers enabled (this time without the Mittens warmup sidecar) did not lead to an OOMKill. In that setup an Envoy proxy was in front of the Java application, and the load test tool was a custom NodeJS implementation.

The issue did not occur with Jetty 11.0.24 configured identically to use direct buffers for input and output. It seems something has changed underneath regarding buffer handling.

During my tests I was able to make the following observations:

  • Using ZGC instead of G1 makes the problem even worse: the OOMKill comes much faster than with G1.
  • Using -XX:MaxDirectMemorySize=4g stabilized the Java app, but Jetty no longer accepted all requests and rejected some, which led to EOFs in the Mittens warmup tool.
  • Using a new ByteBufferPool.NonPooling() pool was better than using the ArrayByteBufferPool (the OOMKill came later).
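The pool swap from the last observation looks roughly like this (a Java sketch against the Jetty 12 API; threadPool is assumed to exist):

```java
// Sketch: replace the default ArrayByteBufferPool with a non-pooling one.
// This delayed, but did not prevent, the OOMKill in the tests above.
ByteBufferPool bufferPool = new ByteBufferPool.NonPooling();
Server server = new Server(threadPool, null /* scheduler */, bufferPool);
```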

How to reproduce?

Create an application with the following configuration:

import jakarta.servlet.http.{HttpServlet, HttpServletRequest, HttpServletResponse}
import org.eclipse.jetty.ee10.servlet.{ServletContextHandler, ServletHolder}
import org.eclipse.jetty.http2.server.HTTP2CServerConnectionFactory
import org.eclipse.jetty.server.{HttpConfiguration, HttpConnectionFactory, Server, ServerConnector}
import org.eclipse.jetty.util.thread.QueuedThreadPool

val threadPool = new QueuedThreadPool()
val server = new Server(threadPool)
val httpConfig = new HttpConfiguration()
// Disabling input direct buffers makes the problem disappear:
//httpConfig.setUseInputDirectByteBuffers(false)
//httpConfig.setUseOutputDirectByteBuffers(false)
val connector = new ServerConnector(server, new HttpConnectionFactory(httpConfig), new HTTP2CServerConnectionFactory(httpConfig))
connector.setHost("0.0.0.0")
connector.setPort(8080)
server.addConnector(connector)

val servletContextHandler = new ServletContextHandler()
servletContextHandler.addServlet(new ServletHolder(new DataServlet), "/postData")
servletContextHandler.addServlet(new ServletHolder(new HealthzServlet), "/healthz")
server.setHandler(servletContextHandler)
server.start()

class HealthzServlet extends HttpServlet {
  override def doGet(req: HttpServletRequest, resp: HttpServletResponse): Unit = {
    // empty
  }
}

class DataServlet extends HttpServlet {
  override def doPost(req: HttpServletRequest, resp: HttpServletResponse): Unit = {
    // empty
  }
}
docker run mittens:latest \
       --concurrency=1000 \
       --concurrency-target-seconds=100 \
       --max-duration-seconds=600 \
       --max-warmup-seconds=180 \
       --max-readiness-wait-seconds=240 \
       --target-readiness-http-path=/healthz \
       --target-http-protocol=h2c \
       --http-requests=post:/postData:file:/tmp/warmup.json \
       --http-requests-compression=gzip \
       --http-headers=content-type:application/json \
       --fail-readiness=true

Example of the warmup.json structure (filled with actual content in practice):

{
  "object1": {
    ...
  },
  "object2": {
    ...
  },
  "object3": {
    ...
  },
  "array": [{
    ...
    "object4": {
      ...
    }
  }]
}
@jrauschenbusch jrauschenbusch added the Bug For general bugs on Jetty side label Dec 20, 2024