
Streamed Responses aren't Coming Through Until Finalized #1120

Open
svenakela opened this issue Aug 2, 2024 · 9 comments

Comments

@svenakela

Description

NextJs uses streams to send data asynchronously to web clients.

When a NextJs application is deployed behind a Caddy server extended with the Coraza WAF, streamed responses are held back by the SecRuleEngine. For example, if NextJs wants the browser to render a spinner until data has been fetched in the background, the spinner never shows up and the browser stays silent until the data finally comes through.

I've tried disabling all body access, and I've tried writing a rule that allows any response. Nothing helps except disabling SecRuleEngine completely.
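
To illustrate the kind of directives meant here (this is only a sketch, not the exact rules used in the deployment, and it loads them through Coraza's Go API instead of the Caddyfile):

package main

import (
	"fmt"

	"github.com/corazawaf/coraza/v3"
)

func main() {
	// Sketch of the kind of directives tried: response body access turned
	// off, plus a rule that unconditionally lets responses pass. These are
	// illustrative, not the rules from the example repo.
	directives := `
SecRuleEngine On
SecResponseBodyAccess Off
SecRule RESPONSE_STATUS "@unconditionalMatch" "id:1000,phase:3,pass,nolog"
`
	if _, err := coraza.NewWAF(coraza.NewWAFConfig().WithDirectives(directives)); err != nil {
		panic(err)
	}
	fmt.Println("directives parsed without error")
}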

Steps to reproduce

We've set up an example repo that reproduces the problem. The project consists of a small NextJs application behind a Caddy server. The issue appears by default in this repo.

https://github.com/svenakela/coraza-streaming-issue

Disable SecRuleEngine and the app works as expected.

Expected result

In this example project the bottom text should change to Loading... until data is sent by the app server.

Actual result

No textual change at all. The bottom text only changes once the full response is ready in the app server.

@RedXanadu

RedXanadu commented Aug 5, 2024

Wow! This is perhaps the most fully documented, step-by-step issue I've ever seen on GitHub. Thank you @svenakela!

I was easily able to reproduce your described issue, thanks to your instructions and repo with the pre-prepared docker-compose file (I doubt I would have had the time to look at this issue, otherwise.)

I made the following observations:

  1. Using ModSecurity v3 + Nginx seems to work as intended. The behaviour from going via the ModSec+Nginx proxy seems to be the same as going directly to the nextjs-spinner container: "Loading..." is displayed.
  2. Using ModSecurity v2 + Apache results in the same behaviour as Coraza + Caddy: the ModSec+Apache proxy seems to wait until it has all chunks of the response before sending a single byte back to the client.
  3. With ModSecurity v2 + Apache, setting SecResponseBodyAccess to Off makes things work as expected: now that the proxy isn't interested in the response data it sends the chunks to the client as soon as they're available and "Loading..." is displayed.
  4. With Coraza + Caddy, setting SecResponseBodyAccess to Off makes no difference. Either this setting isn't being respected or Coraza and/or Caddy will always wait for a full response before sending anything back to the client. I'm not sure which is the case, but I would guess maybe the latter.

I hope this helps drive your issue forwards!

@jcchavezs
Member

According to https://github.com/corazawaf/coraza-caddy/blob/c2e0fbdc9c648550df8b2466ee5bf86bebbf2494/interceptor.go#L89-L104, if the response body isn't accessible or processable we skip buffering. Would you mind trying with https://github.com/jcchavezs/coraza-httpbin, @RedXanadu?

@svenakela
Author

Wow! This is perhaps the most fully documented, step-by-step issue I've ever seen on GitHub. Thank you @svenakela!
...
I made the following observations:
....
4. With Coraza + Caddy, setting SecResponseBodyAccess to Off makes no difference. Either this setting isn't being respected or Coraza and/or Caddy will always wait for a full response before sending anything back to the client. I'm not sure which is the case, but I would guess maybe the latter.

Thanks for the kind words.
4. Yes, that was my observation too: SecResponseBodyAccess made no difference at all. Disabling SecRuleEngine makes the issue go away, and a clean Caddy installation behaves as expected, so I would think Caddy is innocent.

@svenakela
Author

According to https://github.com/corazawaf/coraza-caddy/blob/c2e0fbdc9c648550df8b2466ee5bf86bebbf2494/interceptor.go#L89-L104, if the response body isn't accessible or processable we skip buffering. Would you mind trying with https://github.com/jcchavezs/coraza-httpbin, @RedXanadu?

I haven't checked the code, but is there a guarantee that the tx settings are set from the config?
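
One way to check (a quick diagnostic sketch against the v3 Go API; coraza.conf here is just a placeholder for whatever file the Caddy module loads):

package main

import (
	"fmt"

	"github.com/corazawaf/coraza/v3"
)

func main() {
	// Load the same directives and ask a fresh transaction what it reports,
	// to see whether SecResponseBodyAccess actually made it from the config
	// into the transaction settings.
	waf, err := coraza.NewWAF(coraza.NewWAFConfig().
		WithDirectivesFromFile("coraza.conf"))
	if err != nil {
		panic(err)
	}
	tx := waf.NewTransaction()
	defer tx.Close()
	fmt.Println("response body accessible:", tx.IsResponseBodyAccessible())
}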

@svenakela
Author

How can I help move this lil' bug of ours forward?

@TAR5

TAR5 commented Aug 17, 2024

I can confirm that this does not seem to be related to Caddy.

Here's a minimal example to reproduce the issue:

package main

import (
	"log"
	"net/http"
	"time"

	"github.com/corazawaf/coraza/v3"
	txhttp "github.com/corazawaf/coraza/v3/http"
)

func main() {
	// SecRuleEngine Off = unbuffered, SecRuleEngine On = buffered
	cfg := coraza.NewWAFConfig().WithDirectivesFromFile("coraza.conf")
	waf, err := coraza.NewWAF(cfg)
	if err != nil {
		log.Fatal(err)
	}
	log.Fatal(http.ListenAndServe(":8000", txhttp.WrapHandler(waf, http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		w.WriteHeader(http.StatusOK)
		w.Write([]byte("Hello "))
		if flusher, ok := w.(http.Flusher); ok {
			flusher.Flush() // first chunk should reach the client right away
		}
		time.Sleep(5 * time.Second)
		w.Write([]byte("world!"))
	}))))
}
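
As a side note (a client sketch added for illustration, not part of the original comment), this makes the buffering visible from the client side: with SecRuleEngine On both chunks should arrive together after the sleep, with it Off the first chunk should arrive immediately.

package main

import (
	"fmt"
	"net/http"
	"time"
)

func main() {
	// Fetch from the repro server above and print when each chunk arrives.
	resp, err := http.Get("http://localhost:8000")
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	start := time.Now()
	buf := make([]byte, 64)
	for {
		n, err := resp.Body.Read(buf)
		if n > 0 {
			fmt.Printf("%5.1fs: %q\n", time.Since(start).Seconds(), buf[:n])
		}
		if err != nil {
			break
		}
	}
}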

@TAR5

TAR5 commented Aug 17, 2024

It seems that the ResponseWriter is always wrapped, and the "response processor" outputs the buffered response?
https://github.com/corazawaf/coraza/blob/main/http/interceptor.go#L140-L147
https://github.com/corazawaf/coraza/blob/main/http/middleware.go#L163-L171
https://github.com/corazawaf/coraza-caddy/blob/main/coraza.go#L133-L140

Is it really necessary to wrap the ResponseWriter if tx.IsResponseBodyAccessible() && tx.IsResponseBodyProcessable() is false?
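
As an illustration of that suggestion (a standalone sketch, not the actual coraza or coraza-caddy code; bufferingWriter is a made-up stand-in for the interceptor):

package main

import "net/http"

// bufferingWriter stands in for Coraza's interceptor: it holds the body back
// instead of sending it, which is what breaks streaming.
type bufferingWriter struct {
	http.ResponseWriter
	body []byte
}

func (b *bufferingWriter) Write(p []byte) (int, error) {
	b.body = append(b.body, p...) // held until the body has been evaluated
	return len(p), nil
}

// wrapIfInspected sketches the idea above: only swap in the buffering writer
// when the response body will actually be inspected; otherwise keep the
// original writer (and its http.Flusher) so chunks stream straight through.
func wrapIfInspected(w http.ResponseWriter, accessible, processable bool) http.ResponseWriter {
	if accessible && processable {
		return &bufferingWriter{ResponseWriter: w}
	}
	return w
}

func main() {
	_ = wrapIfInspected // sketch only; not wired into a real server here
}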

@agclark27

We're interested in disabling response buffering when SecResponseBodyAccess Off is set, too, and there is a related issue in the coraza-caddy project.

@svenakela
Author

What do we think about this one? Are we reaching any conclusions?
