
coraza-proxy-wasm support #454

Closed
ivanitskiy opened this issue Nov 30, 2023 · 11 comments
@ivanitskiy commented Nov 30, 2023

I was trying to see if https://github.com/corazawaf/coraza-proxy-wasm would work with this module.

The wasm module can be downloaded from: https://github.com/corazawaf/coraza-proxy-wasm/releases/download/0.4.0/coraza-proxy-wasm-0.4.0.zip

error.log file:

2023/11/30 12:10:01 [error] 63932#0: *1 [wasm] error while executing at wasm backtrace:
    0: 0x1db40d - <unknown>!(*github.com/corazawaf/coraza-proxy-wasm/wasmplugin.wafMetrics).incrementCounter
    1: 0x1d98ea - <unknown>!proxy_on_request_headers
note: using the `WASMTIME_BACKTRACE_DETAILS=1` environment variable may show more debugging information

Caused by:
    host trap (function not yet implemented): proxy_define_metric <module: "coraza", vm: "main", runtime: "wasmtime">

Creating this issue to track completeness and compatibility with a more complex proxy-wasm filter such as coraza-proxy-wasm.

nginx config:

daemon off;
worker_processes  auto;
master_process    off;

events {
    worker_connections  2048;
}

wasm {
    module coraza coraza-proxy-wasm.wasm;
}

http {
    server {
        listen 9000;

        location / {
            proxy_wasm coraza;
            proxy_pass http://127.0.0.1:8000/;
        }
    }

    server {
        listen 8000;
        location / {
            return 200 "Hello, World!";
        }
    }
}

sending a request to nginx:

curl -v localhost:9000
*   Trying 127.0.0.1:9000...
* Connected to localhost (127.0.0.1) port 9000 (#0)
> GET / HTTP/1.1
> Host: localhost:9000
> User-Agent: curl/8.1.2
> Accept: */*
> 
< HTTP/1.1 500 Internal Server Error
< Content-Type: text/html
< Content-Length: 177
< Connection: close
< Server: nginx/1.25.3
< Date: Thu, 30 Nov 2023 20:10:01 GMT
< 
<html>
<head><title>500 Internal Server Error</title></head>
<body>
<center><h1>500 Internal Server Error</h1></center>
<hr><center>nginx/1.25.3</center>
</body>
</html>
* Closing connection 0
@ivanitskiy (Author)
As this module fails with `function not yet implemented: proxy_define_metric`, and ngx_wasm_module does not yet support metrics, maybe it makes sense to have a dummy "do nothing" host implementation of such metric-related calls, so that wasm modules can at least work partially?
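An alternative to a host-side NOP is a guest-side guard. The sketch below is purely illustrative (it is not the actual coraza-proxy-wasm `wafMetrics` implementation, and the field and metric names are made up): the filter gates optional host calls behind a flag, so a missing `proxy_define_metric` degrades gracefully instead of trapping the instance.

```go
package main

import "fmt"

// Hypothetical guest-side guard: metric calls become no-ops when the host
// is known to lack proxy_define_metric. Illustration only, not the real
// coraza-proxy-wasm code.
type wafMetrics struct {
	hostHasMetrics bool              // false on hosts without the metrics ABI
	counters       map[string]uint64 // stand-in for host-defined counters
}

func (m *wafMetrics) incrementCounter(fqn string) {
	if !m.hostHasMetrics {
		return // skip instead of calling an unimplemented host function
	}
	m.counters[fqn]++ // a real filter would call the proxy-wasm metrics API here
}

func main() {
	m := &wafMetrics{hostHasMetrics: false, counters: map[string]uint64{}}
	m.incrementCounter("waf_filter.tx.total") // silently skipped on this host
	fmt.Println("counters recorded:", len(m.counters))
}
```

With `hostHasMetrics` set to false the counter map stays empty and the request proceeds, which is roughly the behavior the no-op suggestion above would give, but decided by the filter rather than the host.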

@ivanitskiy (Author)

Adding the debug-level error log, just in case:

2023/11/30 12:52:27 [debug] 74030#0: bind() 0.0.0.0:9000 #6 
2023/11/30 12:52:27 [debug] 74030#0: bind() 0.0.0.0:8000 #7 
2023/11/30 12:52:27 [info] 74030#0: [wasm] initializing "main" wasm VM <vm: "main", runtime: "wasmtime">
2023/11/30 12:52:27 [debug] 74030#0: wasm initializing "main" vm engine (engine: 00007FAC4D014320)
2023/11/30 12:52:27 [info] 74030#0: [wasm] using wasmtime with compiler: "auto" (backtraces: 0) <vm: "main", runtime: "wasmtime">
2023/11/30 12:52:27 [debug] 74030#0: wasm loading "coraza" module bytes from "/Users/ivanitskiy/go/src/github.com/kong/ngx_wasm_module/coraza-proxy-wasm.wasm" (module: 00007FAC4D014850, engine: 00007FAC4D014320)
2023/11/30 12:52:27 [debug] 74030#0: malloc: 00007FAC4F800000:10487039
2023/11/30 12:52:27 [debug] 74030#0: read: 8, 00007FAC4F800000, 10487039, 0
2023/11/30 12:52:27 [info] 74030#0: [wasm] "main" wasm VM initialized <vm: "main", runtime: "wasmtime">
2023/11/30 12:52:27 [debug] 74030#0: wasm initializing tls
2023/11/30 12:52:27 [notice] 74030#0: using the "kqueue" event method
2023/11/30 12:52:27 [notice] 74030#0: nginx/1.25.3 (ngx_wasm_module [dev debug wasmtime])
2023/11/30 12:52:27 [notice] 74030#0: built by clang 15.0.0 (clang-1500.0.40.1)
2023/11/30 12:52:27 [notice] 74030#0: OS: Darwin 22.6.0
2023/11/30 12:52:27 [notice] 74030#0: hw.ncpu: 12
2023/11/30 12:52:27 [notice] 74030#0: net.inet.tcp.sendspace: 131072
2023/11/30 12:52:27 [notice] 74030#0: kern.ipc.somaxconn: 128
2023/11/30 12:52:27 [notice] 74030#0: getrlimit(RLIMIT_NOFILE): 8192:9223372036854775807
2023/11/30 12:52:27 [debug] 74030#0: write: 8, 00007FF7B54D6820, 6, 0
2023/11/30 12:52:27 [debug] 74030#0: add cleanup: 00007FAC4D038AB0
2023/11/30 12:52:27 [debug] 74030#0: malloc: 00007FAC4DA43520:8
2023/11/30 12:52:27 [debug] 74030#0: wasm initializing "main" vm engine (engine: 00007FAC4D014320)
2023/11/30 12:52:27 [info] 74030#0: [wasm] using wasmtime with compiler: "auto" (backtraces: 0) <vm: "main", runtime: "wasmtime">
2023/11/30 12:52:27 [info] 74030#0: [wasm] loading "coraza" module <vm: "main", runtime: "wasmtime">


2023/11/30 12:52:27 [info] 74030#0: [wasm] successfully loaded "coraza" module in 2921ms <vm: "main", runtime: "wasmtime">
2023/11/30 12:52:27 [debug] 74030#0: malloc: 00007FAC4D032600:16384
2023/11/30 12:52:27 [debug] 74030#0: malloc: 00007FAC4D442000:16384
2023/11/30 12:52:27 [debug] 74030#0: malloc: 00007FAC4FBA6000:475136
2023/11/30 12:52:27 [debug] 74030#0: malloc: 00007FAC4FDEC000:212992
2023/11/30 12:52:27 [debug] 74030#0: malloc: 00007FAC4F832000:212992
2023/11/30 12:52:27 [debug] 74030#0: kevent set event: 6: ft:-1 fl:0005
2023/11/30 12:52:27 [debug] 74030#0: kevent set event: 7: ft:-1 fl:0005
2023/11/30 12:52:27 [debug] 74030#0: wasm loading plan: 00007FAC4C80BE18
2023/11/30 12:52:27 [debug] 74030#0: wasm loading plan: 00007FAC4D02A1C8
2023/11/30 12:52:27 [debug] 74030#0: wasm linking "coraza" module to "ngx_proxy_wasm" host interface (vm: 00007FAC4D014280, module: 00007FAC4D0147A0, host: 000000010ABB1FD0)
2023/11/30 12:52:27 [debug] 74030#0: wasm "coraza" module successfully linked to "ngx_proxy_wasm" host interface (vm: 00007FAC4D014280, module: 00007FAC4D0147A0, host: 000000010ABB1FD0)
2023/11/30 12:52:27 [debug] 74030#0: proxy_wasm initializing "coraza" filter (config size: 0, filter: 00007FAC4D02A850, filter->id: 3810127747)
2023/11/30 12:52:27 [debug] 74030#0: wasm creating "coraza" instance in "main" vm
2023/11/30 12:52:27 [debug] 74030#0: posix_memalign: 00007FAC4E36B800:16384 @16
2023/11/30 12:52:30 [debug] 74030#0: [proxy-wasm] "coraza" filter new instance (ictx: 00007FAC4D039BA8, store: 00007FAC4C80A5E0)
2023/11/30 12:52:30 [debug] 74030#0: worker cycle
2023/11/30 12:52:30 [debug] 74030#0: kevent timer: -1, changes: 2


2023/11/30 12:52:39 [debug] 74030#0: kevent events: 1
2023/11/30 12:52:39 [debug] 74030#0: kevent: 6: ft:-1 fl:0005 ff:00000000 d:1 ud:00007FAC4FDEC000
2023/11/30 12:52:39 [debug] 74030#0: accept on 0.0.0.0:9000, ready: 1
2023/11/30 12:52:39 [debug] 74030#0: posix_memalign: 00007FAC55F04C40:512 @16
2023/11/30 12:52:39 [debug] 74030#0: *1 accept: 127.0.0.1:56772 fd:8
2023/11/30 12:52:39 [debug] 74030#0: *1 event timer add: 8: 60000:1885459735
2023/11/30 12:52:39 [debug] 74030#0: *1 reusable connection: 1
2023/11/30 12:52:39 [debug] 74030#0: *1 kevent set event: 8: ft:-1 fl:0025
2023/11/30 12:52:39 [debug] 74030#0: timer delta: 8691
2023/11/30 12:52:39 [debug] 74030#0: worker cycle
2023/11/30 12:52:39 [debug] 74030#0: kevent timer: 60000, changes: 1
2023/11/30 12:52:39 [debug] 74030#0: kevent events: 1
2023/11/30 12:52:39 [debug] 74030#0: kevent: 8: ft:-1 fl:0025 ff:00000000 d:77 ud:00007FAC4FDEC0D0
2023/11/30 12:52:39 [debug] 74030#0: *1 http wait request handler
2023/11/30 12:52:39 [debug] 74030#0: *1 malloc: 00007FAC4D036600:1024
2023/11/30 12:52:39 [debug] 74030#0: *1 recv: eof:0, avail:77, err:0
2023/11/30 12:52:39 [debug] 74030#0: *1 recv: fd:8 77 of 1024
2023/11/30 12:52:39 [debug] 74030#0: *1 reusable connection: 0
2023/11/30 12:52:39 [debug] 74030#0: *1 posix_memalign: 00007FAC4D00DC00:4096 @16
2023/11/30 12:52:39 [debug] 74030#0: *1 http process request line
2023/11/30 12:52:39 [debug] 74030#0: *1 http request line: "GET / HTTP/1.1"
2023/11/30 12:52:39 [debug] 74030#0: *1 http uri: "/"
2023/11/30 12:52:39 [debug] 74030#0: *1 http args: ""
2023/11/30 12:52:39 [debug] 74030#0: *1 http exten: ""
2023/11/30 12:52:39 [debug] 74030#0: *1 posix_memalign: 00007FAC54010200:4096 @16
2023/11/30 12:52:39 [debug] 74030#0: *1 http process request header line
2023/11/30 12:52:39 [debug] 74030#0: *1 http header: "Host: localhost:9000"
2023/11/30 12:52:39 [debug] 74030#0: *1 http header: "User-Agent: curl/8.1.2"
2023/11/30 12:52:39 [debug] 74030#0: *1 http header: "Accept: */*"
2023/11/30 12:52:39 [debug] 74030#0: *1 http header done
2023/11/30 12:52:39 [debug] 74030#0: *1 event timer del: 8: 1885459735
2023/11/30 12:52:39 [debug] 74030#0: *1 rewrite phase: 0
2023/11/30 12:52:39 [debug] 74030#0: *1 test location: "/"
2023/11/30 12:52:39 [debug] 74030#0: *1 using configuration "/"
2023/11/30 12:52:39 [debug] 74030#0: *1 http cl:-1 max:1048576
2023/11/30 12:52:39 [debug] 74030#0: *1 rewrite phase: 2
2023/11/30 12:52:39 [debug] 74030#0: *1 wasm rctx created: 00007FAC4D00E920 (r: 00007FAC4D00DC50, main: 1, fake: 0)
2023/11/30 12:52:39 [debug] 74030#0: *1 wasm attaching plan to rctx (rctx: 00007FAC4D00E920, plan: 00007FAC4D02A1C8)
2023/11/30 12:52:39 [debug] 74030#0: *1 add cleanup: 00007FAC4D00EA70
2023/11/30 12:52:39 [debug] 74030#0: *1 wasm rctx injecting ngx_http_wasm_content_handler
2023/11/30 12:52:39 [debug] 74030#0: *1 malloc: 00007FAC4C716620:464
2023/11/30 12:52:39 [debug] 74030#0: *1 posix_memalign: 00007FAC4C716F00:512 @16
2023/11/30 12:52:39 [debug] 74030#0: *1 proxy_wasm initializing filter chain (nfilters: 1, isolation: 1)
2023/11/30 12:52:39 [debug] 74030#0: *1 [proxy-wasm] "coraza" filter reusing instance (ictx: 00007FAC4D039BA8, store: 00007FAC4C80A5E0)
2023/11/30 12:52:39 [debug] 74030#0: *1 [proxy-wasm]["coraza" #1] filter 1/1 resuming "on_request_headers" step in "rewrite" phase
2023/11/30 12:52:39 [error] 74030#0: *1 [wasm] error while executing at wasm backtrace:
    0: 0x1db40d - <unknown>!(*github.com/corazawaf/coraza-proxy-wasm/wasmplugin.wafMetrics).incrementCounter
    1: 0x1d98ea - <unknown>!proxy_on_request_headers
note: using the `WASMTIME_BACKTRACE_DETAILS=1` environment variable may show more debugging information

Caused by:
    host trap (function not yet implemented): proxy_define_metric <module: "coraza", vm: "main", runtime: "wasmtime">
2023/11/30 12:52:39 [debug] 74030#0: wasm "rewrite" phase rc: 500
2023/11/30 12:52:39 [debug] 74030#0: *1 http finalize request: 500, "/?" a:1, c:1
2023/11/30 12:52:39 [debug] 74030#0: *1 http special response: 500, "/?"
2023/11/30 12:52:39 [debug] 74030#0: *1 http set discard body
2023/11/30 12:52:39 [debug] 74030#0: *1 wasm rctx reused: 00007FAC4D00E920 (r: 00007FAC4D00DC50, main: 1)
2023/11/30 12:52:39 [debug] 74030#0: *1 wasm producing default response headers
2023/11/30 12:52:39 [debug] 74030#0: *1 wasm setting response header: "Server: nginx/1.25.3"
2023/11/30 12:52:39 [debug] 74030#0: *1 wasm setting response header: "Date: Thu, 30 Nov 2023 20:52:39 GMT"
2023/11/30 12:52:39 [debug] 74030#0: *1 [proxy-wasm] resetting filter chain: pwctx->exec_index 0 to 0 (pwctx: 00007FAC4C716620)
2023/11/30 12:52:39 [info] 74030#0: *1 [proxy-wasm] filter chain failed resuming: previous error (instance trapped), client: 127.0.0.1, server: , request: "GET / HTTP/1.1", host: "localhost:9000"
2023/11/30 12:52:39 [debug] 74030#0: *1 HTTP/1.1 500 Internal Server Error
Content-Type: text/html
Content-Length: 177
Connection: close
Server: nginx/1.25.3
Date: Thu, 30 Nov 2023 20:52:39 GMT

2023/11/30 12:52:39 [debug] 74030#0: *1 write new buf t:1 f:0 00007FAC4D00EAE8, pos 00007FAC4D00EAE8, size: 162 file: 0, size: 0
2023/11/30 12:52:39 [debug] 74030#0: *1 http write filter: l:0 f:0 s:162
2023/11/30 12:52:39 [debug] 74030#0: *1 http output filter "/?"
2023/11/30 12:52:39 [debug] 74030#0: *1 http copy filter: "/?"
2023/11/30 12:52:39 [debug] 74030#0: *1 wasm rctx reused: 00007FAC4D00E920 (r: 00007FAC4D00DC50, main: 1)
2023/11/30 12:52:39 [debug] 74030#0: *1 [proxy-wasm] filter 1/1 skipping "on_response_body" step in "body_filter" phase (instance trapped)
2023/11/30 12:52:39 [debug] 74030#0: *1 http postpone filter "/?" 00007FAC54010738
2023/11/30 12:52:39 [debug] 74030#0: *1 write old buf t:1 f:0 00007FAC4D00EAE8, pos 00007FAC4D00EAE8, size: 162 file: 0, size: 0
2023/11/30 12:52:39 [debug] 74030#0: *1 write new buf t:0 f:0 0000000000000000, pos 000000010AB9D070, size: 124 file: 0, size: 0
2023/11/30 12:52:39 [debug] 74030#0: *1 write new buf t:0 f:0 0000000000000000, pos 000000010AB9BCB0, size: 53 file: 0, size: 0
2023/11/30 12:52:39 [debug] 74030#0: *1 http write filter: l:1 f:0 s:339
2023/11/30 12:52:39 [debug] 74030#0: *1 http write filter limit 2097152
2023/11/30 12:52:39 [debug] 74030#0: *1 writev: 339 of 339
2023/11/30 12:52:39 [debug] 74030#0: *1 http write filter 0000000000000000
2023/11/30 12:52:39 [debug] 74030#0: *1 http copy filter: 0 "/?"
2023/11/30 12:52:39 [debug] 74030#0: *1 http finalize request: 0, "/?" a:1, c:1
2023/11/30 12:52:39 [debug] 74030#0: *1 http request count:1 blk:0
2023/11/30 12:52:39 [debug] 74030#0: *1 http close request
2023/11/30 12:52:39 [debug] 74030#0: *1 http log handler
2023/11/30 12:52:39 [debug] 74030#0: *1 wasm rctx reused: 00007FAC4D00E920 (r: 00007FAC4D00DC50, main: 1)
2023/11/30 12:52:39 [debug] 74030#0: *1 [proxy-wasm] filter 1/1 skipping "on_log" step in "log" phase (instance trapped)
2023/11/30 12:52:39 [debug] 74030#0: *1 run cleanup: 00007FAC4D00EA70
2023/11/30 12:52:39 [debug] 74030#0: *1 wasm cleaning up request pool (stream id: 1, r: 00007FAC4D00DC50, main: 1)
2023/11/30 12:52:39 [debug] 74030#0: *1 [proxy-wasm] filter 1/1 skipping "on_done" step in "done" phase (instance trapped)
2023/11/30 12:52:39 [debug] 74030#0: *1 [proxy-wasm] "coraza" filter freeing context #1 (1/1)
2023/11/30 12:52:39 [debug] 74030#0: *1 free: 00007FAC4C716F00, unused: 154
2023/11/30 12:52:39 [debug] 74030#0: *1 free: 00007FAC4C716620
2023/11/30 12:52:39 [debug] 74030#0: *1 free: 00007FAC4D00DC00, unused: 0
2023/11/30 12:52:39 [debug] 74030#0: *1 free: 00007FAC54010200, unused: 2577
2023/11/30 12:52:39 [debug] 74030#0: *1 close http connection: 8
2023/11/30 12:52:39 [debug] 74030#0: *1 reusable connection: 0
2023/11/30 12:52:39 [debug] 74030#0: *1 free: 0000000000000000
2023/11/30 12:52:39 [debug] 74030#0: *1 free: 00007FAC4D036600
2023/11/30 12:52:39 [debug] 74030#0: *1 free: 00007FAC55F04C40, unused: 120
2023/11/30 12:52:39 [debug] 74030#0: timer delta: 0
2023/11/30 12:52:39 [debug] 74030#0: worker cycle
2023/11/30 12:52:39 [debug] 74030#0: kevent timer: -1, changes: 0

@jcchavezs
You can try the build from corazawaf/coraza-proxy-wasm#144 and maybe get the example to work.

@thibaultcha (Member)

Hi,

With the proxy_define_metric "fake host function" on top of the feat/response-body-buffering branch (#381), it should work (it did work locally for me).

The response body buffering is something we'll be merging soon, but on the proxy_define_metric question I am on the fence... I think that host implementations should not "lie" to the user, and should properly report that something is not implemented rather than silently ignore it and let the user deal with the consequences later. Despite the noble goal of Proxy-Wasm wanting to be totally "host agnostic", there will always be unavoidable differences between hosts (e.g. Envoy vs. Nginx); imho there should ideally be a way for Proxy-Wasm filters to detect the underlying host and behave slightly differently when necessary (e.g. not calling proxy_define_metric when it is unavailable and optional for the filter to function with a minimal feature set).

If we go ahead and make proxy_define_metric a NOP (even with some warning log), then what do we do when another NYI host function gets called? We can't log warnings for all of them while keeping the filter running with undefined behavior because the underlying feature isn't there at all: those are things that must be worked on and thought through at the filter level, so we never "deceive" the filter user and/or author into thinking something will work when it won't.

@jptosso commented Dec 4, 2023

~~Hey! It works by transforming proxy_define_metric into a NOP, but how can we send options to the WASM filter?~~ Found it: https://github.com/Kong/ngx_wasm_module/blob/nightly/docs/DIRECTIVES.md#module

Works with the following config:

# nginx.conf
events {}
error_log /dev/stdout info;

# nginx master process gets a default 'main' VM
# a new top-level configuration block receives all configuration for this main VM
wasm {
    #      [name]    [path.{wasm,wat}]
    module coraza /nginx/coraza.wasm '';
    # module my_module /path/to/module.wasm;
}

# each nginx worker process is able to instantiate wasm modules in its subsystems
http {
    access_log /dev/stdout;
    server {
        listen 8080;

        location / {
            # execute a proxy-wasm filter when proxying
            #           [module]
            proxy_wasm  coraza '
{
    "directives_map": {
        "default": [
            "SecDebugLogLevel 9",
            "SecRuleEngine On",
            "SecRule REQUEST_URI \\"@streq /admin\\" \\"id:101,phase:1,t:lowercase,deny\\""
        ]
    },
    "default_directives": "default"
}';            

            # execute more WebAssembly during the access phase
            #           [phase] [module]  [function]
            # wasm_call   access  my_module check_something;

            return 200 'Goodbye World!';
        }
    }

    # other directives
    wasm_socket_connect_timeout 60s;
    wasm_socket_send_timeout    60s;
    wasm_socket_read_timeout    60s;

    wasm_socket_buffer_size     8k;
    wasm_socket_large_buffers   32 16k;
}

But it fails: the request gets a 200 with an empty response body:

➜  /tmp curl http://localhost:8080/admin -v --output -
*   Trying 127.0.0.1:8080...
* Connected to localhost (127.0.0.1) port 8080 (#0)
> GET /admin HTTP/1.1
> Host: localhost:8080
> User-Agent: curl/8.1.2
> Accept: */*
> 
< HTTP/1.1 200 OK
< Content-Type: text/plain
< Content-Length: 14
< Connection: keep-alive
< Server: nginx/1.25.3
< Date: Mon, 04 Dec 2023 12:29:56 GMT
< 
* Connection #0 to host localhost left intact

Update 3:

If I use proxy_pass instead of return, curl times out:

➜  /tmp curl http://localhost:8080/admin -m 5               
curl: (28) Operation timed out after 5001 milliseconds with 0 bytes received
➜  /tmp 

@thibaultcha (Member)

Not something I can reproduce; it works fine for me locally.

@jptosso commented Dec 4, 2023

> Not something I can reproduce; it works fine for me locally.

With Coraza proxy-wasm? Could you share your config?

This is my Dockerfile:

FROM ubuntu

WORKDIR /tmp

RUN apt update -y
RUN apt install -y git wget unzip
RUN mkdir /nginx
RUN wget https://github.com/Kong/ngx_wasm_module/releases/download/nightly/wasmx-nightly-20231204-wasmtime-aarch64-ubuntu22.04.tar.gz
RUN tar -xvf wasmx-nightly-20231204-wasmtime-aarch64-ubuntu22.04.tar.gz
RUN mv wasmx-nightly-20231204-wasmtime-aarch64-ubuntu22.04/nginx /usr/local/sbin/nginx

I mounted a modified version of coraza-proxy-wasm in which I changed the metrics.go file:

func (m *wafMetrics) incrementCounter(fqn string) {
	// no-op: the host does not implement proxy_define_metric
}

@thibaultcha (Member)

It's probably an issue with your Docker setup, imho. I used the exact same config you posted above with a local build of ngx_wasm_module main. I updated proxy_define_metric to return NGX_WAVM_OK, but in practice with corazawaf/coraza-proxy-wasm#144 it should simply not be invoked; I just didn't want to build it locally.

@ivanitskiy (Author) commented Dec 4, 2023

A few thoughts from me: it would be beneficial for the host to indicate its implemented and supported ABI version (akin to capabilities). The guest could use this information to dynamically determine the type of host it is interacting with. An analogous example is JavaScript code determining whether it is running in a Node.js or browser environment.

Additionally, the absence of a metrics-related API may not be a critical issue for a WAF module (though it could affect reporting). However, having such an API is valuable in production; it's unlikely that someone would deploy production workloads without metrics. Ultimately, weighing the pros and cons lies with the guest developer. For testing and integration, it may be fine to go without metrics and still be able to determine the correctness and performance of the module on a particular host.

As an illustration, the Rust SDK includes this feature:
proxy_abi_version_0_2_1: RUST SDK Link

Similarly, the CPP (Envoy) host also provides this capability: CPP Host Link

I know that host type/kind is not part of the ABI spec; just a thought.
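To make the capability idea concrete, here is a sketch of what guest-side gating could look like if a host advertised the functions it implements. This is entirely hypothetical: the Proxy-Wasm ABI has no such query today, so every name below (the capability table, `hostSupports`) is invented for illustration.

```go
package main

import "fmt"

// Hypothetical capability table a host could expose; no such ABI exists
// today, so filters must instead hard-code knowledge of the host.
var hostCapabilities = map[string]bool{
	"proxy_log":           true,
	"proxy_define_metric": false, // e.g. ngx_wasm_module at the time of this issue
}

// hostSupports reports whether the (imaginary) host implements fn.
func hostSupports(fn string) bool {
	return hostCapabilities[fn]
}

func main() {
	if hostSupports("proxy_define_metric") {
		fmt.Println("defining WAF counters")
	} else {
		fmt.Println("metrics unavailable; running with a minimal feature set")
	}
}
```

With such a table, a filter like coraza-proxy-wasm could skip its metric setup on hosts that have not implemented `proxy_define_metric`, instead of trapping on the first request.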

@thibaultcha (Member) commented Dec 4, 2023

> A few thoughts from me: It would be beneficial for the host to support indicating the implemented and supported ABI version (akin to capabilities). This information can be utilized by the guest to dynamically determine the type of host it is interacting with. An analogous example is JavaScript code determining whether it is running in a node.js or browser environment.

Agreed, and that is what I was hinting at; but there is no such API in the Proxy-Wasm SDK at the moment, as you know, so users have to rely on their knowledge of the underlying host, e.g. as corazawaf/coraza-proxy-wasm#144 does.

We will of course add metrics support but we do not have a timeline for it at the moment.

@thibaultcha (Member)

I'd be inclined to close this, given that the filter behaves exactly as expected in the current state of the module. Unless there is a valid objection, I will close it soon.
