Event submission rejected by django CSRF: Forbidden (Referer checking failed - no Referer.) #3312
On your self-hosted/docker-compose.yml (Line 455 in b8b4aa2):

That should probably fix it. But I see your CSRF_TRUSTED_ORIGINS = ["*", "https://*sentry.mydomain.com", "https://*", "http://*"] and I'm not sure whether Django will accept that. If I were you, I'd just do this: CSRF_TRUSTED_ORIGINS = ["http://sentry.mydomain.com", "https://sentry.mydomain.com"]
Thank you for your response. I have updated the Caddyfile:

```
{
	debug
}

{$DOMAIN} {
	log {
		output file /data/access.log
		format json
	}

	reverse_proxy nginx:80 {
		health_uri /_health/
		health_status 2xx
		header_up Host {upstream_hostport}
	}

	# By default, the TLS certificate is acquired from Let's Encrypt
	tls [email protected]
}
```

By checking the docker container logs, it seems that the requests now go through caddy, nginx, and relay until they reach the sentry instance:

```
web-1 | 09:30:29 [WARNING] django.security.csrf: Forbidden (CSRF cookie not set.): /api/3/envelope/ (status_code=403 request=<WSGIRequest: POST '/api/3/envelope/'>)
web-1 | 09:30:29 [INFO] sentry.access.api: api.access (method='POST' view='django.views.generic.base.RedirectView' response=403 user_id='None' is_app='None' token_type='None' is_frontend_request='False' organization_id='None' auth_id='None' path='/api/3/envelope/' caller_ip='****' user_agent='sentry.python/2.13.0' rate_limited='False' rate_limit_category='None' request_duration_seconds=0.016865015029907227 rate_limit_type='DNE' concurrent_limit='None' concurrent_requests='None' reset_time='None' group='None' limit='None' remaining='None')
```
@razvan286 after changing your […]
@aldy505 yes, I always run the […]
Oh, I just realized this message: the […]

Can you confirm from your Caddy instance that it really did a reverse proxy to the […]? To sum up (the name of the container): […], then data ingested by […]
@aldy505 Here is the full container output, from when the request reaches caddy first until, in the end, the sentry instance returns the […]:

```
caddy | {"level":"debug","ts":1725955821.389444,"logger":"events","msg":"event","name":"tls_get_certificate","id":"fcf0f3b8-9a92-4ca3-8ae7-94fa1e16f8c7","origin":"tls","data":{"client_hello":{"CipherSuites":[4866,4867,4865,49196,49200,49195,49199,52393,52392,49188,49192,49187,49191,159,158,107,103,255],"ServerName":"sentry.mydomain.com","SupportedCurves":[29,23,30,25,24,256,257,258,259,260],"SupportedPoints":"AAEC","SignatureSchemes":[1027,1283,1539,2055,2056,2057,2058,2059,2052,2053,2054,1025,1281,1537,771,769,770,1026,1282,1538],"SupportedProtos":["http/1.1"],"SupportedVersions":[772,771],"RemoteAddr":{"IP":"****","Port":37110,"Zone":""},"LocalAddr":{"IP":"*****","Port":443,"Zone":""}}}}
caddy | {"level":"debug","ts":1725955821.3895462,"logger":"tls.handshake","msg":"choosing certificate","identifier":"sentry.mydomain.com","num_choices":1}
caddy | {"level":"debug","ts":1725955821.3895824,"logger":"tls.handshake","msg":"default certificate selection results","identifier":"sentry.mydomain.com","subjects":["sentry.mydomain.com"],"managed":true,"issuer_key":"acme-v02.api.letsencrypt.org-directory","hash":"eae69292c5535321b6dc540f182c76d910120f4a074a11bf66600357806d2d1b"}
caddy | {"level":"debug","ts":1725955821.389603,"logger":"tls.handshake","msg":"matched certificate in cache","remote_ip":"*****","remote_port":"37110","subjects":["sentry.mydomain.com"],"managed":true,"expiration":1733305272,"hash":"eae69292c5535321b6dc540f182c76d910120f4a074a11bf66600357806d2d1b"}
caddy | {"level":"debug","ts":1725955821.4079857,"logger":"http.handlers.reverse_proxy","msg":"selected upstream","dial":"nginx:80","total_upstreams":1}
relay-1 | 2024-09-10T08:10:21.409512Z DEBUG request{method=POST uri=/api/3/envelope/ version=HTTP/1.1}: tower_http::trace::on_request: started processing request
relay-1 | 2024-09-10T08:10:21.410400Z DEBUG relay_server::envelope: failed to parse sampling context error=invalid type: floating point `1`, expected a string
relay-1 | 2024-09-10T08:10:21.410441Z DEBUG relay_server::services::project: project 6c77219228f5973a3f9efc00978f35ae state requested 1 times
relay-1 | 2024-09-10T08:10:21.410558Z TRACE request{method=POST uri=/api/3/envelope/ version=HTTP/1.1}: relay_server::endpoints::common: Sending envelope to project cache for V1 buffer
relay-1 | 2024-09-10T08:10:21.410748Z DEBUG request{method=POST uri=/api/3/envelope/ version=HTTP/1.1}: tower_http::trace::on_response: finished processing request latency=1 ms status=200
relay-1 | 2024-09-10T08:10:21.411020Z DEBUG relay_server::envelope: failed to parse sampling context error=invalid type: floating point `1`, expected a string
relay-1 | 2024-09-10T08:10:21.411044Z TRACE relay_server::services::project_cache: Enqueueing envelope
nginx-1 | *** - - [10/Sep/2024:08:10:21 +0000] "POST /api/3/envelope/ HTTP/1.1" 200 41 "-" "sentry.python/2.13.0" "***"
caddy | {"level":"debug","ts":1725955821.41155,"logger":"http.handlers.reverse_proxy","msg":"upstream roundtrip","upstream":"nginx:80","duration":0.003337724,"request":{"remote_ip":"***","remote_port":"37110","client_ip":"***","proto":"HTTP/1.1","method":"POST","host":"nginx:80","uri":"/api/3/envelope/","headers":{"Content-Type":["application/x-sentry-envelope"],"X-Forwarded-Proto":["https"],"Accept-Encoding":["identity"],"Content-Length":["1079"],"X-Forwarded-Host":["sentry.mydomain.com"],"Content-Encoding":["gzip"],"User-Agent":["sentry.python/2.13.0"],"X-Sentry-Auth":["Sentry sentry_key=6c77219228f5973a3f9efc00978f35ae, sentry_version=7, sentry_client=sentry.python/2.13.0"],"X-Forwarded-For":["***"]},"tls":{"resumed":false,"version":772,"cipher_suite":4865,"proto":"http/1.1","server_name":"sentry.mydomain.com"}},"headers":{"Connection":["keep-alive"],"Access-Control-Allow-Origin":["*"],"Vary":["origin","access-control-request-method","access-control-request-headers"],"Cross-Origin-Resource-Policy":["cross-origin"],"Server":["nginx"],"Date":["Tue, 10 Sep 2024 08:10:21 GMT"],"Content-Type":["application/json"],"Content-Length":["41"],"Access-Control-Expose-Headers":["x-sentry-error,x-sentry-rate-limits,retry-after"]},"status":200}
relay-1 | 2024-09-10T08:10:21.512620Z DEBUG relay_server::services::project_upstream: updating project states for 1/1 projects (attempt 1)
relay-1 | 2024-09-10T08:10:21.512811Z DEBUG relay_server::services::project_upstream: sending request of size 1
relay-1 | 2024-09-10T08:10:21.539993Z DEBUG relay_server::services::project: project state 6c77219228f5973a3f9efc00978f35ae updated
nginx-1 | 172.18.0.57 - - [10/Sep/2024:08:10:27 +0000] "GET /_health/ HTTP/1.1" 200 12 "-" "Go-http-client/1.1" "-"
caddy | {"level":"info","ts":1725955827.935051,"logger":"http.handlers.reverse_proxy.health_checker.active","msg":"host is up","host":"nginx:80"}
relay-1 | 2024-09-10T08:10:35.557499Z TRACE relay_server::services::metrics::aggregator: flushing 1 partitions to receiver
relay-1 | 2024-09-10T08:10:35.557858Z TRACE relay_server::services::processor: sending envelope to sentry endpoint
relay-1 | 2024-09-10T08:10:35.558493Z TRACE relay_server::services::cogs: recording measurement: CogsMeasurement { resource: Relay, feature: MetricsSessions, value: Time(724.815µs) }
web-1 | 08:10:35 [WARNING] django.security.csrf: Forbidden (CSRF cookie not set.): /api/1/envelope/ (status_code=403 request=<WSGIRequest: POST '/api/1/envelope/'>)
web-1 | 08:10:35 [INFO] sentry.access.api: api.access (method='POST' view='django.views.generic.base.RedirectView' response=403 user_id='None' is_app='None' token_type='None' is_frontend_request='False' organization_id='None' auth_id='None' path='/api/1/envelope/' caller_ip='' user_agent='sentry-relay/24.8.0' rate_limited='False' rate_limit_category='None' request_duration_seconds=0.03174233436584473 rate_limit_type='DNE' concurrent_limit='None' concurrent_requests='None' reset_time='None' group='None' limit='None' remaining='None')
```
That looks fine to me; the error you're trying to send from the Python app went through the projectId of […]. I hope this illustration explains the situation better: […]

For now, my suggestion is to modify self-hosted/docker-compose.yml (Lines 43 to 62 in b8b4aa2) to be:

```yaml
environment:
  # ...
  OPENAI_API_KEY:
  SENTRY_DSN:
```

Then on your […], re-run […].
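The project id embedded in a DSN determines which ingest path the SDK posts to, which is why the logs above show `/api/3/envelope/` for the Python project and `/api/1/envelope/` for the internal one. A small illustrative helper showing that derivation (not part of any SDK; the key and domain are placeholders):

```python
from urllib.parse import urlsplit

def envelope_endpoint(dsn: str) -> str:
    """Derive the envelope ingest URL from a DSN.

    A DSN has the shape https://<public_key>@<host>/<project_id>; events
    are POSTed to <scheme>://<host>/api/<project_id>/envelope/.
    """
    parts = urlsplit(dsn)
    project_id = parts.path.rstrip("/").rsplit("/", 1)[-1]
    return f"{parts.scheme}://{parts.hostname}/api/{project_id}/envelope/"

# A made-up key and the placeholder domain from this thread:
print(envelope_endpoint("https://6c77219228f5973a@sentry.mydomain.com/3"))
# → https://sentry.mydomain.com/api/3/envelope/
```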
I added the DSN value of the […]. Logs right after I run the python script:

```
caddy | {"level":"debug","ts":1726043825.853992,"logger":"events","msg":"event","name":"tls_get_certificate","id":"466cbdef-1c4e-45b0-ad5e-c6df8e4e7705","origin":"tls","data":{"client_hello":{"CipherSuites":[4866,4867,4865,49196,49200,49195,49199,52393,52392,49188,49192,49187,49191,159,158,107,103,255],"ServerName":"sentry.mydomain.com","SupportedCurves":[29,23,30,25,24,256,257,258,259,260],"SupportedPoints":"AAEC","SignatureSchemes":[1027,1283,1539,2055,2056,2057,2058,2059,2052,2053,2054,1025,1281,1537,771,769,770,1026,1282,1538],"SupportedProtos":["http/1.1"],"SupportedVersions":[772,771],"RemoteAddr":{"IP":"****","Port":49580,"Zone":""},"LocalAddr":{"IP":"***","Port":443,"Zone":""}}}}
caddy | {"level":"debug","ts":1726043825.8540905,"logger":"tls.handshake","msg":"choosing certificate","identifier":"sentry.mydomain.com","num_choices":1}
caddy | {"level":"debug","ts":1726043825.854142,"logger":"tls.handshake","msg":"default certificate selection results","identifier":"sentry.mydomain.com","subjects":["sentry.mydomain.com"],"managed":true,"issuer_key":"acme-v02.api.letsencrypt.org-directory","hash":"eae69292c5535321b6dc540f182c76d910120f4a074a11bf66600357806d2d1b"}
caddy | {"level":"debug","ts":1726043825.8541822,"logger":"tls.handshake","msg":"matched certificate in cache","remote_ip":"****","remote_port":"49580","subjects":["sentry.mydomain.com"],"managed":true,"expiration":1733305272,"hash":"eae69292c5535321b6dc540f182c76d910120f4a074a11bf66600357806d2d1b"}
caddy | {"level":"debug","ts":1726043825.8728714,"logger":"http.handlers.reverse_proxy","msg":"selected upstream","dial":"nginx:80","total_upstreams":1}
relay-1 | 2024-09-11T08:37:05.874100Z DEBUG request{method=POST uri=/api/3/envelope/ version=HTTP/1.1}: tower_http::trace::on_request: started processing request
relay-1 | 2024-09-11T08:37:05.875026Z DEBUG relay_server::envelope: failed to parse sampling context error=invalid type: floating point `1`, expected a string
relay-1 | 2024-09-11T08:37:05.875193Z TRACE request{method=POST uri=/api/3/envelope/ version=HTTP/1.1}: relay_server::endpoints::common: Sending envelope to project cache for V1 buffer
relay-1 | 2024-09-11T08:37:05.875284Z DEBUG request{method=POST uri=/api/3/envelope/ version=HTTP/1.1}: tower_http::trace::on_response: finished processing request latency=1 ms status=200
relay-1 | 2024-09-11T08:37:05.875504Z DEBUG relay_server::envelope: failed to parse sampling context error=invalid type: floating point `1`, expected a string
relay-1 | 2024-09-11T08:37:05.875532Z TRACE relay_server::services::project_cache: Sending envelope to processor
nginx-1 | **** - - [11/Sep/2024:08:37:05 +0000] "POST /api/3/envelope/ HTTP/1.1" 200 41 "-" "sentry.python/2.13.0" "****"
caddy | {"level":"debug","ts":1726043825.875985,"logger":"http.handlers.reverse_proxy","msg":"upstream roundtrip","upstream":"nginx:80","duration":0.002951075,"request":{"remote_ip":"*****","remote_port":"49580","client_ip":"*****","proto":"HTTP/1.1","method":"POST","host":"nginx:80","uri":"/api/3/envelope/","headers":{"X-Forwarded-Proto":["https"],"Accept-Encoding":["identity"],"X-Forwarded-For":["****"],"User-Agent":["sentry.python/2.13.0"],"Content-Encoding":["gzip"],"X-Sentry-Auth":["Sentry sentry_key=6c77219228f5973a3f9efc00978f35ae, sentry_version=7, sentry_client=sentry.python/2.13.0"],"Content-Length":["1079"],"Content-Type":["application/x-sentry-envelope"],"X-Forwarded-Host":["sentry.mydomain.com"]},"tls":{"resumed":false,"version":772,"cipher_suite":4865,"proto":"http/1.1","server_name":"sentry.mydomain.com"}},"headers":{"Date":["Wed, 11 Sep 2024 08:37:05 GMT"],"Content-Type":["application/json"],"Access-Control-Expose-Headers":["x-sentry-error,x-sentry-rate-limits,retry-after"],"Cross-Origin-Resource-Policy":["cross-origin"],"Server":["nginx"],"Content-Length":["41"],"Connection":["keep-alive"],"Access-Control-Allow-Origin":["*"],"Vary":["origin","access-control-request-method","access-control-request-headers"]},"status":200}
relay-1 | 2024-09-11T08:37:05.875548Z DEBUG relay_server::envelope: failed to parse sampling context error=invalid type: floating point `1`, expected a string
relay-1 | 2024-09-11T08:37:05.875738Z TRACE relay_server::services::processor: Processing error group
relay-1 | 2024-09-11T08:37:05.875780Z DEBUG relay_server::envelope: failed to parse sampling context error=invalid type: floating point `1`, expected a string
relay-1 | 2024-09-11T08:37:05.875803Z TRACE relay_server::services::processor::event: processing json event
relay-1 | 2024-09-11T08:37:05.876220Z DEBUG relay_server::envelope: failed to parse sampling context error=invalid type: floating point `1`, expected a string
relay-1 | 2024-09-11T08:37:05.876267Z DEBUG relay_server::envelope: failed to parse sampling context error=invalid type: floating point `1`, expected a string
relay-1 | 2024-09-11T08:37:05.876726Z DEBUG relay_server::envelope: failed to parse sampling context error=invalid type: floating point `1`, expected a string
relay-1 | 2024-09-11T08:37:05.877681Z TRACE relay_server::services::processor: sending envelope to sentry endpoint
relay-1 | 2024-09-11T08:37:05.879065Z TRACE relay_server::services::cogs: recording measurement: CogsMeasurement { resource: Relay, feature: Errors, value: Time(3.160312ms) }
web-1 | 08:37:05 [WARNING] django.security.csrf: Forbidden (CSRF cookie not set.): /api/3/envelope/ (status_code=403 request=<WSGIRequest: POST '/api/3/envelope/'>)
web-1 | 08:37:05 [INFO] sentry.access.api: api.access (method='POST' view='django.views.generic.base.RedirectView' response=403 user_id='None' is_app='None' token_type='None' is_frontend_request='False' organization_id='None' auth_id='None' path='/api/3/envelope/' caller_ip='172.18.0.56' user_agent='sentry.python/2.13.0' rate_limited='False' rate_limit_category='None' request_duration_seconds=0.058341026306152344 rate_limit_type='DNE' concurrent_limit='None' concurrent_requests='None' reset_time='None' group='None' limit='None' remaining='None')
```

It looks like there is the same […]. Maybe also this […] is relevant:

```
relay-1 | 2024-09-11T08:37:05.875504Z DEBUG relay_server::envelope: failed to parse sampling context error=invalid type: floating point `1`, expected a string
```
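A side note on the repeated `failed to parse sampling context` lines: the message says Relay's strict deserializer received a JSON number where it expected a string, and since relay still answered `status=200` in these same logs, that parse failure did not by itself reject the envelope. A minimal Python sketch of that kind of strict parsing — the field name `sample_rate` is an assumption based on the SDK's trace context, not taken from the logs:

```python
def parse_sample_rate(dsc: dict) -> float:
    """Accept only a string-encoded rate, mimicking a strict deserializer.

    Mirrors the shape of relay's error: "invalid type: floating point `1`,
    expected a string" means the payload carried a number, not a string.
    """
    value = dsc.get("sample_rate")
    if not isinstance(value, str):
        raise TypeError(f"invalid type: {value!r}, expected a string")
    return float(value)

print(parse_sample_rate({"sample_rate": "1.0"}))  # → 1.0
# parse_sample_rate({"sample_rate": 1.0}) raises TypeError, analogous
# to the relay log line above.
```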
Thank you again for your time and ideas, @aldy505. However, I am still blocked on this issue. Is there anything else I can do, or more information I can provide, so that you can guide me further toward a solution?
Self-Hosted Version
24.8.0
CPU Architecture
x86_64
Docker Version
27.2.0
Docker Compose Version
2.29.2
Steps to Reproduce

1. Clone the self-hosted Sentry repository.
2. Sentry configuration: `cat sentry/config.yml`: […] `cat sentry/sentry.conf.py`: […]
3. Caddy configuration: I use Caddy for setting up the reverse proxy by adding the `caddy` container in a `docker-compose.override.yml` file. `cat docker-compose.override.yml`: […] This is the `Caddyfile` available in the root directory of the cloned repo. `cat Caddyfile`: […]
4. Run `./install.sh` and then spin up the docker containers with `docker compose up`. With all containers running, set up a Sentry python project on `sentry.mydomain.com`. Get the `DSN` and create the python script.
5. Test the connection using a simple python script: `cat test-self-hosted-sentry.py`: […]
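The test script's contents are collapsed above, but the shape of the HTTP request the Python SDK ends up sending can be sketched with the standard library alone, which is handy for comparing against the `X-Sentry-Auth` and `Content-Type` headers visible in caddy's `upstream roundtrip` logs. Everything here (key, domain, project id, body) is a placeholder; the request is built but never sent:

```python
from urllib.request import Request

# Placeholder values mirroring (not copied from) a real DSN.
sentry_key = "6c77219228f5973a3f9efc00978f35ae"
project_id = 3

req = Request(
    f"https://sentry.mydomain.com/api/{project_id}/envelope/",
    data=b"{}\n",  # a real envelope has header/item lines; elided here
    method="POST",
    headers={
        "Content-Type": "application/x-sentry-envelope",
        "X-Sentry-Auth": (
            f"Sentry sentry_key={sentry_key}, sentry_version=7, "
            "sentry_client=sentry.python/2.13.0"
        ),
    },
)

print(req.get_method(), req.full_url)
# → POST https://sentry.mydomain.com/api/3/envelope/
```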
Expected Result

A new issue is reported under the Sentry python project at URL: https://sentry.mydomain.com/organizations/sentry/projects/python/?project=3.

Actual Result

`docker compose logs -f web`: […]
:Event ID
No response