
NewBatchingHook doesn't seem to grab the sequence token #5

Open · ramonkeypr opened this issue Jun 13, 2017 · 4 comments

@ramonkeypr

I am attempting to use batching to avoid CloudWatch rate limiting. During my automated testing, I am seeing the following:

=== RUN   TestCloudWatchBatching100
Failed to fire hook: InvalidSequenceTokenException: The given sequenceToken is invalid. The next expected sequenceToken is: 49570318328963403756104352477615904793123053365436134642
	status code: 400, request id: f15938de-5063-11e7-8263-adbc854f83ad
{"app":"middlewares.test","host":"45f28166fed1","level":"info","msg":"test-host-100 - username [ts] \"method uri proto\" %!d(string=status) %!d(string=size) \"referer\" \"agent\" fblh.reqid \"frontend\" \"backend\" elapsed\n","time":1497377582}Failed to fire hook: InvalidSequenceTokenException: The given sequenceToken is invalid. The next expected sequenceToken is: 49570318328963403756104352477615904793123053365436134642
	status code: 400, request id: f15e8fab-5063-11e7-bf73-6fbae9798f07
{"app":"middlewares.test","host":"45f28166fed1","level":"info","msg":"test-host-100 - username [ts] \"method uri proto\" %!d(string=status) %!d(string=size) \"referer\" \"agent\" fblh.reqid \"frontend\" \"backend\" elapsed\n","time":1497377582}Failed to fire hook: InvalidSequenceTokenException: The given sequenceToken is invalid. The next expected sequenceToken is: 49570318328963403756104352477615904793123053365436134642
	status code: 400, request id: f16482f0-5063-11e7-acda-e388958e2b61
{"app":"middlewares.test","host":"45f28166fed1","level":"info","msg":"test-host-100 - username [ts] \"method uri proto\" %!d(string=status) %!d(string=size) \"referer\" \"agent\" fblh.reqid \"frontend\" \"backend\" elapsed\n","time":1497377582}Failed to fire hook: InvalidSequenceTokenException: The given sequenceToken is invalid. The next expected sequenceToken is: 49570318328963403756104352477615904793123053365436134642
	status code: 400, request id: f168c988-5063-11e7-abdd-139e182e19f7
{"app":"middlewares.test","host":"45f28166fed1","level":"info","msg":"test-host-100 - username [ts] \"method uri proto\" %!d(string=status) %!d(string=size) \"referer\" \"agent\" fblh.reqid \"frontend\" \"backend\" elapsed\n","time":1497377582}Failed to fire hook: InvalidSequenceTokenException: The given sequenceToken is invalid. The next expected sequenceToken is: 49570318328963403756104352477615904793123053365436134642

/snip

If I can find a fix, I will submit a PR.
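
As a sketch of one possible workaround (not the library's API): the `InvalidSequenceTokenException` message itself carries the next expected token, so a caller could parse it out and retry the `PutLogEvents` call with that token. The helper `expectedTokenFromError` below is hypothetical, and the message format is assumed from the test output above; AWS does not guarantee the wording, so `DescribeLogStreams` (which returns each stream's `UploadSequenceToken`) is the more robust source.

```go
package main

import (
	"fmt"
	"regexp"
)

// tokenRe matches the "next expected sequenceToken is: <token>" fragment
// that appears in the InvalidSequenceTokenException message above.
var tokenRe = regexp.MustCompile(`next expected sequenceToken is: (\w+)`)

// expectedTokenFromError extracts the expected sequence token from an
// error message, returning false if the message doesn't contain one.
// Parsing error text is fragile; treat this as a fallback only.
func expectedTokenFromError(msg string) (string, bool) {
	m := tokenRe.FindStringSubmatch(msg)
	if m == nil {
		return "", false
	}
	return m[1], true
}

func main() {
	msg := "InvalidSequenceTokenException: The given sequenceToken is invalid. " +
		"The next expected sequenceToken is: 49570318328963403756104352477615904793123053365436134642"
	if tok, ok := expectedTokenFromError(msg); ok {
		fmt.Println(tok) // prints the token to retry with
	}
}
```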

@kdar (Owner) commented Jul 12, 2017

@punya-asapp would you know the issue here? I'm kind of swamped at the moment.

@punya-asapp (Contributor)

I'll take a look tomorrow

@SteveHNH (Contributor)

Not sure how relevant this is these days since this issue is pretty old, but I did a fair amount of experimenting with the batching hook. It works fine when it's the only writer, but when multiple processes log to the same log stream, you can end up in a state where the sequence token is stale and never corrects itself.
The workaround is to either fetch a fresh sequence token when a put fails (retry), or log to a separate log stream per process.
I have some retry logic I can put in a PR if it seems valuable, but I ended up just logging to a different stream per process, as that was the simplest option. In our architecture the logs are aggregated in Elasticsearch, so multiple streams don't really hurt.

@kdar (Owner) commented Oct 31, 2019

Any PR is appreciated if you believe it's valuable.

gillepsi added a commit to gillepsi/logrus-cloudwatchlogs that referenced this issue Feb 17, 2021