Filtering Duplicate logs #284
Unanswered · iamterryclark asked this question in Q&A · Replies: 1 comment, 6 replies
-
The code looks fine, but the result depends on other details. Could you share the whole code to help me reproduce that?
-
Aim:
I would like to create a hook which skips duplicate logs but prints one line with the number of skips, similar to how spdlog handles this. I am not able to get the desired result.
What I have done:
In the filter function I want to capture all duplicate logs within a 10-second span, so max_skip_duration = 10000. I make JSON objects from the data to compare them, as they could be arrays. I created another logger, dup_logger, which logs to the same file instead of returning out. This results in all my duplicate log messages being logged, but my other logs do not get through unless I return message instead of false inside the filtered check. skip_counter and last_msg are global values I have set. I would appreciate it if someone could point me in the right direction to get this working properly.
Example Code:
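The original example code is not included in this excerpt, and the logging library being used isn't named. As a hedged illustration of the pattern described above (a filter that JSON-serializes the message payload for comparison, suppresses duplicates seen within `max_skip_duration`, and emits one summary line with the skip count, in the spirit of spdlog's `dup_filter_sink`), here is a minimal sketch using Python's standard `logging` module. The class name `DuplicateFilter` and the summary wording are hypothetical, not from the thread.

```python
import json
import logging
import time


class DuplicateFilter(logging.Filter):
    """Drop consecutive duplicate records within a time window,
    then emit one summary line with the number of skips."""

    def __init__(self, max_skip_duration=10.0):
        super().__init__()
        self.max_skip_duration = max_skip_duration  # seconds
        self.last_msg = None   # serialized key of the previous message
        self.last_time = 0.0
        self.skip_counter = 0

    def filter(self, record):
        # Serialize msg + args so list/dict payloads compare reliably.
        key = json.dumps([record.msg, record.args], default=str)
        now = time.monotonic()

        if key == self.last_msg and now - self.last_time < self.max_skip_duration:
            self.skip_counter += 1
            return False  # suppress the duplicate

        if self.skip_counter:
            # A new message arrived: emit one summary line first, sent
            # straight to the handlers so it does not re-enter this filter.
            summary = logging.LogRecord(
                record.name, logging.INFO, record.pathname, record.lineno,
                "skipped %d duplicate messages", (self.skip_counter,), None)
            for handler in logging.getLogger(record.name).handlers:
                handler.handle(summary)
            self.skip_counter = 0

        self.last_msg = key
        self.last_time = now
        return True  # let the non-duplicate message through
```

This avoids the two problems described above: the summary goes directly to the handlers (so it cannot be filtered out or loop back through the filter), and returning `True` for non-duplicates means ordinary messages still get through.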