Test: against ES with configured TLSv1.3 #162

Merged · 16 commits into logstash-plugins:main · Mar 16, 2022

Conversation

@kares (Contributor) commented Nov 29, 2021

This PR establishes confidence that TLSv1.3 is working and supported on recent LS (JVM) versions.

An integration target is added where ES 7.x is configured to accept only the TLSv1.3 protocol and is then tested against LS.
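For reference, restricting the Elasticsearch HTTP layer to TLSv1.3 is done through the `ssl.supported_protocols` setting. A minimal `elasticsearch.yml` sketch (the certificate paths are placeholders, not the ones used by this PR's CI setup):

```yaml
# Enable TLS on the HTTP layer and accept TLSv1.3 handshakes only.
xpack.security.http.ssl.enabled: true
xpack.security.http.ssl.supported_protocols: ["TLSv1.3"]

# Placeholder PEM certificate/key paths; the CI target provisions its own.
xpack.security.http.ssl.certificate: certs/es.crt
xpack.security.http.ssl.key: certs/es.key
```

With this in place, clients that can only negotiate TLSv1.2 or lower are rejected during the handshake, which is what the integration target exercises from the LS side.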

@kares changed the title from "CI: test with only TLSv1.3 enabled in ES" to "Test: against ES with configured TLSv1.3" on Feb 3, 2022
@kares marked this pull request as ready for review on Feb 3, 2022
@kares mentioned this pull request on Feb 3, 2022
@kares self-assigned this on Feb 3, 2022
@kaisecheng self-requested a review on Mar 15, 2022
@kaisecheng (Contributor) left a comment

Overall looks good. Most of the updates are from filter-elasticsearch.
The following red CI builds seem different from the known issues:
- `INTEGRATION=false ELASTIC_STACK_VERSION=7.x`
- `INTEGRATION=false ELASTIC_STACK_VERSION=8.x`

.ci/Dockerfile.elasticsearch: review comment resolved (outdated)
@kares (Contributor, Author) commented Mar 16, 2022

> The following red CI builds seem different from the known issues:
> `INTEGRATION=false ELASTIC_STACK_VERSION=7.x`
> `INTEGRATION=false ELASTIC_STACK_VERSION=8.x`

Right, there's more than just the interruptible failure mentioned in elastic/logstash#13572.
There also seems to be a unit-test leak, but we do not have any production or unit-test related changes here.
I am also not able to reproduce the leak locally ... maybe it's due to the same cause as the interruptible issue (elastic/logstash#13572): we somehow end up executing much more slowly, and the scheduler thread is still around while another example runs.

@kares (Contributor, Author) commented Mar 16, 2022

Tried handling the extra spec failures in #169.

@kaisecheng (Contributor) left a comment

LGTM. The red CI is unrelated to this change; all failing CI tests are tracked in issues.

@kares merged commit cb9faac into logstash-plugins:main on Mar 16, 2022